# NIS2 Gap Assessment

> Performs a structured gap assessment against all NIS2 Directive Art. 21 cybersecurity risk-management measures: rates maturity (0-5) per measure, detects missing compliance artifacts, and builds a prioritized remediation roadmap weighted by regulatory severity and entity classification. Produces a comprehensive gap report with interactive dashboard data.



Tags: NIS2, Gap Assessment, Compliance, Cybersecurity, GRC, Audit Readiness


## Example Prompts

- Run a NIS2 gap assessment for our organization
- Assess our maturity against all NIS2 Art. 21 measures
- Create a NIS2 remediation roadmap with priorities
- Check which NIS2 compliance artifacts we are missing

URL: https://rakenne.app/skills/nis2-gap-assessment/index.md

Try this skill: https://rakenne.app/a/?skill=nis2-gap-assessment



## Overview

Perform a structured gap assessment against all 11 cybersecurity risk-management measures mandated by the NIS2 Directive (EU) 2022/2555, Article 21(2)(a)-(k). This is the capstone NIS2 skill — it evaluates organizational maturity across every mandatory measure, detects missing compliance artifacts, and produces a prioritized remediation roadmap weighted by regulatory severity and entity classification (essential vs. important).

The skill uses a 0-5 maturity model (Nonexistent to Optimized) to rate each measure, a mandatory artifact detector to scan for 25 expected compliance documents, and a remediation prioritizer that groups findings into four implementation phases based on measure dependencies, enforcement risk, and effort. A dashboard builder then synthesizes everything into a readiness score, maturity heatmap, and remediation timeline.

### What It Covers

- **Art. 21(2)(a)-(k) measures** — measure-by-measure assessment covering risk analysis, incident handling, business continuity, supply chain security, secure development, effectiveness assessment, training, cryptography, access control, MFA, and secure communications
- **Mandatory artifact detection** — scans for 25 expected compliance documents across all measures plus cross-cutting artifacts (entity classification, governance approvals, management training records)
- **Maturity rating** — 0-5 scale with negative-indicator capping that prevents inflated scores when foundational evidence is absent
- **Findings register** — structured gap findings with severity classification (critical, major, moderate, minor, observation) and measure references
- **Remediation roadmap** — four-phase prioritized plan (Foundation, Core Measures, Technical Controls, Validation) weighted by entity type, enforcement penalties, and measure dependencies
- **Dashboard data** — overall readiness score, maturity heatmap, artifact coverage, severity distribution, and phase-by-phase remediation timeline

## Example Conversation

> **You:** We need to run a NIS2 gap assessment. We're a mid-size energy distribution company in Germany, classified as an essential entity. We've never done a formal NIS2 assessment but we have some ISO 27001 controls in place from a partial implementation two years ago.
>
> **Agent:** I'll guide you through a structured NIS2 gap assessment covering all 11 Art. 21(2) measures. Let me start by understanding your current state.
>
> 1. Do you have an existing information security policy and risk analysis framework?
> 2. Is there a documented incident response plan? Have you tested it?
> 3. Do you have business continuity and disaster recovery plans?
> 4. How do you manage supply chain security and third-party risk?
> 5. Who is responsible for cybersecurity governance, and has the management body received training per Art. 20?
>
> **You:** We have an infosec policy from the ISO 27001 work but it hasn't been reviewed since 2024. We have a basic incident response plan — never tested. No formal BCP. Supply chain security is ad-hoc — we assess major vendors informally. Our CISO reports to the CTO. Board hasn't done NIS2-specific training.
>
> **Agent:** Good — the ISO 27001 foundation gives you a head start on several measures. Let me check your workspace for existing NIS2 compliance artifacts. Running `mandatory_artifact_detector`...
>
> **Results — 8/25 artifacts found:**
>
> | Measure | Artifact | Status |
> |---------|----------|--------|
> | (a) | Risk analysis policy | **Stale** (2024-02-15) |
> | (a) | Information security policy | **Stale** (2024-03-01) |
> | (b) | Incident response plan | Found |
> | (b) | Incident reporting procedure | **Missing** |
> | (c) | Business continuity plan | **Missing** |
> | (c) | Backup policy | Found |
> | (c) | Disaster recovery procedures | **Missing** |
> | (d) | Supplier risk register | **Missing** |
> | (d) | Supply chain security policy | **Missing** |
> | (e) | Secure development policy | **Missing** |
> | (e) | Vulnerability management procedure | Found |
> | (f) | Security assessment procedures | Found |
> | (f) | Security effectiveness metrics | **Missing** |
> | (g) | Cybersecurity training plan | **Missing** |
> | (g) | Cyber hygiene guidelines | **Missing** |
> | (h) | Cryptography policy | Found |
> | (i) | HR security policy | **Missing** |
> | (i) | Access control policy | Found |
> | (i) | Asset inventory | Found |
> | (j) | MFA policy | **Missing** |
> | (j) | Authentication standards | **Missing** |
> | (k) | Secure communications policy | **Missing** |
> | Cross | Entity classification report | Found |
> | Cross | Governance approval records | **Missing** |
> | Cross | Management training records | **Missing** |
>
> 17 artifacts are missing or stale. Two critical policies (risk analysis and infosec) exist but haven't been reviewed in over a year. Let me now assess each measure. Starting with Measure (a) — Risk analysis and information security policies. Tell me more about your current risk management practices.
>
> **You:** We did a risk assessment during the ISO 27001 work. We have a risk register in a spreadsheet, but it hasn't been updated since the initial assessment. No formal review cycle. The infosec policy exists but was never formally communicated to all staff.
>
> **Agent:** Let me continue through all measures and then rate them in batch. After collecting your input on measures (b) through (k), I'll run the maturity rater.
>
> *[After collecting input on all 11 measures...]*
>
> Running `measure_maturity_rater` in batch mode for all 11 measures...
>
> **Maturity Ratings:**
>
> | Measure | Area | Maturity | Level | Confidence |
> |---------|------|----------|:-----:|:----------:|
> | (a) | Risk analysis & IS policy | Repeatable | 2 | high |
> | (b) | Incident handling | Ad-hoc | 1 | medium |
> | (c) | Business continuity & DR | Nonexistent | 0 | high |
> | (d) | Supply chain security | Ad-hoc | 1 | medium |
> | (e) | Secure development & vuln mgmt | Repeatable | 2 | medium |
> | (f) | Effectiveness assessment | Ad-hoc | 1 | medium |
> | (g) | Training & cyber hygiene | Nonexistent | 0 | high |
> | (h) | Cryptography | Defined | 3 | high |
> | (i) | HR security & access control | Repeatable | 2 | high |
> | (j) | MFA & authentication | Ad-hoc | 1 | medium |
> | (k) | Secure communications | Nonexistent | 0 | high |
>
> **Average maturity: 1.2 / 5.0** — Measures at Level 3+: 1 / 11
>
> Three measures are at Level 0, meaning no process or documentation exists. Because you are an essential entity, supervisory authorities can conduct proactive audits at any time. Let me compile the findings and build the remediation roadmap.
>
> **You:** Yes, prioritize the remediation. Keep in mind we're an essential entity with the higher fine tier.
>
> **Agent:** Running `remediation_prioritizer` on your 18 findings as an essential entity (fines of up to EUR 10M or 2% of global annual turnover, whichever is higher)...
>
> **Phase 1 — Foundation (Weeks 1-6):**
> - Update risk analysis policy and conduct formal risk assessment (Measure a) — Critical
> - Review and republish information security policy with management approval (Measure a) — Critical
> - Establish security effectiveness assessment process (Measure f) — High
> - Obtain governance body approval per Art. 20 (Cross-cutting) — Critical
>
> **Phase 2 — Core Measures (Weeks 7-18):**
> - Create NIS2-compliant incident reporting procedure with CSIRT notification (Measure b) — Critical
> - Develop and test incident response plan (Measure b) — Critical
> - Draft business continuity plan and disaster recovery procedures (Measure c) — Critical
> - Formalize supply chain security policy and supplier risk register (Measure d) — High
> - Define HR security policy and formalize access control procedures (Measure i) — High
>
> **Phase 3 — Technical Controls (Weeks 19-28):**
> - Create secure development policy and SDLC integration (Measure e) — High
> - Formalize cryptography standards documentation (Measure h) — Medium
> - Deploy MFA and document authentication standards (Measure j) — High
> - Establish secure communications policy for emergency and routine use (Measure k) — Medium
>
> **Phase 4 — Validation (Weeks 29-36):**
> - Implement cybersecurity training program with management body training per Art. 20 (Measure g) — Medium
> - Conduct effectiveness review of all implemented measures (Measure f) — Medium
> - Full assessment dry run and documentation review (Cross-cutting) — High
>
> Effort breakdown: High — 8 findings, Medium — 7 findings, Low — 3 findings.

## Sample Output Excerpt

Below is a representative excerpt from a generated NIS2 gap assessment showing the measure assessment table and findings register sections.

---

### Measure Assessment (excerpt)

| Measure | Area | Maturity | Key Findings |
|---------|------|:--------:|-------------|
| (a) | Risk analysis & IS policy | 2 | Risk register exists but stale; IS policy not reviewed or communicated in 12+ months |
| (b) | Incident handling | 1 | Basic IRP exists but untested; no CSIRT reporting procedure; no 24-hour early warning process |
| (c) | Business continuity & DR | 0 | No BCP or DR plan; backup policy exists but no recovery testing |
| (d) | Supply chain security | 1 | Major vendors assessed informally; no supplier risk register; no contractual security requirements |
| (e) | Secure development & vuln mgmt | 2 | Vulnerability scanning in place; no secure SDLC policy; patch management partially documented |
| (f) | Effectiveness assessment | 1 | Annual pentest conducted; no formal effectiveness metrics or regular assessment cycle |
| (g) | Training & cyber hygiene | 0 | No security awareness program; no cyber hygiene guidelines; no Art. 20 management training |
| (h) | Cryptography | 3 | Formal crypto policy from ISO 27001 work; encryption standards documented and communicated |
| (i) | HR security & access control | 2 | Access control policy exists; asset inventory maintained; HR security policy not formalized |
| (j) | MFA & authentication | 1 | MFA on some systems; no formal MFA policy; no continuous authentication standards |
| (k) | Secure communications | 0 | No secure communications policy; no emergency communication procedures |

### Findings Register (excerpt)

| # | Measure | Severity | Finding | Remediation Action | Phase |
|---|---------|----------|---------|-------------------|-------|
| F-001 | (a) | Critical | Risk analysis policy stale — not reviewed in 12+ months | Update risk analysis, conduct fresh risk assessment, establish annual review cycle | 1 |
| F-002 | (a) | Critical | IS policy not communicated to staff or approved by management body | Revise policy, obtain Art. 20 governance approval, distribute to all personnel | 1 |
| F-003 | (b) | Critical | No CSIRT incident reporting procedure per Art. 23 | Create 24-hour early warning, 72-hour notification, and 1-month final report procedures | 2 |
| F-004 | (c) | Critical | No business continuity plan or disaster recovery procedures | Develop BCP covering critical services; create and test DR procedures | 2 |
| F-005 | (d) | Major | No supplier risk register or supply chain security policy | Build supplier register with risk ratings; draft supply chain security policy with contractual clauses | 2 |
| F-006 | (g) | Moderate | No cybersecurity training program or Art. 20 management body training | Implement role-based training program; schedule management body NIS2 training | 4 |
| F-007 | (j) | Major | MFA not deployed organization-wide; no authentication standards documented | Define MFA policy covering all critical systems; document authentication standards | 3 |


## Extension Tools

### `mandatory_artifact_detector`

Scans the workspace for 25 expected NIS2 compliance documents across all Art. 21(2)(a)-(k) measures plus cross-cutting artifacts:

| Measure | Artifacts Checked |
|---------|-------------------|
| (a) Risk analysis & IS policy | Risk analysis policy, information security policy |
| (b) Incident handling | Incident response plan, incident reporting procedure |
| (c) Business continuity & DR | Business continuity plan, backup policy, disaster recovery procedures |
| (d) Supply chain security | Supplier risk register, supply chain security policy |
| (e) Secure development | Secure development policy, vulnerability management procedure |
| (f) Effectiveness assessment | Security assessment/audit procedures, security effectiveness metrics |
| (g) Training & cyber hygiene | Cybersecurity training plan, cyber hygiene guidelines |
| (h) Cryptography | Cryptography policy |
| (i) HR security & access control | HR security policy, access control policy, asset inventory |
| (j) MFA & authentication | MFA policy, authentication standards |
| (k) Secure communications | Secure communications policy |
| Cross-cutting | Entity classification report, governance approval records, management training records |

Reports each artifact as **found**, **missing**, or **stale** (not updated in 12+ months). Includes the matched file path and last-modified date for found artifacts.
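As a rough sketch, the detection logic might look like this. The artifact names are an illustrative subset of the expected set, and the date mapping stands in for a real workspace scan of file paths and modification times:

```python
from datetime import date, timedelta

# Illustrative subset of the expected artifacts (measure -> document names).
EXPECTED = {
    "(a)": ["risk-analysis-policy", "information-security-policy"],
    "(b)": ["incident-response-plan", "incident-reporting-procedure"],
    "(h)": ["cryptography-policy"],
}

STALE_AFTER = timedelta(days=365)  # "stale" = not updated in 12+ months


def detect_artifacts(workspace: dict[str, date], today: date) -> dict[str, str]:
    """Classify each expected artifact as 'found', 'stale', or 'missing'.

    `workspace` maps artifact name -> last-modified date for documents
    actually present; a real detector would derive this from file scans.
    """
    status = {}
    for names in EXPECTED.values():
        for name in names:
            if name not in workspace:
                status[name] = "missing"
            elif today - workspace[name] >= STALE_AFTER:
                status[name] = "stale"
            else:
                status[name] = "found"
    return status


# Mirrors the example conversation: two policies last touched in early 2024.
workspace = {
    "risk-analysis-policy": date(2024, 2, 15),
    "information-security-policy": date(2024, 3, 1),
    "incident-response-plan": date(2025, 1, 10),
    "cryptography-policy": date(2025, 3, 5),
}
report = detect_artifacts(workspace, today=date(2025, 6, 1))
# risk-analysis-policy -> stale; incident-reporting-procedure -> missing
```

The staleness cutoff and the filename-to-artifact matching are the two knobs a real scan would tune.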

### `measure_maturity_rater`

Rates maturity (0-5) for Art. 21(2) measures using keyword-based indicator matching and negative-indicator capping:

| Level | Name | Description |
|:-----:|------|-------------|
| 0 | Nonexistent | No awareness, no process, no documentation |
| 1 | Ad-hoc | Informal, reactive, person-dependent |
| 2 | Repeatable | Some consistency, basic documentation, not standardized |
| 3 | Defined | Formal processes, documented, communicated, process owner assigned |
| 4 | Managed | Measured, monitored, KPIs defined, regular review, board-level reporting |
| 5 | Optimized | Continuous improvement, benchmarked, proactive, automated |

**Batch mode** (preferred): pass all measures at once for a summary table with average maturity. **Single mode**: rate one measure at a time. Negative-indicator capping prevents inflation — for example, "no documentation" caps the score at Level 1 regardless of other positive signals.
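The capping behavior can be sketched as follows. The keyword lists and level mappings here are invented for illustration; the tool's actual indicator sets are internal:

```python
# Hypothetical indicator sets; levels follow the 0-5 scale above.
POSITIVE = {            # signal -> highest maturity level it supports
    "documented": 2,
    "process owner assigned": 3,
    "kpis defined": 4,
    "continuous improvement": 5,
}
NEGATIVE_CAPS = {       # signal -> maximum level it permits
    "no process": 0,
    "no documentation": 1,
    "never tested": 2,
}


def rate_measure(signals: list[str]) -> int:
    """Highest supported level, capped by the strictest negative indicator."""
    score = max((POSITIVE[s] for s in signals if s in POSITIVE), default=0)
    cap = min((NEGATIVE_CAPS[s] for s in signals if s in NEGATIVE_CAPS), default=5)
    return min(score, cap)


# "KPIs defined" alone would suggest Level 4, but the absence of
# documentation caps the rating at Level 1, preventing inflation.
assert rate_measure(["kpis defined", "no documentation"]) == 1
```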

### `remediation_prioritizer`

Takes a JSON findings file and produces a 4-phase remediation roadmap weighted by entity classification (essential entities face fines of up to EUR 10 million or 2% of global annual turnover, whichever is higher; important entities up to EUR 7 million or 1.4%):

| Phase | Focus | Measures | Estimated Weeks |
|-------|-------|----------|:---------------:|
| 1 — Foundation | Risk analysis, governance, effectiveness baseline | (a), (f) | 4-6 |
| 2 — Core Measures | Incident handling, BCP, supply chain, access control | (b), (c), (d), (i) | 8-12 |
| 3 — Technical Controls | Secure development, cryptography, MFA, secure comms | (e), (h), (j), (k) | 6-10 |
| 4 — Validation | Training, cyber hygiene, effectiveness review | (g), (f) review | 4-8 |

Prioritization considers measure dependencies (e.g., risk analysis must precede all other measures), severity, current maturity level, implementation effort, and entity-type fine multiplier.
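A simplified sketch of that ordering logic. The phase groupings come from the roadmap table above; the severity weights and entity multipliers are illustrative assumptions, not the tool's actual coefficients:

```python
SEVERITY_WEIGHT = {"critical": 5, "major": 4, "moderate": 3, "minor": 2, "observation": 1}
ENTITY_MULTIPLIER = {"essential": 1.5, "important": 1.0}  # assumed fine-tier weighting
PHASE_OF_MEASURE = {  # phase grouping from the roadmap table above
    "a": 1, "f": 1,
    "b": 2, "c": 2, "d": 2, "i": 2,
    "e": 3, "h": 3, "j": 3, "k": 3,
    "g": 4,
}


def prioritize(findings: list[dict], entity_type: str) -> list[dict]:
    """Order findings by roadmap phase, then by descending weighted severity."""
    mult = ENTITY_MULTIPLIER[entity_type]
    return sorted(
        findings,
        key=lambda f: (PHASE_OF_MEASURE[f["measure"]],
                       -SEVERITY_WEIGHT[f["severity"]] * mult),
    )


findings = [
    {"id": "F-004", "measure": "c", "severity": "critical"},
    {"id": "F-007", "measure": "j", "severity": "major"},
    {"id": "F-001", "measure": "a", "severity": "critical"},
]
ordered = prioritize(findings, "essential")
# F-001 (Phase 1) sorts ahead of F-004 (Phase 2) and F-007 (Phase 3)
```

Sorting by phase first encodes the dependency rule that risk analysis precedes everything else; the multiplier only matters when comparing roadmaps across entity types.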

### `gap_dashboard_builder`

Scans workspace for gap assessment outputs and builds structured dashboard data:

- **Overall readiness score** — percentage of measures at maturity Level 3 or above
- **Maturity heatmap** — visual 0-5 rating per measure with bar indicators
- **Artifact coverage** — found vs. total expected compliance documents
- **Severity distribution** — count of findings by severity (critical, major, moderate, minor, observation)
- **Remediation timeline** — phase-by-phase estimated duration with assigned measures

## Getting Started

Activate the *NIS2 Gap Assessment* skill. If you have completed the *NIS2 Entity Classification* skill, load its output — the agent uses your entity type (essential or important) to weight enforcement penalties and prioritize remediation accordingly.

Have this information ready:
- Organization profile — sector, size, services provided, Member State
- Entity classification (essential or important), or let the agent determine it
- Existing security policies, procedures, and governance documentation
- Prior audit reports, certifications, or assessments (ISO 27001, SOC 2, penetration tests)
- Current cybersecurity framework documentation and incident history
- Names and roles of people responsible for cybersecurity governance

The agent guides you through a structured workflow: gather inputs, scan for existing compliance artifacts, assess each Art. 21(2) measure, rate maturity across all 11 measures in batch, compile a findings register, prioritize remediation into a 4-phase roadmap, and generate dashboard data. Each measure gets a maturity rating with actionable next steps to reach the next level.



---

Back to [Skill Library](https://rakenne.app/skills/index.md)
