COMPLIANCE

NIS2 audit Sweden: what MCF and PTS actually examine

Vakteye Team · May 6, 2026 · 11 min read

Sweden's Cybersecurity Act (SFS 2025:1506) entered into force on January 15, 2026, and the self-registration window at MCF (Myndigheten för civilt försvar — Authority for Civil Defence, formerly MSB) closed on February 16. That means the supervisory phase has already begun. The question is no longer if an audit is coming — it's what you can produce when it does.

This article is an audit playbook. We walk through what MCF and PTS actually ask for during a NIS2 supervisory action, which of Article 21(2)'s ten security measures are technically verifiable today, and where organizational documentation takes over. Every claim here is anchored to a statute, an EDPB guideline, or a public supervisory decision.

What a NIS2 audit actually is

A NIS2 audit in Sweden is not a checklist. It is an evidence-gathering exercise. Under Article 32 of the NIS2 Directive, MCF and sector-specific authorities like PTS (Post- och telestyrelsen, the postal and telecom regulator) have the right to demand three things: documentation of your security measures under Article 21(2), proof that the measures are implemented, and proof that they are effective.

The first two are paperwork — risk analysis, incident plans, supply-chain agreements. The third is where most organizations get stuck. "Effective" means the measure actually works in production, not just that it is documented. For digital services, this typically means the supervisor demands technical evidence: configurations, logs, scan results, external auditor reports.

The difference between a policy and evidence: a policy says "we use TLS 1.2 or higher." Evidence is a current scan report showing your production domain actually rejects TLS 1.0/1.1 connections. MCF will not accept the policy without the evidence.
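That distinction can be made executable. A minimal sketch, assuming your scanner exports the protocol versions a server actually accepted during probing — the flat-list format here is hypothetical; real tools such as Qualys SSL Labs emit richer JSON:

```python
# Turn the TLS policy into a pass/fail check over scan evidence.
# The scan format (a flat list of accepted protocol names) is an
# assumption; real scanners emit richer, tool-specific output.

LEGACY_PROTOCOLS = {"SSLv2", "SSLv3", "TLSv1.0", "TLSv1.1"}

def tls_policy_holds(accepted: list[str]) -> bool:
    """True only if the server rejected every legacy protocol version."""
    return not (set(accepted) & LEGACY_PROTOCOLS)

print(tls_policy_holds(["TLSv1.2", "TLSv1.3"]))  # True: policy backed by evidence
print(tls_policy_holds(["TLSv1.0", "TLSv1.2"]))  # False: the finding a supervisor flags
```

A dated scan report plus a check like this is evidence; the policy document alone is not.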

Article 21(2): the ten measures, sorted by audit method

Article 21(2) of NIS2 lists ten categories of security measures. All are mandatory for essential and important entities. But they are tested in different ways — four are verifiable by automated website scanning, six require organizational documentation. Conflating them is one of the most common reasons a supervisory action drags on.

Technically verifiable measures (scanning suffices)

  • (d) Supply-chain security — third-party scripts, CDN dependencies, CNAME cloaking, SBOM for published applications
  • (e) Network and system security — TLS configuration, HTTP security headers, open ports, CVE exposure
  • (g) Cyber hygiene — cookie handling, consent implementation, data minimization, patch posture
  • (h) Cryptography — certificate validity, cipher suites, HSTS, mixed content

These four can be verified by an auditor from outside, without access to your internal systems. They are also the four that Vakteye's scanner maps directly to Article 21(2). When an audit begins, the auditor will often run their own scan in parallel with your evidence — if the results diverge, that is a red flag.

Organizational measures (require documentation)

  • (a) Risk analysis and information security policies
  • (b) Incident handling — procedures, escalation chains, exercises
  • (c) Business continuity and crisis management — BCP, DRP, RTO/RPO
  • (f) Strategies to assess the effectiveness of cybersecurity measures
  • (i) Personnel security, access control, asset management
  • (j) Multi-factor authentication and secured communications channels

For these six you need documented processes: written policies, an asset register, access matrices, MFA coverage reports from your identity system. No website scan verifies them. When MCF asks "how do you test that MFA works?" the answer is a report from your IdP, not a scan.
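The MFA answer, for example, reduces to a coverage calculation over an IdP export. A sketch under an assumed export format — real IdPs such as Okta or Entra ID expose richer per-user records:

```python
# Compute MFA coverage per user class from an identity-provider export.
# The record shape ("class", "mfa_enrolled") is a hypothetical
# simplification of an actual IdP export.

from collections import defaultdict

def mfa_coverage(users: list[dict]) -> dict[str, float]:
    """Percent of accounts with MFA enrolled, per user class."""
    total = defaultdict(int)
    enrolled = defaultdict(int)
    for user in users:
        total[user["class"]] += 1
        enrolled[user["class"]] += user["mfa_enrolled"]  # True counts as 1
    return {cls: round(100 * enrolled[cls] / total[cls], 1) for cls in total}

export = [
    {"class": "admin", "mfa_enrolled": True},
    {"class": "admin", "mfa_enrolled": True},
    {"class": "staff", "mfa_enrolled": True},
    {"class": "staff", "mfa_enrolled": False},
]
print(mfa_coverage(export))  # {'admin': 100.0, 'staff': 50.0}
```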

Article 23: the reporting chain every audit tests

Article 23 governs incident reporting and is the first item on MCF's 2026 supervisory checklist. The reason is simple: the deadlines are so short that an organization either has a rehearsed routine or it misses them.

  1. 24 hours — Early warning to CERT-SE via iron.mcf.se. What happened, suspected cause, whether the incident has cross-border impact.
  2. 72 hours — Full incident notification. Severity, impact, indicators of compromise (IoC), updated assessment.
  3. 1 month — Final report. Root-cause analysis, mitigation actions taken, cross-border consequences.
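The three deadlines can be derived mechanically from the detection timestamp, which is exactly what your incident-handling template should do. A sketch — note the one-month step is interpreted here as same-day-next-month, clamped to month length; verify that reading against MCF's own guidance:

```python
# Derive the Article 23 reporting deadlines from the detection time.
# The "1 month" interpretation (same day next month, clamped) is an
# assumption to be checked against the regulator's guidance.

import calendar
from datetime import datetime, timedelta

def add_month(dt: datetime) -> datetime:
    y, m = divmod(dt.month, 12)
    year, month = dt.year + y, m + 1
    day = min(dt.day, calendar.monthrange(year, month)[1])  # clamp Jan 31 -> Feb 28
    return dt.replace(year=year, month=month, day=day)

def reporting_deadlines(detected_at: datetime) -> dict[str, datetime]:
    return {
        "early_warning": detected_at + timedelta(hours=24),
        "full_notification": detected_at + timedelta(hours=72),
        "final_report": add_month(detected_at),
    }

d = reporting_deadlines(datetime(2026, 1, 31, 9, 0))
print(d["early_warning"])  # 2026-02-01 09:00:00
print(d["final_report"])   # 2026-02-28 09:00:00 (clamped)
```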

MCF will ask: when did your most recent incident occur, who raised the alert, which channel was used, and how long did it take? If the answer is "we haven't had an incident," the follow-up is: "how do you test your reporting chain?" One tabletop exercise per quarter is the minimum the regulator expects from an essential entity.

iron.mcf.se is CERT-SE's reporting portal. Verify now that the right people have accounts, that the escalation chain is documented, and that the 24-hour clock is built into your incident-handling template. That's five minutes of work that can save you a fine.

Fines: the numbers that actually bite

NIS2 follows GDPR's penalty model. The maximum is the higher of a fixed amount or a percentage of global annual turnover. For a Swedish organization with EUR 100M revenue, the caps are concrete:

Essential entities: up to EUR 10,000,000 or 2% of global annual turnover, whichever is higher. Important entities: up to EUR 7,000,000 or 1.4% of global annual turnover.

— NIS2 Directive Article 34, implemented in Cybersäkerhetslagen (SFS 2025:1506)
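The "whichever is higher" rule is simple arithmetic over the caps quoted above — these are ceilings, and the actual fine is set case by case:

```python
# The "whichever is higher" penalty model from Article 34.
# Caps as quoted above; an actual fine is decided case by case.

def fine_cap(entity: str, turnover_eur: int) -> int:
    caps = {
        "essential": (10_000_000, 0.02),   # EUR 10M or 2% of global turnover
        "important": (7_000_000, 0.014),   # EUR 7M or 1.4% of global turnover
    }
    fixed, pct = caps[entity]
    return round(max(fixed, pct * turnover_eur))

# The EUR 100M example from the article: the fixed cap still dominates,
# because 2% of 100M is only EUR 2M.
print(fine_cap("essential", 100_000_000))    # 10000000
# The percentage takes over above EUR 500M turnover:
print(fine_cap("essential", 1_000_000_000))  # 20000000
```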

But the fine is not the most serious consequence. NIS2 grants MCF the power to issue binding orders, mandate external security audits at your expense, and — for essential entities — temporarily suspend management responsibilities until measures are taken. Executives can be held personally liable under Article 20(1).

Six questions MCF will ask

Based on MCF's published supervisory plan and the patterns visible in early-2026 inquiries, a typical first-pass audit will include some variant of the following:

  1. "Show your most recent risk analysis under 21(2)(a). When was it updated, by whom, and how is the threat model linked to your measures?"
  2. "Show your incident-handling process under 21(2)(b). When was it last exercised, and what observations led to changes?"
  3. "Show how you secure the supply chain under 21(2)(d). Which third-party providers do you use, what security requirements are in your agreements, and how do you follow up?"
  4. "Show configuration evidence for your public services under 21(2)(e) and (h). TLS version, security headers, certificate lifecycle."
  5. "Show your MFA coverage under 21(2)(j). Which accounts, which factors, what coverage percentage?"
  6. "Show your continuity program under 21(2)(c). RTO, RPO, last restoration exercise."

Note the pattern: every question expects evidence, not a policy. "Show," not "describe."

What you can test yourself this week

Before an audit begins, run an internal check against Article 21(2)(d), (e), (g) and (h). You don't need an expensive consultant. Here's what you can do in an afternoon:

  • Run an external TLS scan against your production domain — Mozilla Observatory or Qualys SSL Labs satisfies 21(2)(h). Target: A or A+, no TLS 1.0/1.1.
  • Check HTTP security headers for 21(2)(e). Strict-Transport-Security, Content-Security-Policy, X-Frame-Options, Referrer-Policy.
  • List every third-party script that loads on your public site for 21(2)(d). Every non-first-party domain should have a DPA and a documented security requirement.
  • Check cookie behavior for 21(2)(g). Which cookies are set before consent? Does tracking actually stop on reject? This is where NIS2 overlaps with GDPR and LEK ch. 9 § 28.
  • Verify iron.mcf.se accounts exist for the CISO and a designated alternate. If you cannot log in within 24 hours, you are not 24-hour-ready.
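Several of the checks above lend themselves to a small script. A sketch of the header check, run against captured response headers — the sample values are illustrative, and a live check would first fetch the headers with `curl -sI` or `urllib`:

```python
# Check captured response headers against the 21(2)(e) header baseline
# named above. The sample values are illustrative only.

REQUIRED_HEADERS = [
    "Strict-Transport-Security",
    "Content-Security-Policy",
    "X-Frame-Options",
    "Referrer-Policy",
]

def missing_headers(headers: dict[str, str]) -> list[str]:
    """Header names from the baseline that the response did not set."""
    present = {name.lower() for name in headers}  # header names are case-insensitive
    return [h for h in REQUIRED_HEADERS if h.lower() not in present]

sample = {
    "Strict-Transport-Security": "max-age=63072000; includeSubDomains",
    "X-Frame-Options": "DENY",
}
print(missing_headers(sample))  # ['Content-Security-Policy', 'Referrer-Policy']
```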

If any of the above fails, that is not yet a NIS2 problem — it is a gap report you found yourself before the auditor did. That is worth a lot.

Run a 90-second NIS2 baseline scan

Vakteye's scanner checks the four technically verifiable measures in Article 21(2) and maps every finding to the statute, the fine cap, and MCF's supervisory focus. No accounts, no sales calls — just findings, anchored to evidence.


Evidence matrix: what the regulator wants per measure

Once the internal check is done, map each measure to a concrete evidence type. MCF accepts three categories: a self-produced report, an internal log, or an external verified report (preferably with cryptographic signing). An external signed report carries the most weight — that is why Vakteye produces them by default.

  • 21(2)(a) Risk analysis → ISO 27005-based document, dated within 12 months, board-approved
  • 21(2)(b) Incident handling → SOPs + at least one tabletop-exercise report per quarter
  • 21(2)(c) BCP → BCP/DRP document + most recent DR test with RTO/RPO measurement
  • 21(2)(d) Supply chain → vendor register + DPA + most recent third-party scan
  • 21(2)(e) Network/system → configuration baselines + most recent external security scan
  • 21(2)(f) Effectiveness → KPIs, scorecards, maturity assessment (CMM level)
  • 21(2)(g) Cyber hygiene → training statistics + most recent cookie/consent scan
  • 21(2)(h) Cryptography → crypto inventory + cert lifecycle + scan report
  • 21(2)(i) Personnel security → IAM report, access matrix, off-boarding logs
  • 21(2)(j) MFA → IdP report showing coverage per user class
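A register like this can also be checked for freshness automatically, before the auditor does it for you. A sketch — the register format and the maximum ages are assumptions drawn from the matrix above (risk analysis within 12 months, tabletop exercises quarterly):

```python
# Flag stale evidence against the freshness expectations in the matrix.
# Register format and per-measure maximum ages are assumptions.

from datetime import date

MAX_AGE_DAYS = {
    "21(2)(a)": 365,  # risk analysis: dated within 12 months
    "21(2)(b)": 92,   # tabletop exercise: at least once per quarter
}

def stale_evidence(register: dict[str, date], today: date) -> list[str]:
    """Measures whose most recent evidence is older than its maximum age."""
    return [
        measure for measure, produced in register.items()
        if (today - produced).days > MAX_AGE_DAYS.get(measure, 365)
    ]

register = {"21(2)(a)": date(2025, 1, 10), "21(2)(b)": date(2026, 4, 1)}
print(stale_evidence(register, today=date(2026, 5, 6)))  # ['21(2)(a)']
```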

Where Vakteye helps, and where it doesn't

Vakteye is an automated scanner, and it is honest about what it can and cannot do. It covers (d), (e), (g) and (h) of Article 21(2) — the four of the ten measures that are actually verifiable from the outside, without internal-system access.

For the six organizational measures, Vakteye produces a structured gap template in the compliance report, not evidence. You fill in your own documentation there. Anyone selling you a "complete NIS2 solution" out of an automated scanner is lying — or has misread the directive.

What Vakteye does that no GRC platform does: when we find a TLS 1.0 connection, a missing CSP header, or a cookie set before consent, we link it directly to Article 21(2) (for NIS2), GDPR Article 32 (for security of processing), and the relevant Swedish implementation. You don't just get a technical finding — you get the fine exposure.

Verifiable fact: Cybersäkerhetslagen is published as SFS 2025:1506 in the Swedish Code of Statutes. In force since January 15, 2026. The MCF self-registration window closed February 16, 2026. Supervisory activity is live from Q3 2026 onward.

Bottom line

A NIS2 audit in Sweden is not decided by what you have on paper. It is decided by whether you can show working measures with externally verifiable evidence when MCF or PTS asks. The paperwork is the foundation, but the evidence is what gets you through the audit.

The four technically verifiable measures — supply chain, network, cyber hygiene, cryptography — are where most organizations can start. They are also where an automated scanner can give you an evidence-based baseline in an afternoon. The rest is documentation and exercise, but you cannot start with the documentation if your technical baseline is broken.

NIS2 evidence, not just checklists

Vakteye delivers cryptographically signed compliance reports mapped to Article 21(2). The structure MCF auditors expect, the evidence that holds up.

