Measuring Cyber Security Culture: NCSC-Aligned Metrics That Actually Work

It's been a long time coming in cybersecurity, but I think we can safely say that everyone (finally!) agrees that culture matters. Yet in our work with organizations around the world, we've found that very few can measure it in a way that stands up in front of a CFO, a regulator, or the board.

The UK’s National Cyber Security Centre (NCSC) talks about cyber security culture as the collective understanding of what’s normal and valued in the workplace with respect to cyber security – something that shapes behavior, trust, collaboration, and learning.

It also emphasizes that building this culture isn’t a one-shot project; you need feedback loops and metrics so you can see what’s working, adjust, and sustain progress over time. Great, we agree. However... 

The problem is:

  • Most orgs track activity, not culture.

  • Metrics live in silos – LMS, phishing platform, service desk, HR – and nobody connects the dots.

  • Boards get presented with either vanity numbers (“98% completion!”) or a firehose of detail.

Good news, we're here to help. This article is a practical guide to measuring cyber security culture in a way that:

  • matches NCSC’s culture principles and “culture iceberg” view

  • connects HumanOS, your cyber safety & digital risk culture model, and your organizational dynamics model 

  • gives CISOs and awareness leads metrics they can actually manage and explain

We’ll cover:

  1. Why traditional “awareness metrics” aren’t enough

  2. A simple 3-layer measurement stack

  3. Example NCSC-aligned metrics (with concrete questions and KPIs)

  4. How to build a culture scorecard that an exec can understand

  5. How to start small in 90 days

Why “Awareness Metrics” Aren’t Enough

If your culture reporting currently consists only of training completion rates, phishing click and report rates, and maybe a few policy exceptions, then you’re mostly seeing the tip of the iceberg that NCSC talks about.

Those numbers matter, but they don’t tell you:

  • if people feel safe reporting mistakes,

  • if leaders model secure behavior,

  • if people see security as an enabler or a blocker,

  • if your processes and policies make secure choices easier or harder,

  • if social norms support or undermine secure behavior.

In other words, they don’t tell you whether the conditions for an empowered security culture are actually present. We call these enabling factors.

At Cybermaniacs, we think of the Human Risk ecosystem (for lack of a better term) as a three-part stack:

  • HumanOS – your psychology, knowledge, and behavioral layer: the attitudes, attention, emotions, habits, and mental models within your workforce today.

  • Cyber Safety & Digital Risk culture model – how you “do” security: processes, workflows, incident handling, learning loops

  • Organizational dynamics model – structure, incentives, power, history, leadership style – all the things that make your company unique in how it runs, competes, and provides value.

If your metrics only look at outputs (e.g., “how many people passed training”), you’re blind to what’s happening in that stack.

NCSC’s culture principles are basically saying:

“Stop just measuring the outputs. Start measuring the conditions that drive secure behavior.”

So let’s do that.


A Simple 3-Layer Measurement Stack (NCSC-Friendly)

You don’t need an army of data scientists. (Yes, it would be nice to have them, but let’s be honest: most HRM teams are struggling to find time, team resources, budget, and tools.) You can start small and look at culture from three angles:

  1. Perception & Climate – how people feel and what they believe

  2. Behavior – what people actually do

  3. Operations & Structure – how the system is set up around them

These layers sit right on top of the culture iceberg concept NCSC uses: visible behaviors and processes above the waterline, deeper beliefs, norms, and experiences below it.

Let's walk through each layer and then pull it together into a scorecard.


Layer 1: Perception & Climate Metrics

Perception metrics are often dismissed as “soft,” but they’re essential. They tell you what the HumanOS and social climate look like right now.

Key NCSC-aligned areas to measure:

1. Trust & Psychological Safety (Principle: Safety, Trust & Openness)

Sample pulse questions (Likert scale: strongly disagree → strongly agree):

  • “I feel safe admitting a cyber-related mistake at work.”

  • “If I reported a cyber incident or near miss, I believe I would be treated fairly.”

What to look for:

  • Baseline levels by team/function

  • Trends quarter over quarter

  • Gaps between teams

These questions line up directly with NCSC’s call for safety, trust, and open reporting as foundations for healthy cyber culture.
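
To make this concrete, here’s a minimal sketch in Python of how the baseline, trend, and gap views above can fall out of raw pulse data. The teams, scores, and the “agree = 4 or 5 on a 5-point scale” threshold are illustrative assumptions, not a prescribed schema:

```python
# A minimal sketch (made-up data): turning raw Likert responses into
# "% agree" baselines, quarter-over-quarter trends, and team gaps.
# Each tuple is (team, quarter, score), where score is a 1-5 rating for
# "I feel safe admitting a cyber-related mistake at work".
responses = [
    ("HQ", "2025-Q2", 4), ("HQ", "2025-Q2", 5), ("HQ", "2025-Q2", 2),
    ("Call Center", "2025-Q2", 2), ("Call Center", "2025-Q2", 3),
    ("HQ", "2025-Q3", 5), ("HQ", "2025-Q3", 4), ("HQ", "2025-Q3", 4),
    ("Call Center", "2025-Q3", 3), ("Call Center", "2025-Q3", 4),
]

def pct_agree(scores, threshold=4):
    # Share of responses at or above "agree" (4 = agree, 5 = strongly agree).
    return 100 * sum(s >= threshold for s in scores) / len(scores)

# Baseline by team and quarter.
cells = {}
for team, quarter, score in responses:
    cells.setdefault((team, quarter), []).append(score)

for (team, quarter), scores in sorted(cells.items()):
    print(f"{team} {quarter}: {pct_agree(scores):.0f}% agree (n={len(scores)})")

# Quarter-over-quarter movement per team: the gap between teams is often the insight.
for team in sorted({t for t, _, _ in responses}):
    delta = pct_agree(cells[(team, "2025-Q3")]) - pct_agree(cells[(team, "2025-Q2")])
    print(f"{team}: {delta:+.0f} pts QoQ")
```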


2. Security as Enabler vs Blocker (Principle: Cyber as Enabler)

Questions:

  • “In my experience, our security processes help me do my job safely.”

  • “When I involve security early, it makes it easier to deliver good outcomes.”

If the responses skew negative, it’s a red flag for NCSC’s first principle: security may be perceived more as a compliance anchor than an enabler of the organization’s goals.


3. Leadership Signals (Principle: Leaders Own Their Impact)

Questions:

  • “Leaders in my area lead by example on security behaviors.”

  • “When deadlines and security requirements conflict, leaders support doing the secure thing.”

This gives you a climate view of NCSC’s leadership principle: are leaders actually seen as part of the culture solution, or as people who quietly exempt themselves?


4. Clarity and Usability of Rules (Principle: Clear, Usable Guidance)

Question:

  • “I know where to find clear guidance on what to do in common cyber-related situations.”

This tells you whether your policy and process design are supporting or sabotaging secure behavior – a key piece of NCSC’s guidance.


Layer 2: Behavior Metrics

Perception is what people say and feel. Behavior is what they do under real conditions.

Useful NCSC-aligned behavior metrics include:

1. Reporting Behavior

  • Volume of phishing reports, suspicious emails, policy concerns, near-miss self-reports.

Metrics to track:

  • Ratio of self-reported incidents vs incidents discovered by tooling/audit

  • Time from event → first report

These map directly to a culture where people feel safe and see themselves as part of protecting the organization, not just as potential culprits.
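
Both reporting metrics are easy to compute once each incident carries a “how was this found” tag and timestamps. A minimal sketch with made-up records (the field names are assumptions, not any particular tool’s schema):

```python
from datetime import datetime
from statistics import median

# Made-up incident records: how each one surfaced, and when it was first
# reported relative to the underlying event.
incidents = [
    {"source": "self_report", "event": "2025-07-01 09:00", "reported": "2025-07-01 10:30"},
    {"source": "tooling",     "event": "2025-07-03 14:00", "reported": "2025-07-05 08:00"},
    {"source": "self_report", "event": "2025-07-10 11:00", "reported": "2025-07-10 11:20"},
]

FMT = "%Y-%m-%d %H:%M"

# Ratio of self-reported incidents vs those discovered by tooling/audit.
self_reported = sum(i["source"] == "self_report" for i in incidents)
print(f"Self-reported share: {self_reported / len(incidents):.0%}")

# Time from event to first report, in hours (the median is less noisy than the mean).
delays_h = [
    (datetime.strptime(i["reported"], FMT) - datetime.strptime(i["event"], FMT)).total_seconds() / 3600
    for i in incidents
]
print(f"Median event-to-report delay: {median(delays_h):.1f} hours")
```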

2. Behavior in Simulations (With Context)

Phishing simulations, social engineering tests, and scenario-based exercises are useful if you don’t over-interpret them.

Metrics:

  • Click rates, credential submission rates

  • Report rates and time-to-report

  • Improvement by cohort over time

Critically, interpret these in context:

  • A fall in click rate plus a rise in reports is a good sign.

  • A fall in click rate plus no change in reporting might just mean people are gaming the test or the phish is too obvious.
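
Encoding that interpretation as a simple rule keeps quarterly readings consistent. A minimal sketch; the field names, thresholds, and wording are illustrative, not from any particular phishing platform:

```python
# Compare two quarters of simulation results for one cohort. A falling click
# rate only counts as a clear win when reporting rises alongside it.
def interpret_cohort(prev, curr, min_shift=0.02):
    clicks_down = curr["click_rate"] < prev["click_rate"] - min_shift
    reports_up = curr["report_rate"] > prev["report_rate"] + min_shift
    if clicks_down and reports_up:
        return "Good sign: people are spotting AND reporting the phish."
    if clicks_down:
        return "Ambiguous: fewer clicks, flat reporting - maybe gaming, or an obvious phish."
    if reports_up:
        return "Mixed: reporting culture improving, detection still lagging."
    return "Concerning: no movement on either axis."

q2 = {"click_rate": 0.12, "report_rate": 0.20}
q3 = {"click_rate": 0.07, "report_rate": 0.21}
print(interpret_cohort(q2, q3))  # lands in the ambiguous bucket
```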

3. Everyday Secure Behaviors

Depending on your tooling and privacy constraints, you may be able to see:

  • MFA adoption and usage

  • Password manager adoption

  • Secure file-sharing usage vs shadow tools

  • Use of approved collaboration platforms

This is where your Cyber Safety & Digital Risk culture model comes in: you’re not just counting log events; you’re asking,

“Are people using the secure way of doing X that we designed, or are they inventing their own?”
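
If your logs distinguish approved routes from shadow ones, that question becomes countable. A tiny, purely illustrative sketch (the teams and events are made up):

```python
# Share of file-sharing events going through the approved route vs shadow
# tools, split by team. Data is invented for illustration.
events = [
    ("Finance", "approved"), ("Finance", "shadow"), ("Finance", "approved"),
    ("Product", "approved"), ("Product", "approved"), ("Product", "shadow"),
]

by_team = {}
for team, route in events:
    by_team.setdefault(team, []).append(route)

for team, routes in sorted(by_team.items()):
    share = routes.count("approved") / len(routes)
    print(f"{team}: {share:.0%} of file shares via the approved route")
```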


4. Evidence of Learning and Adaptation

NCSC stresses continuous learning and adaptation as cyber threats evolve. Fantastic, so do we. 

You can measure this behaviorally by tracking:

  • % of incidents and near misses that lead to:

    • an update in guidance,

    • a process change,

    • a training or comms tweak.

  • Time from incident → lessons captured → visible change shipped.

If you see lots of incidents but very few changes, your culture is stuck in “copy-paste last year’s playbook” mode.
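
If your incident tooling can tag whether a visible change shipped, both learning metrics fall out of a few lines. A rough sketch with made-up records:

```python
from datetime import date
from statistics import median

# Made-up incident log: did each incident or near miss lead to a visible
# change (guidance update, process change, training or comms tweak), and when?
incidents = [
    {"occurred": date(2025, 7, 2),  "change_shipped": date(2025, 7, 20)},
    {"occurred": date(2025, 7, 15), "change_shipped": None},  # nothing shipped yet
    {"occurred": date(2025, 8, 1),  "change_shipped": date(2025, 8, 9)},
]

with_change = [i for i in incidents if i["change_shipped"]]

# % of incidents and near misses that led to a visible change.
print(f"Incidents with a shipped change: {len(with_change) / len(incidents):.0%}")

# Time from incident to visible change, in days.
lags = [(i["change_shipped"] - i["occurred"]).days for i in with_change]
print(f"Median incident-to-change lag: {median(lags)} days")
```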


Layer 3: Operations & Structural Metrics

This is the bit most culture discussions ignore, and NCSC implicitly cares about through its principles and broader cyber governance mapping.

Here, you’re asking: How is the system wired? Are our structures, processes, and incentives aligned with the culture we say we want?

Examples:

1. Early Security Engagement in Projects

2. Process Friction & Exceptions

3. Leadership and Governance

4. HR & People Integration
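
None of these need fancy tooling to start tracking. As a purely illustrative sketch, here’s how the first two might fall out of a simple project register; the fields, phases, and thresholds are assumptions, not a prescribed schema:

```python
# Made-up project register: when security first got involved, and how many
# open policy exceptions each project carries.
projects = [
    {"name": "Payments revamp", "security_engaged": "kickoff", "open_exceptions": 0},
    {"name": "HR portal",       "security_engaged": "build",   "open_exceptions": 3},
    {"name": "Data lake",       "security_engaged": "design",  "open_exceptions": 1},
]

# 1. Early engagement: % of projects with security in the room by design time.
early = [p for p in projects if p["security_engaged"] in ("kickoff", "design")]
print(f"Early security engagement: {len(early) / len(projects):.0%} of projects")

# 2. Process friction: open exceptions as a proxy for rules people route around.
total = sum(p["open_exceptions"] for p in projects)
print(f"Open policy exceptions: {total} total ({total / len(projects):.1f} per project)")
```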


Putting It Together: An NCSC-Aligned Culture Scorecard

Now let’s assemble these pieces into something you can actually show to leadership.

Think of a one-page scorecard with four sections:

  1. Headline culture narrative (3–4 bullets)

  2. Principle-level view (NCSC lens)

  3. Key metrics (perception, behavior, operations)

  4. Actions & experiments (what you’re doing about it)

1. Headline Culture Narrative

Example:

  • Trust and reporting culture is improving, but still fragile in frontline operations.

  • Security is increasingly seen as a partner in product and tech teams, but as a blocker in finance. 

Short, honest, data-backed.

2. Principle-Level View

For each NCSC principle, you show:

  • Status (e.g., green/amber/red or trending arrows)

  • 2–3 supporting metrics

  • One key story or insight

For example, Principle: Build Safety, Trust & Processes to Encourage Openness:

  • Status: Amber, trending up

  • Metrics:

    • 68% agree “I feel safe admitting a cyber mistake” (+10 pts YoY)

    • Near-miss self-reports up 35% YoY

    • 75% of incidents now have a documented “what we changed” action

  • Insight:

    • Still a big gap between HQ (75% agreement) and call centers (52% agreement)

This format makes it very clear how you’re using NCSC’s principles as a measurement lens, not just a poster.


3. Key Metric Table

You don’t need 50 metrics. For execs, pick ~10–15 that best tell the story.

Example:

| Area | Metric | Q2 2025 | Q3 2025 | Direction |
| --- | --- | --- | --- | --- |
| Perception | “I feel safe admitting a cyber mistake” – % agree | 55% | 63% | ↑ |
| Behavior | Near-miss reports per 100 employees (quarter) | 2.1 | 3.4 | ↑ |
| Operations | Average time to approve secure vendor access (days) | 18 | 11 | ↓ |

The arrows make it easy to see where your interventions are working.
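
If the underlying numbers live in one place, the table (arrows included) can generate itself. A minimal sketch mirroring the rows above; the only judgment you encode is whether “up” means better for each metric:

```python
# (area, metric, Q2 value, Q3 value, higher_is_better). For approval time,
# fewer days is an improvement, so higher_is_better is False.
rows = [
    ("Perception", '"I feel safe admitting a cyber mistake" - % agree', 55, 63, True),
    ("Behavior",   "Near-miss reports per 100 employees (quarter)",     2.1, 3.4, True),
    ("Operations", "Avg time to approve secure vendor access (days)",   18,  11,  False),
]

def arrow(q2, q3, higher_is_better):
    # Derive the direction glyph and whether the movement is an improvement.
    if q3 == q2:
        return "→ steady"
    improving = (q3 > q2) == higher_is_better
    glyph = "↑" if q3 > q2 else "↓"
    return f"{glyph} {'improving' if improving else 'worsening'}"

print(f"{'Area':<11} {'Metric':<50} {'Q2':>5} {'Q3':>5}  Direction")
for area, metric, q2, q3, hib in rows:
    print(f"{area:<11} {metric:<50} {q2:>5} {q3:>5}  {arrow(q2, q3, hib)}")
```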


4. Actions & Experiments

Metrics are pointless if they don’t drive action. For each principle or problem area, list 1–2 current or upcoming experiments. For example:

  • Trust & Openness:

    • Shifted incident comms to emphasize learning; piloting “near miss of the month” recognition in Customer Support.

  • Usable Rules:

    • Redesigned data-sharing process with user input; measuring completion time and satisfaction in Q4.

  • Leadership Ownership:

    • Running a leadership simulation focused on cyber culture trade-offs in Q1 next year.

This shows NCSC, auditors, and internal stakeholders that you’re treating metrics as feedback for a learning system, not just scorekeeping.


How to Start Measuring in 90 Days (Without Boiling the Ocean)

You don’t need a massive transformation to begin. Here’s a 90-day starter plan. (And if you need help jumpstarting this program, give us a call.)

Days 1–30 – Define Your Lens and Questions

  • Pick 2–3 NCSC principles to focus on first (e.g., Trust & Openness, Leadership, Usable Rules).

  • Draft 2–3 perception items per principle, plus a shortlist of behavioral and operational metrics you already have data for (e.g., incident patterns, phish data, process times).

Days 31–60 – Run a Pulse and Build a Rough Scorecard

  • Launch a short culture pulse survey (10–15 questions) to a representative subset or the whole org.

  • Pull the supporting behavior and operations data into one place.

  • Build a rough version of the NCSC-aligned scorecard.

Focus on direction and insights, not perfection.

Days 61–90 – Share, Learn, and Tweak

  • Share early findings with your security culture working group, CISO / security leadership, and HR / People partners.

  • Agree on 3–5 metrics you’ll track every quarter, 2–3 experiments to run in response, and a regular quarterly review slot in governance.

  • From there, you iterate: measure → learn → adjust → measure again. (That's operations!) 


How Cybermaniacs Fits In (Optional, But Handy)

We usually help organizations that:

  • are under pressure (from leadership, regulators, incidents) to show they’re treating culture seriously, and

  • don’t have the time / internal expertise to design HumanOS + cultural + structural metrics from scratch.

We typically plug in by:

  • Designing and running a full culture baseline or lighter pulse surveys (mapped explicitly to the culture principles and the iceberg).

  • Combining survey, behavioral, and process data into a coherent HumanOS + cyber safety & digital risk + org dynamics picture.

  • Building the first culture scorecard and coaching you on how to use it with leadership and the board.

  • Helping you connect metrics to interventions and experiments, not just dashboards.

The aim: help you move from “we track a few awareness stats” to “we run a measured cyber culture system aligned with NCSC.”


Key Takeaways

  • NCSC’s cyber security culture principles are not just a design guide – they also give you a measurement lens for perception, behavior, and structure.

  • Good culture metrics look at:

    • how people feel and think (climate and trust),

    • what they actually do (reports, behaviors, learning),

    • how the system is wired (process friction, leadership, HR integration).

  • You don’t need dozens of metrics; you need a small, stable set that links clearly to the NCSC principles and your own HumanOS/culture/dynamics models.

  • The real value isn’t the numbers themselves, but the conversations and decisions they unlock: where to focus, what to fix, and how to prove progress.

Start small. Measure what matters. And treat every graph and trend as a clue about the human operating system your controls live inside.


 
