NCSC Cyber Security Culture Principles: What They Are and Why They Matter

If your cyber security “culture” lives mostly on a mug, a hoodie and an annual e-learning course… it’s not culture. It’s merchandising. 

We're glad to see the UK’s National Cyber Security Centre (NCSC) gently but firmly raising the bar. In its latest guidance, NCSC describes cyber security culture as the shared understanding of what’s normal and valued around security in your organization. That understanding is what shapes people’s day-to-day decisions, shortcuts, and “what we do when no one’s watching.”

To help organisations get a grip on this, NCSC sets out:

  • a “cyber security culture iceberg” – a way of seeing what’s really driving behavior, and

  • six cyber security culture principles – the core conditions you need if you want secure behavior to be normal, not heroic.

This article is your guide to that NCSC view of culture:

  • what it actually says

  • what each principle really means in practice

  • how to start using it as a lens for your own organization

…and, at the end, where we at Cybermaniacs strongly agree and where we push the thinking further with our HumanOS™ / “culture as a system” approach.


What Does NCSC Mean by “Cyber Security Culture”?

Let’s get the definition clear first, because this is where a lot of programs already go sideways.

NCSC’s view of cyber security culture is roughly:

The collective understanding of what’s normal, expected and valued when it comes to cyber security – and how that plays out in everyday behavior.

That includes things like:

  • whether people feel safe admitting mistakes,

  • whether security is seen as a partner or a blocker,

  • what people think will happen if they report a near miss, and

  • how leaders talk and act when there’s a security issue.

It’s not just “what people know about security.” It’s the environment they operate in.

Why culture has moved to centre stage

There are a few reasons this is now front-and-centre in official guidance:

  • Most incidents involve humans somewhere in the chain.
    Social engineering, phishing, misdirected emails, misconfigurations, the classic “I’ll just send this to my personal account so I can work from home.”

  • Controls live inside culture.
    Even the best technical control is only as strong as the people who design, use, and work around it.

  • Boards and regulators want to see more than “we trained them.”
    They’re asking: What kind of culture do you have? How do you know? What are you doing about it?

NCSC’s message, translated:

You don’t just have a “tech stack” – you have a human stack as well. Culture is the operating system for that human stack.


The NCSC Cyber Security Culture Iceberg

One of the most common metaphors for explaining how culture works – one we’ve used before in our own advisory work, and which shows up as a useful idea in the guidance – is the “culture iceberg.”

Think of everything you do around security as an iceberg:

  • Above the waterline (visible tip = explicit)

    • training and awareness courses

    • phishing simulations and scores

    • policy documents and intranet pages

    • posters, campaigns, security week, swag

    • incident stats and audit reports

  • Below the waterline (the massive bit = implicit)

    • trust and psychological safety

    • norms and unwritten rules (“what people like us do here”)

    • incentives and workloads

    • leadership signals (“what gets rewarded, what gets ignored”)

    • stories people tell about past incidents

    • how easy or painful it is to “do the secure thing”

Most organisations spend 90% of their energy above the waterline. NCSC is politely shouting:

“If you don’t work on the submerged part, you’ll keep getting the same outcomes – no matter how good your slides and phishing tests look.”

At Cybermaniacs, we think of that submerged layer as a stack of three things that all interact:

  • The HumanOS – the messy, wonderful operating system between people’s ears: habits, biases, attention, emotion, curiosity, fatigue, fear.

  • Cyber Safety & Digital Risk Culture – this is our model of how you actually do security day to day: the way work is designed, how decisions get made, how people learn, how incidents are handled, how “secure ways of working” show up in real workflows.

  • Organisational Dynamics – this model tells you who you are as an organization: structure, power dynamics, values, history, politics, leadership style. This is the deep context that means cyber culture is never one-size-fits-all; what works in a high-reliability public service won’t look the same as in a hyper-growth startup.

Put together, that submerged layer is:

HumanOS + how you “do” security + who you are as an organization.

The iceberg is a neat way to remind everyone that unless you work on all three of those deeper layers, you’ll keep fighting the same fires at the surface, no matter how many courses or campaigns you run.


The Six NCSC Cyber Security Culture Principles (in Plain English)

NCSC’s six culture principles are essentially six conditions you need if you want secure behavior to be the norm.

Here they are in simple terms, with what “good” looks like in practice.


Principle 1 – Cyber as an Enabler, Not a Blocker

Idea:
Security should help the organization achieve its mission safely, not just say “no” from the sidelines.

What good looks like:

  • Security is involved early in projects and change, not as a grudging sign-off at the end.

  • When people think of the security team, they think “How can we do this safely?”, not “Here comes the team of no.”

  • Security requirements are framed in terms of protecting outcomes – revenue, patients, citizens, services, reputation – not in vague technical or policy terms.

If this principle is weak, you see:

  • Shadow IT and workarounds.

  • Late, grudging security involvement.

  • An unhelpful “business vs security” narrative.


Principle 2 – Trust, Psychological Safety and Openness

Idea:
People must feel able to raise issues, report mistakes and talk about risk openly without fear of blame or punishment.

What good looks like:

  • People report suspicious emails, near misses and “I think I messed up…” early, not after a cover-up.

  • Incident reviews focus on learning, not witch hunts.

  • Leaders and managers model vulnerability – they are willing to say “I got this wrong” or “I nearly fell for that.”

If this principle is weak, you see:

  • Low reporting of near misses.

  • Incidents discovered late via rumor or audit.

  • A “shoot the messenger” attitude to bad news.


Principle 3 – Adapt and Learn from Change

Idea:
Culture should support continuous learning and adaptation as threats, technology and business needs change.

What good looks like:

  • New tech (like GenAI or new SaaS tools) triggers early conversations about safe use, not panicked reactions after something goes wrong.

  • Lessons from incidents and near misses are fed back into updated guidance, processes and training, not buried in PDFs.

  • People see security as something that evolves, not a static rulebook from 2017.

If this principle is weak, you see:

  • Policies and training that feel out-of-date.

  • The same kinds of incidents repeating, because no one learned from the last one.


Principle 4 – Social Norms that Support Secure Behavior

Idea:
What really drives behavior is often social norms – “what people like us do here” – not formal rules.

What good looks like:

  • It feels normal in your organization to:

    • lock screens,

    • challenge odd requests,

    • report suspicious messages,

    • use approved tools.

  • When someone spots a risk, colleagues support them, not roll their eyes.

  • Teams have local norms that reinforce secure behaviors (e.g. “we always check with each other before changing payment details”).

If this principle is weak, you see:

  • Workarounds treated as a badge of honor (“We all share logins, it’s just easier.”)

  • People teasing or isolating colleagues who “make a fuss” about security.

  • A big gap between “what the policy says” and what actually happens.


Principle 5 – Leaders Own Their Impact on Culture

Idea:
Leaders at every level shape culture through what they say, do, allow and ignore. They don’t get to opt out.

What good looks like:

  • Leaders take their own security responsibilities seriously (no “skip training” passes).

  • Leaders talk about security in terms of business risk and resilience, not just compliance.

  • Security culture indicators show up in board and executive conversations – not just in IT risk reports.

If this principle is weak, you see:

  • Senior people seeking exemptions from basic controls.

  • Mixed messages: “Security is important… but just get this deal done.”

  • Culture initiatives run by security or HR with no visible senior backing.


Principle 6 – Clear, Usable, Flexible Rules and Guidance

Idea:
Security rules and guidance should be easy to understand, easy to follow and adaptable enough to fit real work.

What good looks like:

  • People can quickly answer: “What am I supposed to do in situation X?”

  • Security processes (e.g. access, approvals, exception handling) are simple, timely and transparent.

  • Policies are written in plain language, with examples, decision trees and FAQs – not just dense PDFs.

If this principle is weak, you see:

  • People ignoring or bypassing controls because they’re confusing or slow.

  • “Because audit said so” as the only explanation for rules.

  • Shadow processes and tools that feel easier than the secure path.


How to Use the NCSC Culture Principles (Without Drowning in Them)

The point of these principles is not to give you six more boxes to tick. They’re meant to be a lens you can use to interrogate your current reality and design better interventions.

Here are some practical ways to start using them:

1. Use them as a diagnostic checklist

Take a recent incident, near miss, or frustrating pattern (like constant workarounds) and ask:

  • Which principle(s) were lacking here?

  • Was this really about “people not following rules”… or about:

    • unclear guidance (P6),

    • lack of trust in reporting (P2),

    • leaders sending mixed signals (P5), or

    • security seen as blocker so people went around it (P1)?

You’ll almost always find that multiple principles are in play.

2. Map your current activities to the principles

Grab your existing activities:

  • annual training and refreshers

  • phishing and simulations

  • policies and intranet content

  • internal campaigns, security week, town halls

Now map them:

  • Which principle(s) does each activity actually support?

  • Where are you overloading (e.g. tons of activity above the waterline) vs underinvesting (e.g. nothing that touches leadership behaviors or process usability)?

This often reveals big blind spots – for example, many programs do a lot for awareness but almost nothing for trust and openness.
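
If you want to keep this mapping somewhere more durable than a whiteboard, even a tiny script will do. Here’s a minimal, hypothetical sketch in Python – the activity names and principle mappings are invented examples, not an NCSC artifact – that flags any principle nothing currently supports:

```python
# Hypothetical example: map existing activities to the NCSC principles
# they actually support, then flag principles with no coverage.
# Activity names and mappings below are illustrative, not prescriptive.

PRINCIPLES = {
    1: "Cyber as an enabler, not a blocker",
    2: "Trust, psychological safety and openness",
    3: "Adapt and learn from change",
    4: "Social norms that support secure behavior",
    5: "Leaders own their impact on culture",
    6: "Clear, usable, flexible rules and guidance",
}

# Which principle(s) does each activity actually support?
activities = {
    "annual training and refreshers": [3],
    "phishing simulations": [4],
    "policies and intranet content": [6],
    "security week / town halls": [4, 5],
}

covered = {p for plist in activities.values() for p in plist}
for number, name in PRINCIPLES.items():
    status = "covered" if number in covered else "NO COVERAGE"
    print(f"P{number} {name}: {status}")
```

Even this toy version makes the blind spot visible: in the example data, nothing touches security-as-enabler (P1) or trust and openness (P2).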

3. Turn the principles into outcomes and metrics

For each principle, write:

  • 2–3 outcome statements (e.g. “People feel safe reporting cyber mistakes promptly.”)

  • 1–3 ways you could measure progress:

    • survey questions,

    • behavioral metrics (e.g. reporting rates),

    • process metrics (e.g. time to get a secure path approved).

You now have the beginnings of an NCSC-aligned culture scorecard.
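
To make that concrete, here’s a minimal, hypothetical sketch of what one row of such a scorecard could look like as a data structure, with one computed behavioral metric (phish reporting rate). The outcome statements, survey wording and figures are invented for illustration; yours should come from your own baseline:

```python
# Hypothetical NCSC-aligned culture scorecard sketch.
# Outcome statements, survey items and figures are invented examples.
from dataclasses import dataclass, field

@dataclass
class PrincipleScore:
    principle: str
    outcomes: list[str]
    survey_items: list[str]
    metrics: dict[str, float] = field(default_factory=dict)

# Example behavioral metric: what share of simulated phish were reported?
simulated_phish_sent = 400      # illustrative figure
simulated_phish_reported = 92   # illustrative figure
reporting_rate = simulated_phish_reported / simulated_phish_sent

p2 = PrincipleScore(
    principle="P2: Trust, psychological safety and openness",
    outcomes=["People feel safe reporting cyber mistakes promptly."],
    survey_items=["If I clicked a bad link, I would report it the same day."],
    metrics={"phish_reporting_rate": round(reporting_rate, 2)},  # 0.23
)

print(p2)
```

Tracked quarter over quarter, even one number like that tells you whether the trust picture (P2) is actually moving – which is far more useful to a board than course completion rates.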


Where We Strongly Agree with NCSC… and Where We Push Further

We spend all day living and breathing cyber security culture, so here’s our honest take on the NCSC guidance.

What NCSC absolutely nails

We think NCSC has got some big things exactly right:

  • Culture is not just training.
    NCSC is explicit that controls and awareness sit inside a wider culture. That’s a huge step forward.

  • The iceberg metaphor.
    It’s a simple way to explain to executives why “more posters and modules” won’t fix underlying issues.

  • Clear focus on trust, leadership and usability.
    The principles on psychological safety, leadership and usable rules highlight the parts organisations usually avoid, because they’re harder than buying another tool.

  • Principles, not prescriptions.
    The guidance doesn’t try to give you a one-size-fits-all checklist. It gives you a way of thinking about your culture, which is exactly what you need in complex environments.

Where we push further at Cybermaniacs

We’re totally aligned on the direction of travel – and we extend it in a few key ways:

1. From “culture” to HumanOS™ and the psychological perimeter

We think of humans as endpoints with their own operating system (the HumanOS):

  • habits

  • biases

  • emotions

  • social wiring

  • energy levels

  • stories they believe

Attackers target this OS, your cognitive security layer, directly. Your culture is the configuration of that OS at scale.

So we design interventions that:

  • reduce cognitive load,

  • play to human strengths (curiosity, humor, storytelling),

  • and recognize the psychological perimeter is where the real contest happens – not just at a firewall.

2. From abstract culture to culture as a system

NCSC’s principles implicitly point to a systems view. We make that explicit. Culture is not a vibe; it’s a system made up of:

  • norms and expectations

  • incentives and consequences

  • processes and tools

  • stories and symbols

  • leadership signals

If you change only one part (like adding a poster) and ignore the rest, the system snaps back – especially under pressure. Ironically, that snap-back is a kind of resilience; it’s just pointed in the wrong direction.

So our work always asks:

“What does the system reward? What does it punish? What does it ignore?”

…and we design programs that change the system, not just the slogans.

3. From awareness programs to Human Engagement, Risk and Resilience Operations (HERRO)

We love a good piece of creative content, (see: puppets) but we don’t stop there.

What you might not know about us is that our strategic advisory and program services help organisations move from:

  • “We run training and phishing” to

  • “We run continuous human risk operations at scale.”

That means:

  • baselining human risk and culture,

  • prioritizing by business impact,

  • designing targeted interventions,

  • measuring behavioral and cultural shifts, and

  • closing the loop with leadership and the board.

Think: SOC for human risk, not just “LMS admin.”


Key Takeaways and Next Steps

If you’ve made it this far, here’s the short version to carry into your next internal conversation:

  • NCSC cyber security culture is about the shared expectations, norms and behaviors around security – not just training.

  • The culture iceberg tells you that if you only work on the visible tip (courses, comms, phish tests), you’ll keep tripping over the submerged bit (trust, norms, incentives, leadership, process).

  • The six NCSC culture principles give you a powerful lens to:

    • diagnose where things go wrong,

    • design better interventions, and

    • measure progress over time.

  • To really make use of them, you need to treat culture as a system:

    • connect tech, people, process, stories and leadership,

    • align them with your mission and risk landscape,

    • and run human risk as an ongoing operation, not a one-off awareness blitz.

What you can do in the next 30 days

Pick one or two of these:

  1. Run a quick culture pulse with a handful of NCSC-aligned questions about trust, leadership and usability. (We can help, ask us how.)

  2. Take a recent incident and map it against the six principles – what was really driving it?

  3. Map your current programs (training, phishing, comms) to the principles and see where you’re thin.

  4. Sketch a rough 12-month NCSC culture roadmap: baseline → quick wins → deeper integration → assurance.

And if you’d like a partner in crime to help with the heavy lifting – baselines, playbooks, creative content and ongoing human risk ops – that’s literally what we do all day at Cybermaniacs. Get in touch! 
