Culture Isn’t a Mug: Cyber Security Culture as a System, Not a Slogan
“Culture” might be the most misunderstood word in cyber security.
This is a quick deep dive into one of the NCSC cyber security culture principles, designed to help you understand what it actually means in plain English, why it matters in real organizations, and how to spot it in your own world.
If you’re looking for the bigger picture on NCSC culture and how to turn these principles into a real program, you might also like:
Our overview of the NCSC cyber security culture principles and why they matter
How to operationalize the NCSC culture agenda step by step
How to build a 12-month NCSC-aligned cyber security culture roadmap
Measuring cyber security culture with NCSC-aligned metrics that actually work
The NCSC Cyber Culture FAQ: 21 Questions Answered
Use this post to get your head around this principle quickly, then jump into the longer guides when you’re ready to design or evolve your culture program.
The second principle goes straight at psychological safety in security: do people feel safe to report, ask questions, and admit mistakes? NCSC's wording talks about building safety, trust, and processes that encourage openness around security issues, so that people feel able to raise concerns early (as covered by UKAuthority).
Why is this in there? Because in the real world, incidents are rarely a single “click” moment—they’re a chain of near misses, doubts, and small worries that people either share or swallow. If they’re scared of blame, reputation damage, or career consequences, they’ll keep quiet until there’s no choice. NCSC is effectively saying: you don’t just need better reporting tools, you need a culture where telling the truth about security feels safe and normal.
NCSC’s trust and openness principle is about one thing:
How safe is it for people to tell the truth when something might be wrong?
In a positive culture:
People report near misses early.
No one is humiliated for asking “basic” questions.
Leaders respond to bad news with curiosity first, blame later (if at all).
If people are scared or cynical:
Incidents are hidden or discovered late.
“Little” issues pile up into big, ugly surprises.
Staff learn that the safest strategy is to say nothing.
Your mean time to know about a problem becomes the real risk.
Ask:
When was the last time someone voluntarily reported a near miss?
After incidents, do we hear more stories of punishment or of learning?
Would a frontline employee feel safe admitting, “I just clicked something and I’m worried”?
If those answers are uncomfortable, start here.
Change the tone of incident comms: emphasize learning and recovery, not shame.
Have a senior leader share a story about their own mistake and what changed afterward.
Start celebrating early reporting in internal comms (“Caught it early = win”).
We specialize in “no-blame” tone and storytelling:
Characters who mess up, own it, and live to tell the tale
Campaigns that normalize asking for help and reporting early
Baselines that measure psychological safety and trust around security, not just knowledge