TL;DR
While organisations pour billions into cyber security technologies, they remain vulnerable because they've built their defences on unstable foundations.
Jane Frankland MBE (Cyber Security Influencer, Advisor, and Author) and Dan Haagman (CEO of Chaleit) argue that true cyber resilience requires a radical shift from technology-first to people-first methods. Jane captures this in what she terms the "Cyber Maslow Hierarchy": a five-layer framework that begins with leadership and culture, then progresses through governance, risk and compliance (GRC), defence, and community.
Unlike aviation, where technology is specifically designed to liberate human cognitive capacity for critical decision-making, cyber security tools overwhelm and distract rather than empower, creating cosmetic metrics while leaving organisations unprepared for real incidents. Organisations need deliberate practice, psychological safety, and the recognition that in an AI-accelerated world, humans remain invaluable.
Context: The resilience paradox in cyber security
Despite unprecedented investment in cyber security technologies, organisations remain systemically vulnerable to attacks.
Data shows global cyber security spending has risen sharply in recent years. In 2018, annual spending stood at $115 billion. By 2029, it is forecast to hit $272 billion.
However, the number and impact of breaches have increased. In 2023, there were 3,205 publicly reported data compromises affecting over 353 million individuals, 78% more than in 2022. The average cost of a data breach reached $4.88 million in 2024, up 10% from the previous year.
The discrepancy between investments and actual security shows that the industry has conflated security tools with resilience, creating a dangerous illusion of protection while core vulnerabilities persist.
Drawing from her extensive consulting experience, Jane observes that completely preventing cyber attacks is not feasible, nor is it expected to become so. The focus should instead be on reducing their frequency and enabling faster, lower-impact recovery from those that do occur. The industry must move from prevention to recovery, and from perfection to pragmatism.
Dan observes a disconnect between understanding resilience in theory and effectively implementing it in practice: organisations grasp the concept but falter in execution due to a misunderstanding of what genuine resilience demands.
Traditional methods of adding more technology to solve technology problems have reached diminishing returns, the authors note.
The question we must answer is not whether to invest in technology or people. It's whether to build technology programs on strong human foundations or continue constructing elaborate technical edifices on unstable ground.
Challenges
Organisations struggle with the disparity between desired and actual resilience for four main reasons: unstable human foundations, overwhelming tools, superficial exercises, and misaligned incentives. These problems explain why even well-funded security efforts fail during real incidents, and they point to the need for fundamental changes. Let’s explore each in detail.
Challenge #1: Building on unstable foundations
Most organisations tackle cyber security by starting with technology and working backwards, neglecting the people and leadership foundations that determine whether any security program will succeed or fail. It's an inverted priority system that creates a systemic weakness in organisational defence.
Jane's framework maps directly to Maslow's hierarchy of needs, creating the "Cyber Maslow Hierarchy" (detailed more broadly in Jane’s blog and the Solutions section below):
Leadership (physiological needs)
Culture (safety needs)
GRC (belonging needs)
Defence (esteem needs)
Community (self-actualisation)
In contrast to cyber security, other industries manage their foundations better. In aviation, for example, pilots undergo mental health assessments and rigorous training, setting them up for success. Cyber security lacks that kind of preparation and is, in a way, "intoxicated", Jane warns.
Dan explains that "aviation's goal and imperative is safety. Corporate imperative is profit. And they're diametrically opposed." This tension explains why organisations struggle to prioritise the long-term investments in human capability that resilience requires.
The people layer encompasses far more than traditional security awareness training. As Jane emphasises:
"People are both the greatest threat, due to software design errors, and our strongest shield."
This duality requires sophisticated management, building what she calls "psychologically safe teams, where your security team can ask questions, they feel safe to do that."
Challenge #2: The cognitive overload crisis
Current cyber security tools create information overload rather than decision support, forcing practitioners to manage overwhelming data streams while making critical decisions under pressure. In this respect, cyber security should learn from other high-stakes industries where technology serves to liberate rather than burden human cognition.
"Cyber security tools are not freeing people of the burden. And thus they cannot be resilient," Dan believes. In aviation, for example, technology design follows different principles entirely. Pilots can operate a jet with minimal information: "Are my wings level? What attitude am I flying? What's my height? And what's my rate of descent? And where am I going with a compass? That's it." This intentional minimalism reduces cognitive load, enabling higher-order thinking.
Jane agrees:
"The tools could work if they were being used better. But people are overwhelmed. There's too much there. It's information overload."
The current state of security tools creates what psychologists term "cognitive overload", a state where information volume exceeds processing capacity, leading to degraded decision-making.
This challenge is exacerbated by "alert fatigue": security teams become desensitised to warnings because systems generate too many false positives. Unlike aviation's "essential six" instruments, which provide clear, actionable information, cyber security dashboards often present hundreds of metrics without clear prioritisation or context.
For example, data shows that the average SOC team receives over 11,000 alerts per day, with analysts spending nearly 70% of their time investigating and triaging them. Yet in one survey, 43% of respondents said that 40% of their alerts were false positives, even though 95% expressed confidence in their tools.
This high noise-to-signal ratio can lead to missed critical issues and burnout.
Challenge #3: The practice deficit
Another critical challenge is that organisations treat resilience exercises as checkbox activities rather than meaningful practice opportunities, leading to predictable failures when real incidents occur. Again, cyber security fails to adhere to the basic principles of skill development and maintenance that other professions take seriously.
Resilience programs are now widespread: over 70% of organisations had established an operational resilience program by 2025, and a further 10% were developing one. Yet the quality of practice remains problematic. Many large enterprises still report that outages typically take 60 minutes or more to resolve, suggesting that formal programs don't necessarily translate to effective response capabilities.
The BCI's Operational Resilience Report 2025 emphasises that adherence to best practices and regulatory compliance drives improvements in resilience, with regulated sectors such as financial services, healthcare, and government leading in disaster recovery maturity and investment. However, this compliance-driven approach often prioritises documentation over genuine capability building.
Dan observes the systematic degradation of resilience planning in practice. He points out that organisations rarely practise cyber resilience effectively, attributing this to the human learning decay curve. This decay curve effect is well-documented in educational psychology: without regular practice, even well-learned skills deteriorate rapidly.
He describes a common scenario:
A resilience exercise is scheduled, but due to competing constraints it often shrinks from a half-day to two hours, leading to participant attrition and ultimately diminishing learning returns.
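To make the decay effect concrete, here is a minimal sketch of a simplified, Ebbinghaus-style forgetting model. The 60-day half-life is a hypothetical assumption chosen purely for illustration, not a measured figure; the point is the shape of the curve, not the exact numbers.

```python
# Illustrative only: a simplified Ebbinghaus-style skill retention model.
# The 60-day half-life is a hypothetical assumption, not a measured value.
import math

def retention(days_since_practice: float, half_life_days: float = 60.0) -> float:
    """Estimated fraction of a practised skill retained after a given number of days."""
    return math.exp(-math.log(2) * days_since_practice / half_life_days)

# Compare how much capability remains just before the next exercise.
for label, interval_days in [("annual exercise", 365), ("quarterly exercise", 91), ("monthly exercise", 30)]:
    print(f"{label}: roughly {retention(interval_days):.0%} of the skill retained before the next session")
```

Under these assumptions, a team that rehearses once a year arrives at its next exercise having retained only a small fraction of what it learned, while monthly practice keeps most of that capability warm. The numbers are invented; the qualitative case for frequent practice is not.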
Jane's experience participating in tabletop exercises reveals the depth of this problem. Describing a ransomware scenario, she notes: "There's a room full of cyber people here. And they're not getting every answer right. This is their job, and the fact that I was getting more right than they were was both unexpected and alarming."
A critical gap Jane identified during these exercises was the absence of "minimum viable recovery" planning. "What are your minimum viable business operations to get back within the first few hours? I've not heard many talk about that." This concept, borrowed from startup methodology, represents the essential services an organisation must restore immediately to remain operational.
Challenge #4: Misaligned incentives and accountability
Cyber security metrics and KPIs often reward activity over outcomes, creating perverse incentives that undermine genuine resilience while providing the illusion of progress — a misalignment that occurs both at individual and organisational levels.
Dan identifies what he terms "cosmetic metrics": measurements that feel good but lack intent. They don't have consequences and just serve as checkboxes on a list. These metrics satisfy reporting requirements without improving actual security posture.
The problem extends to individual accountability. As Jane notes, when security isn't embedded in performance objectives, it becomes someone else's problem — typically, the security team's alone.
And when incentives and accountability are misaligned, the consequences for individuals can be severe.
The Joe Sullivan case illustrates the personal liability risks facing security leaders. Sullivan, former CSO at Uber, was convicted in 2022 for his handling of a 2016 data breach, with his conviction upheld on appeal in 2025. Despite consulting Uber's legal team and following some internal procedures, Sullivan was held criminally liable for concealing the breach from regulators through arrangements with the attackers. The case demonstrates how security leaders can face personal consequences even when believing they manage incidents appropriately within company structures.
The result is a stark accountability mismatch: individuals can bear personal consequences for organisational failures that lie largely beyond their control.
These four challenges interconnect and reinforce each other. Unstable foundations create cognitive overload, which reduces practice effectiveness, which encourages cosmetic metrics, which perpetuate unstable foundations. Breaking this cycle requires coordinated action across all four areas.
Solutions
Overcoming ineffective cyber resilience demands action across all four problem areas. Our proposed solutions aren't just theory but proven practices from aviation, psychology, and organisational development, tailored for cyber security.
Each solution tackles a specific challenge while strengthening the others, building true organisational immunity instead of just cosmetic compliance.

Solution #1: Implement the "Cyber Maslow Hierarchy"
Building resilient cyber security requires starting with leadership and culture before layering on GRC, defence (technology), and community collaboration efforts. This inverts current practice but aligns with successful models from other high-stakes industries.
Layer 1: Leadership foundation (physiological needs)
The leadership foundation layer of cyber security focuses on the basics: envisioning, mobilising, developing, and enabling. Central to it is the mindset Jane refers to as "high support, high challenge", where your team is supported and challenged in equal doses.
Essentially, it's about building psychologically safe teams, where your security team can ask questions and feel safe doing so. Read more about fostering a high-challenge, high-support cyber security culture.
Psychological safety research from Google demonstrates that team performance correlates more strongly with psychological safety than with individual talent. In cyber security contexts, this translates to teams that can openly discuss failures, question assumptions, and learn from incidents without fear of retribution.
Layer 2: Culture (safety needs)
Leadership lays the groundwork, and culture shapes the organisation's very structure. Jane emphasises that culture "embeds security into daily life" and transforms employees from "passive liabilities" to "active participants in defence."
Many organisations still treat cyber security as a compliance checklist when they should be embedding awareness and responsibility across the board.
“An effective culture prioritises continuous education, diversity of thought, and collaboration”, Jane notes.
Layer 3: GRC (belonging needs)
This layer addresses our need for connection and clarity. Jane explains that many businesses today "drown in data, bombarded with endless alerts, metrics, and dashboards. This overload leads to analysis paralysis, distracting teams from what matters most."
Gartner projects that by 2027, 75% of employees will be installing and using applications outside IT's visibility, including AI tools, up from 41% in 2022.
GRC must focus on ensuring, as Jane puts it, that "policies are there, but people actually know they're there." Simplifying GRC with clear metrics and practical guidance helps teams make the right moves quickly and avoid mistakes.
Layer 4: Defence (esteem needs)
Defence strategies “provide the confidence and trust that organisations need to function securely”, Jane explains.
Only after establishing solid people, culture, and GRC foundations should organisations focus heavily on technological defences.
At this layer, technology choices should prioritise cognitive liberation over comprehensive coverage. As Dan's aviation insight shows, technology should handle routine tasks so humans can focus on critical thinking.
Layer 5: Community (self-actualisation)
The highest level is community — people working together across different groups. Jane describes this as the layer that "unlocks the full potential of a cyber security strategy."
By teaming up with others in their industry, sharing threat intelligence, and partnering with outside groups, businesses not only get more secure but also help build a safer digital world for everyone.
Read more about the benefits of building a collaborative cyber security environment.
Solution #2: Design for cognitive liberation
Reducing cognitive load requires applying human factors engineering principles to cyber security tool design and operational procedures. The goal is to free mental capacity for higher-order thinking rather than overwhelming practitioners with data.
Practical implementation:
Identify essential metrics — Following aviation's "essential six" model, organisations should identify the minimum information needed for security decision-making. This typically includes threat indicators, system health status, and incident progression metrics.
Design exception-based alerts — Instead of comprehensive monitoring, systems should highlight only genuine anomalies requiring human attention, reducing information overload (see the sketch after this list).
Implement progressive disclosure — Present summary information by default with detailed data available on demand. This allows operators to maintain situational awareness without cognitive overwhelm.
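To ground these principles, here is a minimal sketch of exception-based alerting with progressive disclosure. The alert fields, severity scale, and threshold are hypothetical examples for illustration, not taken from any particular product or standard.

```python
# Minimal sketch: exception-based alerting with progressive disclosure.
# Field names, the 0-1 severity scale, and the threshold are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Alert:
    source: str
    severity: float                                # 0.0 (noise) to 1.0 (critical)
    summary: str                                   # the one line an operator sees first
    details: dict = field(default_factory=dict)    # full context, shown only on demand

def triage(alerts: list[Alert], threshold: float = 0.8) -> list[Alert]:
    """Surface only genuine anomalies; everything below the threshold stays in the background."""
    return sorted((a for a in alerts if a.severity >= threshold),
                  key=lambda a: a.severity, reverse=True)

def summary_view(alerts: list[Alert]) -> list[str]:
    """Progressive disclosure: one line per alert, with details fetched only when needed."""
    return [f"[{a.severity:.2f}] {a.source}: {a.summary}" for a in alerts]

# Usage: the operator sees a handful of prioritised lines, not thousands of raw events.
incoming = [
    Alert("edr", 0.95, "Credential dumping on finance server", {"host": "fin-07"}),
    Alert("waf", 0.30, "Single blocked SQL injection attempt"),
    Alert("idp", 0.85, "Impossible-travel login for a privileged account"),
]
for line in summary_view(triage(incoming)):
    print(line)
```

The design choice mirrors aviation's "essential six": decide in advance which few signals warrant a human's attention, and keep everything else one deliberate click away.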
Jane's concept of "minimum viable recovery" operations provides another cognitive liberation tool. By pre-defining essential services, teams can focus on critical decisions rather than on comprehensive analysis during incidents.
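As an illustration of how minimum viable recovery could be made explicit before an incident, the sketch below records a hypothetical recovery plan as data. The services, owners, and recovery time objectives shown are invented examples, not recommendations.

```python
# Hypothetical example: pre-defining minimum viable recovery (MVR) operations.
# Service names, owners, and recovery time objectives (RTOs) are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class EssentialService:
    name: str
    rto_minutes: int        # target time to restore once an incident is declared
    owner: str              # who makes the recovery call during an incident
    depends_on: tuple = ()  # what must be running first

MVR_PLAN = [
    EssentialService("identity and access (SSO/MFA)", rto_minutes=60, owner="IT Ops"),
    EssentialService("payments processing", rto_minutes=120, owner="Finance Systems",
                     depends_on=("identity and access (SSO/MFA)",)),
    EssentialService("customer support telephony", rto_minutes=240, owner="Service Desk"),
]

def recovery_order(plan: list[EssentialService]) -> list[str]:
    """Restore the tightest RTOs first, dependencies permitting."""
    return [service.name for service in sorted(plan, key=lambda s: s.rto_minutes)]

print(recovery_order(MVR_PLAN))
```

The value is not in the code itself but in forcing the conversation before an incident: which services count as minimum viable, in what order they come back, and who owns each decision.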
Solution #3: Institutionalise meaningful practice
Effective resilience requires regular, realistic practice that tests genuine decision-making capabilities rather than theoretical knowledge. Organisations should treat exercises as skill-building opportunities rather than compliance activities.
Framework for effective practice:
Define realistic scenarios. Base exercises on actual threat intelligence and organisational vulnerabilities rather than generic templates.
Ensure senior participation. Without leadership involvement, exercises lack realism and authority.
Focus on decision points. Rather than testing procedural knowledge, exercises should force participants to make trade-offs and prioritisation decisions under pressure.
Capture learning systematically. Dan observes that "the human element is the most important: we're going to fail and err like mad, and therein lie the gaps, the learning that drives remediation."
Increase practice frequency. Moving from annual disaster exercises to monthly skill-building sessions addresses the "decay curve".
Solution #4: Align incentives with outcomes
Creating organisational contexts that support rather than punish effective security decision-making requires redesigning accountability systems and career incentives.
Individual level changes
Microsoft's Secure Future Initiative provides a great counter-example to typical practice. The company has integrated security into every employee's performance evaluation, with one-third of senior leaders' bonuses tied directly to cyber security objectives. And this extends beyond leadership: all employees participate in biannual reviews that include discussions of their cyber security contributions, moving security from a compliance checkbox to a core element of personal accountability.
Leadership level support
As Jane notes, the emergence of CISO insurance addresses personal liability concerns:
"Just like you get directors insurance, you can now get insurance for CISOs, which is a good thing."
This risk mitigation enables leaders to make difficult but necessary decisions without bearing unlimited personal consequences.
Professional certification provides another form of protection by establishing standards for competent practice. "Leaders are being insured because they've been through the training. We can underwrite that as a risk because we believe we've given you the foundations," she explains.
Organisational culture changes
Creating contexts where "doing the right thing" doesn't become "a career-limiting move" requires cultural change at the board level. This includes setting realistic expectations about cyber security outcomes and supporting leaders who surface uncomfortable truths about organisational vulnerabilities.
These four solutions work together to create what Jane calls the foundation for genuine resilience. As she notes, organisations must "slow down and pause to think decisions through" — something only possible when the right structural supports are in place.
The solutions move organisations from reactive, technology-centric approaches to proactive, human-centred resilience that can adapt to threats not yet imagined.
Key takeaways
Effective cyber resilience requires organisations to recognise that their greatest security asset is human wisdom, properly supported and thoughtfully deployed. The following takeaways provide a roadmap for this transformation:
Foundations first. Start with leadership and culture before layering on GRC and technology. Organisations that begin with technical solutions while ignoring human factors will remain vulnerable regardless of their technological sophistication.
Cognitive liberation over information overload. The goal of cyber security technology should be to free human cognitive capacity for higher-order thinking and decision-making, not to overwhelm practitioners with data they cannot effectively process or act upon.
Practice makes perfect. Resilience is a skill that degrades without regular practice. Organisations must institutionalise realistic, frequent exercises that test genuine decision-making capabilities rather than theoretical knowledge of procedures.
Context determines success. There is no universal definition of cyber resilience. Each organisation must define what good looks like based on their specific risk appetite, business model, and operating context whilst building the cultural foundations that enable effective security practice.
Aligned incentives drive behaviour. Security cannot remain someone else's responsibility. Embedding security objectives into individual performance evaluations and organisational KPIs transforms it from a compliance checkbox to a core element of personal and professional accountability.
The human advantage in an automated world. As artificial intelligence accelerates business pace and cyber threats, people must focus on maintaining creativity, intuition, ethical reasoning, and the wisdom to pause and think critically rather than simply react at machine speed.
The organisations that embrace these principles and act upon them will not only survive the next generation of cyber threats but will thrive while competitors struggle with the consequences of their technology-first approaches.
Chaleit partners with leaders who understand that effective cyber security starts with people, not just technology. Our approach combines strategic thinking with practical implementation to create truly resilient organisations. Contact us — we’re ready to listen.
About the authors
Jane Frankland MBE
Jane Frankland MBE is a thought leader in cyber security and technology, celebrated for her impactful collaborations with top brands and governments. She made history by founding the first female-owned global hacking firm in the 1990s, paving the way for women's representation in a traditionally male-dominated field.
Her work has played a pivotal role in launching ground-breaking initiatives such as CREST, Cyber Essentials, and Women4Cyber, demonstrating her leadership and pioneering efforts in advancing security and promoting diversity. With prestigious accolades to her name and a successful career, including her role as Managing Director at Accenture, Jane is not only a seasoned professional but also the author of the bestselling book "IN Security"; the associated movement has empowered more than 442 women through scholarships worth $800,000.
Her insights have reached millions through renowned media outlets like The Sunday Times, BBC, The Guardian, and Forbes. As a sought-after speaker at global events, including the EU Commission and UN Women, Jane continues to inspire the tech community. Presently, as the CEO of KnewStart, Jane harnesses her expertise as an advisor and brand ambassador, championing brands' innovation and ensuring that her remarkable journey leaves a lasting impact on the field of cyber security.
Dan Haagman
Dedicated to strategic cyber security thinking and research, Dan Haagman is the CEO and founder of Chaleit, a seasoned leader in global cyber security consulting, and an Honorary Professor of Practice at Murdoch University in Perth, Australia.
Dan has nearly 30 years of experience in the field. He began his journey at The London Stock Exchange, where he pioneered the development of its first modern SOC and defensive team. As a co-founder of NotSoSecure and 7Safe, both acquired by reputable firms, he has left a lasting impact on the industry.
Today, Dan leads a team of brilliant minds in seven countries, all focused on delivering world-class cyber security consulting. Chaleit reflects Dan's vision for the industry's future. Built on the core principles of longevity and transparency, the company is poised for a public offering within the next few years.
Dan has a passion for learning. With a pen and paper at hand, he dedicates significant time to reading, researching, designing systems, and learning with clients and peers with the goal of being a leading thinker and collaborator in the cyber industry.
Disclaimer
The views expressed in this article represent the personal insights and opinions of Dan Haagman and Jane Frankland. Dan Haagman's views also reflect the official stance of Chaleit, while Jane Frankland's views are her own and do not necessarily represent the official position of her organisation. Both authors share their perspectives to foster learning and promote open dialogue.