{"id":24502,"date":"2025-08-11T08:30:11","date_gmt":"2025-08-11T12:30:11","guid":{"rendered":"https:\/\/me-en.kaspersky.com\/blog\/?p=24502"},"modified":"2025-08-11T16:40:41","modified_gmt":"2025-08-11T12:40:41","slug":"no-blame-cybersecurity-culture","status":"publish","type":"post","link":"https:\/\/me-en.kaspersky.com\/blog\/no-blame-cybersecurity-culture\/24502\/","title":{"rendered":"No blame: how psychological safety helps improve cybersecurity"},"content":{"rendered":"<p>Even companies with a mature cybersecurity posture and significant investments into data protection aren\u2019t immune to cyber-incidents. Attackers can exploit\u00a0<a href=\"https:\/\/securelist.com\/cve-2018-8453-used-in-targeted-attacks\/88151\/\" target=\"_blank\" rel=\"noopener\">zero-day vulnerabilities<\/a>\u00a0or <a href=\"https:\/\/securelist.com\/gopuram-backdoor-deployed-through-3cx-supply-chain-attack\/109344\/\" target=\"_blank\" rel=\"noopener\">compromise a\u00a0supply chain<\/a>. Employees can <a href=\"https:\/\/thehackernews.com\/2025\/06\/fbi-warns-of-scattered-spiders.html\" target=\"_blank\" rel=\"nofollow noopener\">fall victim to sophisticated scams<\/a> designed to breach the company\u2019s defenses. The cybersecurity team itself can make a mistake <a href=\"https:\/\/securelist.com\/compromise-assessment-cases\/114332\/\" target=\"_blank\" rel=\"noopener\">while configuring security tools<\/a>, or during an <a href=\"https:\/\/securelist.com\/incident-response-interesting-cases-2023\/110492\/\" target=\"_blank\" rel=\"noopener\">incident response procedure<\/a>. However, each of these incidents represents an opportunity to improve processes and systems, making your defenses even more effective. 
This isn\u2019t just a rallying call; it\u2019s a practical approach that\u2019s proven successful in other fields such as aviation safety.<\/p>\n<p>In the aviation industry, <a href=\"https:\/\/www.ecfr.gov\/current\/title-14\/chapter-I\/subchapter-A\/part-5\" target=\"_blank\" rel=\"nofollow noopener\">almost everyone<\/a> \u2014 from aircraft design engineers to flight attendants \u2014 is required to share information to prevent incidents. This isn\u2019t limited to crashes or system failures; the industry also reports potential problems. These reports are constantly analyzed, and safety measures are adjusted based on the findings. According to <a href=\"https:\/\/commercial.allianz.com\/news-and-insights\/expert-risk-articles\/how-aviation-safety-has-improved.html\" target=\"_blank\" rel=\"nofollow noopener\">Allianz Commercial\u2019s statistics<\/a>, this continuous implementation of new measures and technologies has led to a significant reduction in fatal incidents \u2014 from 40 per million flights in 1959 to 0.1 in 2015.<\/p>\n<p>The industry also recognized long ago that this model simply won\u2019t work if people are afraid to report procedure violations, quality issues, and other causes of incidents. That\u2019s why aviation standards include requirements for\u00a0<a href=\"https:\/\/skybrary.aero\/articles\/national-reporting-systems\" target=\"_blank\" rel=\"nofollow noopener\">non-punitive reporting<\/a>\u00a0and a\u00a0<a href=\"https:\/\/eur-lex.europa.eu\/legal-content\/EN\/TXT\/?uri=celex:32014R0376\" target=\"_blank\" rel=\"nofollow noopener\">just culture<\/a>, meaning that reporting problems and violations shouldn\u2019t lead to punishment. 
DevOps\u00a0engineers have a similar principle they call a <a href=\"https:\/\/www.pluralsight.com\/resources\/blog\/tech-operations\/how-conduct-blameless-postmortems-incident\" target=\"_blank\" rel=\"nofollow noopener\">blameless culture<\/a>, which they use when analyzing major incidents. This approach is also essential in\u00a0cybersecurity.<\/p>\n<h2>Does every mistake have a name?<\/h2>\n<p>The opposite of a blameless culture is the idea that \u201cevery mistake has a name\u201d, meaning a specific person is to blame. Under this approach, every mistake can lead to disciplinary action, including termination. This principle is considered harmful and doesn\u2019t lead to better security.<\/p>\n<ul>\n<li>Employees fear accountability and tend to <a href=\"https:\/\/web.archive.org\/web\/20220329080055\/https:\/www.tessian.com\/resources\/psychology-of-human-error-2022\/\" target=\"_blank\" rel=\"nofollow noopener\">distort facts during incident investigations<\/a> \u2014 or even destroy evidence.<\/li>\n<li>Distorted or partially destroyed evidence complicates the response and worsens the overall outcome because security teams can\u2019t quickly and properly assess the scope of a given incident.<\/li>\n<li>Zeroing in on one person to blame during an incident review prevents the team from focusing on how to change the system to prevent similar incidents from happening again.<\/li>\n<li>Employees are afraid to report violations of IT and security policies, causing the company to miss opportunities to fix security flaws <em>before<\/em> they lead to a critical incident.<\/li>\n<li>Employees have no motivation to discuss cybersecurity issues, coach one another, or correct their coworkers\u2019 mistakes.<\/li>\n<\/ul>\n<p>To truly enable every employee to contribute to your company\u2019s security, you need a different approach.<\/p>\n<h2>The core principles of a just culture<\/h2>\n<p>Call it \u201cnon-punitive reporting\u201d or a \u201cblameless culture\u201d 
\u2014 the core principles are the same:<\/p>\n<ul>\n<li>Everyone makes mistakes. We learn from our mistakes; we don\u2019t punish them. However, it\u2019s crucial to distinguish between an honest mistake and a malicious violation.<\/li>\n<li>When analyzing security incidents, the overall context, the employee\u2019s intent, and any systemic issues that may have contributed to the situation must all be considered. For example, if a high turnover of seasonal retail employees prevents them from being granted individual accounts, they might resort to sharing a single login for a point-of-sale terminal. Is the store administrator at fault? Probably not.<\/li>\n<li>Beyond just reviewing technical data and logs, you must have in-depth conversations with everyone involved in an incident. For this, you should create a productive and safe environment where people feel comfortable sharing their perspectives.<\/li>\n<li>The goal of an incident review should be to improve behavior, technology, and processes in the future. For serious incidents, the review should be split into two phases: <em>immediate response<\/em> to mitigate the damage, and <em>postmortem analysis<\/em> to improve your systems and procedures.<\/li>\n<li>Most importantly, be open and transparent. Employees need to know how reports of issues and incidents are handled, and how decisions are made. They should know exactly who to turn to if they see or even suspect a security problem. They need to know that both their supervisors and security specialists will support them.<\/li>\n<li>Confidentiality and protection. 
Reporting a security issue should not create problems for the person who reported it or for the person who may have caused it \u2014 as long as both acted in good faith.<\/li>\n<\/ul>\n<h2>How to implement these principles in your security culture<\/h2>\n<p><strong>Secure leadership buy-in.<\/strong> A security culture doesn\u2019t require massive direct investment, but it does need consistent support from the HR, information security, and internal communications teams. Employees also need to see that top management actively endorses this approach.<\/p>\n<p><strong>Document your approach.<\/strong> The blameless culture philosophy should be captured in your company\u2019s official documents \u2014 from detailed security policies to a simple, short guide that every employee will actually read and understand. This document should clearly state the company\u2019s position on the difference between a mistake and a malicious violation. It should formally state that employees won\u2019t be held personally responsible for honest errors, and that the collective priority is to improve the company\u2019s security, and prevent future recurrences.<\/p>\n<p><strong>Create channels for reporting issues.<\/strong> Offer several ways for employees to report problems: a dedicated section on the intranet, a specific email address, or the option to simply tell their immediate supervisor. Ideally, you should also have an anonymous hotline for reporting concerns without fear.<\/p>\n<p><strong>Train employees. <\/strong>Training helps employees recognize insecure processes and behaviors. Use real-world examples of problems they should report, and walk them through different incident scenarios. 
You can use our online <a href=\"https:\/\/k-asap.com\/en\/?icid=me-en_kdailyplacehold_acq_ona_smm__onl_b2b_kasperskydaily_wpplaceholder____kasap___\" target=\"_blank\" rel=\"noopener\">Kaspersky Automated Security Awareness Platform<\/a> to organize these cybersecurity-awareness training sessions. Motivate employees to not only report incidents, but also to suggest improvements and think about how to prevent security problems in their day-to-day work.<\/p>\n<p><strong>Educate your leadership.<\/strong> Every manager needs to understand how to respond to reports from their team. They need to know how and where to forward a report, and how to avoid creating blame-focused islands in a sea of just culture. Teach leaders to respond in a way that makes their coworkers feel supported and protected. Their reactions to incidents and error reports need to be constructive. Leaders should also encourage discussions of security issues in team meetings to normalize the topic.<\/p>\n<p><strong>Develop a fair review procedure<\/strong> for incidents and security-issue reports. You\u2019ll need to assemble a diverse group of employees from various teams to form a \u201cno-blame review board\u201d. It will be responsible for promptly processing reports, making decisions, and creating action plans for each case.<\/p>\n<p><strong>Reward proactivity.<\/strong> Publicly praise and reward employees who report spearphishing attempts or newly discovered flaws in policies or configurations, or who simply complete awareness training better and faster than others on their team. 
Mention these proactive employees in regular IT and security communications such as newsletters.<\/p>\n<p><strong>Integrate findings into your security management processes.<\/strong> The conclusions and suggestions from the review board should be prioritized and incorporated into the company\u2019s <a href=\"https:\/\/www.kaspersky.com\/blog\/cyber-resilience-101\/53464\/\" target=\"_blank\" rel=\"noopener nofollow\">cyber-resilience plan<\/a>. Some findings may simply influence risk assessments, while others could directly lead to changes in company policies, or implementation of new technical security controls or reconfiguration of existing ones.<\/p>\n<p><strong>Use mistakes as learning opportunities.<\/strong> Your <a href=\"https:\/\/www.kaspersky.com\/blog\/vr-interactive-simulation\/40188\/\" target=\"_blank\" rel=\"noopener nofollow\">security awareness program<\/a> will be more effective if it uses real-life examples from your own organization. You don\u2019t need to name specific individuals, but you can mention teams and systems, and describe attack scenarios.<\/p>\n<p><strong>Measure performance.<\/strong> To ensure this process is working and delivering results, you need to use information security metrics as well as HR and communications KPIs. Track the <a href=\"https:\/\/encyclopedia.kaspersky.com\/glossary\/mean-time-to-respond-mttr\/\" target=\"_blank\" rel=\"noopener\">MTTR<\/a> for identified issues, the percentage of issues discovered through employee reports, employee satisfaction levels, the number and nature of security issues identified, and the number of employees engaged in suggesting improvements.<\/p>\n<h2>Important exceptions<\/h2>\n<p>A security culture or blameless culture doesn\u2019t mean that no one is ever held accountable. Aviation safety documents on non-punitive reporting, for example, include crucial exceptions. Protection doesn\u2019t apply when someone knowingly and maliciously deviates from the regulations. 
This exception prevents an insider who has leaked data to competitors from enjoying complete impunity after confessing.<\/p>\n<p>The second exception is when national or industry regulations require individual employees to be held personally accountable for incidents and violations. Even with this kind of regulation, it\u2019s vital to maintain balance. The focus should remain on improving processes and preventing future incidents \u2014 \u00a0not on finding who\u2019s to blame. You can still build a culture of trust if investigations are objective and accountability is only applied where it\u2019s truly necessary and justified.<\/p>\n<input type=\"hidden\" class=\"category_for_banner\" value=\"kasap\">\n","protected":false},"excerpt":{"rendered":"<p>Companies need to build a culture of security, but this is impossible when employees are afraid to discuss incidents or suggest improvements.<\/p>\n","protected":false},"author":2722,"featured_media":24503,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1318,1916,1917],"tags":[1457,1948,2667,2212,1813,346,1366,2494,131],"class_list":{"0":"post-24502","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-business","8":"category-enterprise","9":"category-smb","10":"tag-business","11":"tag-ciso","12":"tag-cyber-resilience","13":"tag-cybersecurity-awareness","14":"tag-economics","15":"tag-education","16":"tag-security-awareness","17":"tag-strategy","18":"tag-tips"},"hreflang":[{"hreflang":"en-ae","url":"https:\/\/me-en.kaspersky.com\/blog\/no-blame-cybersecurity-culture\/24502\/"},{"hreflang":"en-in","url":"https:\/\/www.kaspersky.co.in\/blog\/no-blame-cybersecurity-culture\/29388\/"},{"hreflang":"ar","url":"https:\/\/me.kaspersky.com\/blog\/no-blame-cybersecurity-culture\/12717\/"},{"hreflang":"en-gb","url":"https:\/\/www.kaspersky.co.uk\/blog\/no-blame-cybersec
urity-culture\/29336\/"},{"hreflang":"es-mx","url":"https:\/\/latam.kaspersky.com\/blog\/no-blame-cybersecurity-culture\/28421\/"},{"hreflang":"es","url":"https:\/\/www.kaspersky.es\/blog\/no-blame-cybersecurity-culture\/31299\/"},{"hreflang":"ru","url":"https:\/\/www.kaspersky.ru\/blog\/no-blame-cybersecurity-culture\/40262\/"},{"hreflang":"tr","url":"https:\/\/www.kaspersky.com.tr\/blog\/no-blame-cybersecurity-culture\/13679\/"},{"hreflang":"x-default","url":"https:\/\/www.kaspersky.com\/blog\/no-blame-cybersecurity-culture\/54075\/"},{"hreflang":"fr","url":"https:\/\/www.kaspersky.fr\/blog\/no-blame-cybersecurity-culture\/23080\/"},{"hreflang":"de","url":"https:\/\/www.kaspersky.de\/blog\/no-blame-cybersecurity-culture\/32564\/"},{"hreflang":"ru-kz","url":"https:\/\/blog.kaspersky.kz\/no-blame-cybersecurity-culture\/29553\/"},{"hreflang":"en-au","url":"https:\/\/www.kaspersky.com.au\/blog\/no-blame-cybersecurity-culture\/35254\/"},{"hreflang":"en-za","url":"https:\/\/www.kaspersky.co.za\/blog\/no-blame-cybersecurity-culture\/34902\/"}],"acf":[],"banners":"","maintag":{"url":"https:\/\/me-en.kaspersky.com\/blog\/tag\/cybersecurity-awareness\/","name":"cybersecurity 
awareness"},"_links":{"self":[{"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/24502","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/users\/2722"}],"replies":[{"embeddable":true,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/comments?post=24502"}],"version-history":[{"count":1,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/24502\/revisions"}],"predecessor-version":[{"id":24504,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/posts\/24502\/revisions\/24504"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/media\/24503"}],"wp:attachment":[{"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/media?parent=24502"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/categories?post=24502"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/me-en.kaspersky.com\/blog\/wp-json\/wp\/v2\/tags?post=24502"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}