It’s no secret that implementing security best practices and policies is critical to protecting corporate networks and to protecting (and guiding) users. Setting up the proper security solutions to meet those goals can take days, weeks or even months. And even after all that, some organizations still experience breaches caused by simple user mistakes. But not all the blame falls on the user base. Security professionals often overlook an important element of security: mitigating normalcy bias.
Normalcy bias: A cybersecurity threat
Normalcy bias is a cognitive bias that leads people to disbelieve or minimize threat warnings. As a result, individuals underestimate the likelihood of a disaster that might affect them. This is highly relevant to cybersecurity and users. How do cybersecurity professionals balance a user base that includes both those who prepare for the worst-case scenario (also known as preppers) and those who don’t (non-preppers)? Preppers often overestimate the likelihood of an apocalyptic event and fall into worst-case thinking, while non-preppers easily dismiss the need to prepare at all. When applied to cyber threats and the need to protect an organization from a breach (or other threats such as phishing), normalcy bias can heavily undermine employees’ execution of best practices.
The unfortunate fact today is that users often understand that a security event (such as a breach) is likely to happen, but they fail to see how their own actions might cause one. They don’t intend to help cause a breach, but normalcy bias lets them believe that the actions they take won’t contribute to a negative security event. Normalcy bias also leaves users believing that if an event does occur, it won’t cause much damage, effectively diminishing the perceived severity of the outcome. In reality, users base their actions on how often they personally see and experience incidents, rather than on how often incidents actually happen. This “user error” is a major contributing factor in security breaches.
The pervasiveness of normalcy bias
What leads to this behavior? Excessive warnings often train users to ignore them (and jeopardize their own safety). For example, when’s the last time you read the medication warning on a bottle of acetaminophen? Or noticed the temperature warning by the coffee dispenser in a gas station?
Shifting this to organizational security, how often do users accept an updated Facebook privacy policy without reading it all the way through, or actually read the “last login” information after connecting to a Linux shell? The sheer number of warnings users encounter daily leads many to automatically discount the next one. The threat becomes normalized. These excessive warnings are often written to protect their creator from liability rather than to help the user avoid pitfalls.
So how do organizations overcome normalcy bias and improve overall security efficacy within their user bases? There are two key elements: education and security solutions. Here are three tips to consider when looking to educate an organization:
- When creating security policies, cybersecurity leaders must avoid hindering productivity. For example, policies that block users from changing the desktop background tend to slow people down and create a disconnect between the user and the company, which increases normalcy bias. Security professionals also can’t dismiss the end goal of company growth in the name of cybersecurity. If security teams prevent growth, they aren’t helping anyone.
- Conduct quarterly training that focuses on users’ ability to prevent the latest threats facing the organization and on the impact of user error. Embed a security-first mindset into the corporate culture, starting from the top. Offer users educational materials that help them understand the problem and the role they play. Share real-world examples and encourage users to do the same. Ensure leadership sets a good example and advocates for best practices. Remind users of their own importance in keeping the company secure. No one likes to admit they’ve made a security mistake, which is why organizations need to encourage users to report the errors they see or make. And once an error has happened, follow up with the user to ensure they understand the problem and know how to avoid it going forward.
- In many information technology (IT) and development environments, employees face tight deadlines to complete projects. From the user’s perspective, the project must be finished within the timeline provided. They also need to balance the project with security protocols, and if not given enough time, security is often the first thing bypassed. Regardless of job function, organizations need to build in enough time for security policies and technologies to actually be used. This often means informing managers and team leaders of the impact of cybersecurity policies and having them account for security training on their teams.
Normalcy bias is often chalked up to a need for better training, and while training is critical, the problem runs much deeper. Warnings should be designed to help the user, not just to protect the provider or vendor from liability. Eliminating normalcy bias requires a cultural shift within the organization that allows users to be the solution instead of the problem. That means making them an active part of the security strategy and arming them with the best practices, education and training they need to proactively help protect their organization.
Organizations may feel the urge to post warnings on every possible point of danger, but doing so only obscures the bigger problems. Users and security have a complex relationship (with a large human element involved), and mitigating normalcy bias is just one piece of an organization's overall security strategy. By talking about it, the security community can work together to better address the challenges it presents.