The psychology of cybersecurity: adopt good password habits and cybersecurity practices and you'll become a better version of yourself. Part 1

Poor password and cybersecurity practices stem from our cognitive biases. Fighting the former, you’re fighting the latter! (and vice versa)
We have all heard a lot about cognitive biases, but most often we are shown examples from everyday life and human relationships, while biases permeate all areas. In our article, we decided to complement the knowledge about cognitive biases with examples from cybersecurity, especially the management of passwords, other access data and sensitive information.
So what biases can directly influence our cybersecurity behaviour and password management practices?
These include:
- Dunning-Kruger effect
- Conservatism bias and Anchoring
- Reactance
- Confirmation bias and its effects
- Belief bias
- Availability cascade and Bandwagon effect
- Status quo bias
- Logical fallacy
- Framing effect
- Survivorship bias
- Law of triviality
- Effort justification
- Pessimism/optimism effect
- Egocentric bias
Below, we will discuss the first seven of these biases in detail and how we can counteract them.
Dunning-Kruger effect
This is a cognitive bias that results in an overestimation of one's own competence by a person who has only fragmentary knowledge of the subject.
How this might affect your password management and cybersecurity practices:
If you know about cyber threats, you probably try to minimise them. For example, if you know about phishing, you might not click on links in suspicious emails. But you may not consider other types of threat, thinking that vigilance about email is enough. There is also a type of phishing called quishing, in which malware or links to phishing sites are distributed through QR codes.
Or you know the rules for creating strong passwords, but you create a password like "P@ssword123!", which, despite ticking all the formal boxes of a strong password, is predictable and easily cracked.
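To see why a policy-compliant password can still be weak, here is a minimal, illustrative sketch. The complexity rules and the five-word list are assumptions chosen for demonstration; real password crackers normalise character substitutions like these against wordlists containing millions of entries.

```python
import re

def meets_complexity_rules(pw: str) -> bool:
    """Naive policy check: length plus four character classes."""
    return all([
        len(pw) >= 12,
        re.search(r"[a-z]", pw),
        re.search(r"[A-Z]", pw),
        re.search(r"\d", pw),
        re.search(r"[^A-Za-z0-9]", pw),
    ])

# Tiny illustrative sample; real cracking wordlists are vastly larger.
COMMON_BASE_WORDS = {"password", "qwerty", "admin", "welcome", "letmein"}

def looks_predictable(pw: str) -> bool:
    """Strip trailing digits/symbols, undo common letter substitutions,
    and check the remaining base word against a known-word list."""
    base = re.sub(r"[\d\W_]+$", "", pw).lower()
    base = base.translate(str.maketrans("@$013", "asoie"))
    return base in COMMON_BASE_WORDS

pw = "P@ssword123!"
print(meets_complexity_rules(pw))  # True: the policy is satisfied
print(looks_predictable(pw))       # True: it is still a trivial crack
```

The point is that complexity rules measure form, not unpredictability: an attacker's first guesses are exactly these "dictionary word plus substitutions plus suffix" patterns.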
How to mitigate this bias:
Unfortunately, there is no simple practice that will allow us to quickly overcome this effect. We are proud of ourselves when we learn something new, but we must always remember that no matter how much we learn, there is always information that we do not know.
Therefore, we must gradually build up our knowledge (for example, by subscribing to a page on social networks that popularises cybersecurity) and refresh it (by taking free tests or training courses).
We must try to assess our skills objectively and not neglect the advice of experts.
Conservatism bias and Anchoring
Anchoring means that people tend to rely on the first piece of information they receive on a topic. And new information is processed through the lens of that first piece of information, rather than objectively. Conservatism bias means that we resist changing our opinions in response to new information, i.e. we prefer to believe our past beliefs.
Negative impact:
For example, you may have seen a news story about a data leak in a well-known password manager. As a result, you start to think that password managers are unreliable and your trust in them diminishes. Even though using them reduces cyber risks and password managers are, on average, much more secure than regular applications that do not focus on security.
How to mitigate this bias:
Anchoring is a very powerful cognitive bias that is impossible to eliminate completely. If you notice an anchor in yourself, an effective strategy for reducing its influence is to actively look for arguments against it.
In the context of our example, this could be a comparison of statistics on the industries in which data breaches occur (the IT industry will be far from first place), or research on users with certain password management habits (those who use password managers are still at lower risk than those who don't).
Reactance bias
The cognitive bias of reactance means that we tend to do the opposite of what we think others are trying to force us to do, in order to demonstrate our freedom to do what we want.
Negative impact:
We often see repeated recommendations about privacy and cybersecurity. But even though these recommendations are published with good intentions, we often ignore them out of reactance, even when we understand that they are sound, simply because we don't like being told what to do.
How to mitigate this bias:
A good way to reduce reactance is to find a role model with the values you want; having a good example in front of you makes it easier to follow good practice.
Break the process of getting used to new practices into small steps and move forward gradually.
Confirmation bias
This may seem similar to conservatism, but it differs in that confirmation bias causes us to focus on information that confirms our beliefs and interpret the information in favour of those beliefs or preconceptions.
Related effects include the backfire effect, in which we reinforce our beliefs when we encounter information that contradicts them, and the selective perception effect, in which our expectations shape what we perceive.
Negative impact:
Let's say for some reason you believe that using a password manager or antivirus is pointless. In light of this belief, any articles, posts, or videos that talk about the benefits of using these applications will seem like marketing gimmicks.
How to mitigate this bias:
Unfortunately, the only way to combat this bias is to consciously research the area of knowledge affected by it. Self-reflection and communication with someone knowledgeable in the field will also be beneficial.
Belief bias
Belief bias occurs when people accept arguments that fit their existing beliefs and reject those that contradict them, even if the logic is flawed. In short, they judge something to be true or false based on what they already believe, rather than on actual facts or reasoning.
Negative impact:
Many people tend to believe that hackers only target big companies, not ordinary people. Even when presented with facts about how many ordinary people fall victim to cyber attacks and lose their data and money, they are less likely to accept those facts than facts about cyber attacks on large companies or government agencies.
How to mitigate this bias:
One effective way of overcoming belief bias is to give yourself time to consider arguments that contradict your opinion. Don't dismiss them out of hand, but think about them carefully and compare them with arguments that support your opinion, or find out more about them.
Also, when arguments are presented in a negative light, we tend to analyse them more carefully than when they are presented neutrally.
Availability cascade and Bandwagon effect
The availability cascade occurs when a belief gains plausibility simply through repetition, for example when we meet other people who share our opinion, which reinforces our own belief in something (not necessarily something true). The bandwagon effect occurs when we gravitate towards a group opinion.
Negative impact:
For example, the previously popular belief that passwords should be changed frequently led to users simply changing the last character of a password (often a weak one) and thinking that was enough. This policy was enforced in organisations. There is currently no clear consensus on the frequency of password changes, but it is better to create a long, strong and unique password and monitor it for leakage to the darknet via a password manager than to change it frequently.
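A rough back-of-the-envelope comparison shows why tweaking the last character adds almost nothing. The sketch below assumes a 95-character printable ASCII alphabet and an attacker who already knows (say, from a leak) the old password; both figures are idealised estimates, not a cracking benchmark.

```python
import math

PRINTABLE = 95  # printable ASCII characters a typical policy allows

# Attacker who knows the leaked old password and suspects a
# one-character tweak only needs to try every possible final character.
guesses_last_char_change = PRINTABLE
print(math.log2(guesses_last_char_change))  # ~6.6 bits of extra work

# A fresh random 16-character password: every position is unknown.
guesses_fresh_random = PRINTABLE ** 16
print(math.log2(guesses_fresh_random))  # ~105 bits
```

In other words, the "change the last character" ritual buys roughly seven bits of security against an informed attacker, while one long random password that is simply kept and monitored provides over a hundred.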
How to mitigate this bias:
Group opinion (especially non-expert opinion) is not necessarily always correct. When in doubt, it is better to study the opinions of experts. Unfortunately, if such practices are part of the policy of the company you work for, you have less freedom of action: either accept and follow them, or work with the IT department to advocate more effective practices.
Status quo bias
Status quo bias occurs when you resist change in order to maintain your current position.
Negative impact:
Say you want to start using a password manager, but because you have to add, organise and label all your passwords and other access data first, your brain sabotages the transition with status quo bias: why start using a password manager if your accounts haven't been hacked or your data hasn't been leaked?
The sheer number of password managers on the market can also put you off using one, as it takes time and effort to compare alternatives.
How to mitigate this bias:
In this situation, it is good practice to consciously find arguments in favour of the new useful initiative you want to start, weighing up all the potential advantages and disadvantages of the new practice against those of the current practice.
Try to reduce the number of choices. And once you have chosen, make a plan for implementing the new practice and move step by step with the end goal in mind.
We hope this article has helped you understand yourself a little better and motivated you to improve your cybersecurity posture. The second half of the biases will be discussed in the next article.