Human Factors in a Crisis - Part 3
In the last few blogs I’ve been covering our observations on the ‘human factors’ that affect how a team or individual performs during a crisis. I’ve looked at the effects of stress and at the behavioural approaches we can draw on under pressure. In this blog I’d like to cover the final aspects: confirmation bias and ‘group think’.
The first thing to appreciate is that we humans are not very good at holding a balanced, logical view of the world around us. We tend to jump to conclusions without looking at all the facts, and we give more weight to evidence that supports a decision or theory we have already made than to any facts to the contrary. This is called confirmation bias.
Let’s take a person who is nervous about flying and thinks that planes are a dangerous way of travelling. Whenever that person sees a news report about a plane crash, it will be magnified in their mind. It will completely outweigh the statistical evidence that air travel is one of the safest forms of travel (research puts the chances of a commercial air accident at 1 in 7 million and of a train accident at 1 in a million, whereas car travel in the UK carries a risk of accident of about 1 in 20,000).
Once we have made our mind up, we give much more weight to information, theories and stories that confirm our belief than to those that go against it. We see this every day, whether in our beliefs about the efficacy of vaccines, in social media stories, in views on whether gods exist, or in whether the referee in the Wales against England rugby match was right or wrong! In a crisis management context it can mean that we don’t properly process incoming information about an emerging crisis. We might hear several pieces of information but give far more significance to the one piece that confirms what we already believe, potentially ignoring valuable situational intelligence.
Confirmation bias is bad enough on its own, but consider what happens when we combine it with ‘group think’. In his book ‘Rebel Ideas’, Matthew Syed discusses how the CIA missed the warning signs that Osama Bin Laden was becoming a dangerous radical leader before 9/11. One of the reasons for this, according to research, was that the CIA at the time was made up almost entirely of white, university-educated males. Their shared background meant they had no way of understanding how, as they saw it, a ‘preacher in a cave’ could represent a threat to the security of the USA. This isn’t just about race or gender diversity: to understand a threat properly, a team needs diversity of thought, experience and approach.
In your next crisis exercise, take a look at the way the crisis team makes decisions. Is it diverse in thought? Too often team members simply go along with whatever the most senior person decides, and whilst that might be the right decision most of the time, is there room for one person to say “Stop! We are making the wrong decision!”? It takes strength of character and a well-timed intervention to change the views of a group that has already made up its mind.