Statistics is a useful tool for understanding the patterns in the world around us. But our intuition often lets us down when it comes to interpreting those patterns. In this series we look at some of the common mistakes we make and how to avoid them when thinking about statistics, probability and risk.
The world can feel like a scary place.
Today, Australia’s National Terrorism Threat Level is “Probable”. Shark attacks are on the rise: almost twice as many people were attacked in 2000-2009 as in 1990-1999. Travellers face a high risk of contracting the Zika virus in places where the disease is present, such as Brazil and Mexico.
However, despite their tragic outcomes, these events are all extremely rare.
Since 1996, only eight people have been killed by terrorist attacks in Australia. There were 186 shark attacks in the 20 years from 1990 to 2009. Best estimates indicate that only 1.8 people per million tourists would have contracted Zika at the Rio Olympics.
To be fair, it is extremely difficult to judge the incidence of rare events. So how should we think about these risks?
Decision scientists study rare events by bringing people into the lab and asking them to make choices. For example, in their Nobel Prize-winning work, researchers Daniel Kahneman and Amos Tversky had people make choices between two options: one safe, one risky.
A typical choice might involve a safe option where you’d walk away with $5, guaranteed. Alternatively, you could choose to take a gamble and receive $15 with 90% probability. However, if you lost the gamble, you would have to pay $35.
If you’d just take the $5, then you’re not alone. Despite the gamble being clearly better than the sure $5 in terms of what you would win on average (0.9 × $15 − 0.1 × $35 = $10), the loss of $35 looms so large in the mind that many of us choose the safe option anyway.
In this scenario, the loss of $35 is a relatively rare event: it will only occur 10% of the time. Yet we treat the rare event as if it were much more likely to occur than in reality. Kahneman and Tversky termed this the “overweighting” of small probabilities.
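The expected-value arithmetic behind that comparison can be made explicit. Here is a minimal sketch in Python, using the payoffs and probabilities from the example above:

```python
# Expected value of the gamble vs the sure thing, from the example above.
p_win, win_amount, loss_amount = 0.90, 15, -35

safe = 5
gamble_ev = p_win * win_amount + (1 - p_win) * loss_amount  # 0.9*15 - 0.1*35

print(f"Safe option:          ${safe}")
print(f"Gamble (on average):  ${gamble_ev:.2f}")
```

On average the gamble pays $10, double the safe option, yet the rare $35 loss is weighted heavily enough in most minds to make the sure $5 feel preferable.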
Of course, real-world rare events, such as disease control, shark attacks and terrorism threats, are much more complex than this fictitious gamble. But from a purely statistical point of view, it may be that we are disproportionately worried about such events, given their rarity.
For example, a poll conducted by Chapman University in the United States suggests that 38.5% of people were “afraid” or “very afraid” of being a victim of terrorism. This is despite the fact that only 71 people in the US were killed by terrorism between 2005 and 2015. To put that into perspective, PolitiFact reports that 301,797 people have died from gun violence in the US over a similar period.
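Using the figures quoted above, the mismatch between fear and frequency is easy to quantify. This is only a back-of-the-envelope comparison (the two counts cover similar, roughly decade-long periods, not identical ones):

```python
# Rough comparison of the two US death tolls quoted above (similar periods).
terrorism_deaths = 71
gun_deaths = 301_797

ratio = gun_deaths / terrorism_deaths
print(f"Gun deaths outnumber terrorism deaths roughly {ratio:,.0f} to 1")
```

Even allowing for imprecision in the periods, the deaths people fear most are several thousand times rarer than deaths that attract far less dread.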
So is it fear that drives us to believe that rare events are likely to happen?
According to David Landy, a researcher at Indiana University who spoke on this very issue at the 2016 meeting of the Society for Judgment and Decision Making, the answer is no.
One question in Landy’s survey asked people to estimate the proportion of the US population that was Muslim. The true proportion is slightly less than 1%. People’s estimates tended to be higher, at around 10%.
This overestimation of the US Muslim population is a common finding, and it is often interpreted in terms of fear: the idea is that people pay more attention to things that scare them, which leads them to believe those things are more common than they really are.
The “fear” explanation is intuitively plausible, but it may not be true. In a critical comparison, Landy also asked about the probability of other events that also had a small probability, but would be unlikely to make people scared (such as what proportion of the US population had served in the military).
It turned out that people also overestimated the probability of these rare but uninteresting events. In fact, the degree to which they overestimated these other events was practically identical to how much they overestimated the population of Muslims.
Landy’s result suggests that we simply have trouble in thinking about small probabilities, regardless of the topic. It may not be that some people overestimate the proportion of Muslims out of fear. Rather, it seems that we will overestimate the incidence of any rare event.
So how should we think about and respond to rare events?
One remedy might be to use what some researchers refer to as “metacognitive awareness”. This is being aware of how cognitive processes, like memory, work when we try to think about and estimate the frequency with which things happen.
One metacognitive cue you might use is how easily a particular event, such as hearing about shark attacks, comes to mind. But simply reading off the ease of recall is likely to mislead, because memory is biased towards positive instances: going swimming and not being attacked by a shark is unsurprising, so it is not particularly memorable.
This failure of memory to deliver representative samples of evidence suggests a need to think carefully, not only about the bias in memory retrieval, but also in the samples available to us in the world.
Perversely, it suggests that when you want to work out how rare an event is (and an appropriate response), you should try to think about all the times it didn’t happen (negative instances) rather than those when it did!
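This retrieval bias can be illustrated with a toy simulation. All the numbers below are hypothetical, chosen only to show the mechanism: if memory keeps every “attack” it encounters but only a tiny fraction of uneventful swims, a frequency estimate read off memory will be wildly inflated.

```python
import random

random.seed(42)

TRUE_RATE = 1e-5        # hypothetical chance a swim involves a shark attack
N_SWIMS = 1_000_000     # swims experienced or heard about
RECALL_UNEVENTFUL = 1e-4  # fraction of uneventful swims that stay memorable

attacks = sum(random.random() < TRUE_RATE for _ in range(N_SWIMS))
uneventful = N_SWIMS - attacks

# Memory keeps all the attacks but almost none of the uneventful swims.
remembered_attacks = attacks
remembered_uneventful = round(uneventful * RECALL_UNEVENTFUL)

# Estimating frequency from what memory serves up, not from all the evidence.
estimate = remembered_attacks / (remembered_attacks + remembered_uneventful)
print(f"True rate in the sample: {attacks / N_SWIMS:.6f}")
print(f"Memory-based estimate:   {estimate:.3f}")
```

Because the uneventful swims (the negative instances) barely register in memory, the memory-based estimate comes out orders of magnitude higher than the true rate, which is exactly why deliberately counting the times nothing happened is the better strategy.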
So next time you are at the beach and contemplating taking a dip, just think of the millions of swimmers who have never been attacked by a shark, and not the relatively few who have.