HM has written in previous posts about how annoyed he is by people's fears of terrorist attacks. HM lived through the Cuban Missile Crisis, when the threat of nuclear annihilation was very real. The threat of terrorism pales in comparison. The probability of an individual suffering a terrorist attack is extremely small. And even the number of lives lost during 9/11 was minuscule compared to the loss if a nuclear warhead had exploded over Manhattan. Still, as a result of 9/11, many people stopped flying and got into their cars. In the years that followed, the annual death toll on the roads averaged about 1,100 higher than in the five preceding years.
The New Scientist piece that inspired this post has the same title as this post and was written by Sally Adee. She begins the article by noting that evolution has given us an inbuilt fear factory, but that by engaging a different way of thinking we can stop panicking and assess the real risks.
Adee draws upon Kahneman’s Two Process concept of cognition. System 1 is fast and the product of evolved biases shaped over many thousands of years. This worked well: if you saw a shadow in the grass, it turned out to be a lion, and you lived to tell the tale, you’d make sure to run the next time you saw a shadow in the grass. This inbuilt fear factory is highly susceptible to immediate experience, vivid images, and personal stories. Security companies, political campaigns, tabloid newspapers, and ad agencies prey on it. Adee notes that System 1 is good at catastrophic risk, but less good at risks that build up slowly over time—thus our lassitude in the face of climate change or our expanding waistlines.
She advises that when your risk judgment is motivated by fear, stop and think: what other, less obvious risks might I be missing? This amounts to engaging the more rigorous, analytical System 2 outlined by Kahneman. People who deal with probability and risk professionally, and who have excelled at it, use System 2 quite heavily. Successful bookies, professional card players, and weather forecasters are heavy users of their System 2 processes. Risk consultant Dan Gardner notes that even though meteorologists get a bad rap, they tend to be highly calibrated, unlike most of us. One can never be right all the time, but one should be attempting to calibrate one's risk assessments against the objective world.
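Calibration can be made concrete with a toy computation. The sketch below is not from the article; it uses made-up illustrative data to show what "well calibrated" means: among the events a forecaster calls 70% likely, roughly 70% should actually happen. It also computes a Brier score, a standard summary measure of forecast accuracy (lower is better).

```python
# Illustrative sketch of forecast calibration (made-up data, for explanation only).
from collections import defaultdict

# (stated probability, did the event actually occur?)
records = [
    (0.9, True), (0.9, True), (0.9, False),
    (0.7, True), (0.7, True), (0.7, False), (0.7, True),
    (0.3, False), (0.3, False), (0.3, True), (0.3, False),
]

# Group outcomes by the probability the forecaster stated,
# then compare stated probability with observed frequency.
buckets = defaultdict(list)
for prob, occurred in records:
    buckets[prob].append(occurred)

for prob in sorted(buckets):
    outcomes = buckets[prob]
    observed = sum(outcomes) / len(outcomes)
    print(f"said {prob:.0%}: happened {observed:.0%} of the time")

# The Brier score is the mean squared gap between each forecast
# and its outcome (1 if it happened, 0 if not); lower is better.
brier = sum((p - o) ** 2 for p, o in records) / len(records)
print(f"Brier score: {brier:.3f}")
```

A perfectly calibrated forecaster's stated percentages would match the observed frequencies in every bucket; the gap between the two columns is exactly the miscalibration Gardner is talking about.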
Andy Spicer, who studies organizational behavior at City University of London, notes that part of the problem in the run-up to the financial collapse of 2008 was that individuals were no longer accountable for their own actions. “At banks, there was no direct relationship between what you did and the outcome. That produced irrational decisions.”
Gardner says, “there’s one feature you see over and over in people with good risk intelligence. I think it wouldn’t be too grandiose to call it the universal trait of risk intelligence—humility.” The world is complex, so be humble about what you know and you’ll come out better.
HM would note that there is such a thing as risk intelligence and that it can be increased. See the healthy memory blog post “Risk Intelligence.”
© Douglas Griffith and healthymemory.wordpress.com, 2016. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.