A plug for “The Science of Fear”

by bledsoe on September 25, 2008

I very much enjoyed "The Science of Fear," partly because I work in a field that's specifically about accurately describing things based on hard data, but also because I just like knowing that my assessment of reality actually matches reality, or at least as closely as it can in an uncertain world.

Gardner's basic premise is that humans have two primary ways of understanding the world (or, in particular, of assessing risk): we use our "heads" and we use our "guts" or instincts. Unfortunately, we use our guts, which are driven primarily by our emotions, far more than we use our rational heads, and this often leads us to erroneous conclusions about reality, risk, and things that we should (and shouldn't) be afraid of.

Gardner explores a number of examples where this tendency gets us into trouble (e.g., fear of terrorism, flying, chemicals, etc.), and also how it allows organizations to manipulate us (e.g., the news media, advertisers, politicians), but he first explains several specific ways that research has shown this tendency works against us in our attempts to accurately assess risk.

  1. Confirmation bias - once a belief is in place, we tend to take note of evidence that supports that belief, and ignore or screen out evidence which might contradict it.
  2. Group polarization - when people who share the same beliefs get together in groups, they tend to become more convinced of, and more extreme in, those beliefs.
  3. The availability heuristic (aka, The Example Rule) - if examples of something can be recalled easily (e.g., a horrific shark attack), we tend to assume that thing must be common.
  4. The Law of Similarity (aka, Appearance Equals Reality) - If it looks like a duck, we tend to assume it's a duck.
  5. The Anchoring and Adjustment heuristic (aka, the Anchoring Rule) - when we are uncertain about the correct numerical answer to a question and attempt to make a guess, we tend to grasp at the most recent number we've heard and adjust it slightly, even if that number is completely irrelevant to the question at hand.
  6. The representativeness heuristic (aka, The Rule of Typical Things) - how "typical" we think something is influences our assessment of how statistically likely it is (similar to The Example Rule).
  7. The Affect heuristic (aka, The Good-Bad Rule) - if something "seems" dangerous, we tend to assume it probably is dangerous. If it "seems" harmless, we tend to assume it probably is harmless.
  8. A generally poor understanding of numbers and probability - Gardner actually names several specific incarnations of this, including "denominator blindness" and "probability blindness," as well as discussing the fact that we are almost always suckers for a good story (especially a vivid, violent, and/or emotional one) and the fact that risk is inevitable and almost always involves tradeoffs.
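
"Denominator blindness" is easy to illustrate with a few lines of arithmetic. The numbers below are hypothetical, invented purely for illustration (they are not from Gardner's book): a vivid, heavily reported hazard can involve far fewer deaths *per exposure* than a mundane one, but the raw count is what sticks in our guts.

```python
# Hypothetical figures for illustration only -- not taken from the book.
# "Denominator blindness": an absolute count sounds alarming until you
# divide by the size of the population actually exposed to the risk.

shark_deaths = 5            # vivid, heavily reported (made-up count)
beach_visits = 50_000_000   # made-up denominator: exposures to the risk

fall_deaths = 17_000        # mundane, rarely reported (made-up count)
population = 300_000_000    # made-up denominator

shark_rate = shark_deaths / beach_visits
fall_rate = fall_deaths / population

print(f"Deaths per beach visit:   {shark_rate:.2e}")
print(f"Deaths per person (falls): {fall_rate:.2e}")

# The "scary" risk turns out to be orders of magnitude smaller
# once the denominator is taken into account.
```

The point isn't the particular numbers; it's that our gut reacts to the numerator alone, while an accurate risk assessment requires the whole fraction.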

Gardner does a good job of explaining and applying these concepts in ways that are accessible to non-scientists. A good introduction to the field of risk assessment.
