The Science of Fear

'The Science of Fear' by Daniel Gardner (ISBN 0452295467). Confirmation bias leads us to accept perceived facts that fit our existing worldview more readily than we weigh all of the evidence objectively. Some corporate leaders respond to disruptive change by making targeted, courageous moves toward new market opportunities. Many companies, however, approach risk with a strategic framework built around mitigating and managing probable consequences, and that line of attack can raise ever higher protective walls without guarding against the greatest risks: the ones that remain unidentified. The uncertainty advantage is something different: an approach that compels managers to treat the unknown as a market differentiator and an opportunity to give free rein to innovative solutions that appeal to customers, investors, strategic partners, regulators, and competitors. In short, it is an opportunity to go well beyond the conventional meaning of risk management, which is seeking ways to achieve the best of the worst outcomes, and instead create new and sustainable value out of uncertainty.

In his book, The Science of Fear: How the Culture of Fear Manipulates Your Brain, New York Times bestselling author Daniel Gardner describes some of the pitfalls we face when it comes to framing risk properly:

Once a belief is in place, we screen what we see and hear in a biased way that ensures our beliefs are “proven” correct. Psychologists have also discovered that people are vulnerable to something called group polarization—which means that when people who share beliefs get together in groups, they become more convinced that their beliefs are right and they become more extreme in their views. Put confirmation bias, group polarization, and culture together, and we start to understand why people can come to completely different views about which risks are frightening and which aren’t worth a second thought.

It’s also much easier to simply fear what we can readily recall to memory. Gardner uses Daniel Kahneman’s two systems of thought to explain:

You may have just watched the evening news and seen a shocking report about someone like you being attacked in a quiet neighborhood at midday in Dallas. That crime may have been in another city in another state. It may have been a very unusual, even bizarre crime—the very qualities that got it on the evening news across the country. And it may be that if you think about this a little—if you get System Two involved—you would agree that this example really doesn’t tell you much about your chance of being attacked, which, according to the statistics, is incredibly tiny. But none of that matters. All that System One knows is that the example was recalled easily. Based on that alone, it concludes that risk is high and it triggers the alarm—and you feel afraid when you really shouldn’t.
