In Part One of this pair of posts, I offered the bad news that the human system of risk perception, which has done such a fabulous job of figuring out the relatively simple and obvious risks we've faced so far, may not be the brightest bulb to light up the darkness of the complex risks we face in the future. The problem is that the human risk perception system is based more on emotion and instinct than on reason and rationality, and that bodes poorly for dealing with the immensely complex threat we all face from living unsustainably on Planet Earth. Six billion of us - due to reach 9 billion within 40 years - are taking too much stuff from, and dumping too much waste into, a finite biological system. We are already starting to experience the consequences, from climate change and deforestation to the loss of clean water and fish from the ocean; we're even running low on basic non-renewable resources. Yet we're counting on a risk perception system to save us that is better designed to protect us from snakes and the dark than from global abstractions laced with technological complexity and unknowns.
That's the bad news. The good news is, we know it. We know that the human system of risk perception, emphasizing instinct over intellect and feelings over facts, can get things wrong. We know that our risk perception system, for all its powers, is a risk in and of itself. And we have figured out a lot of the details about how the human risk perception system works. Hopefully we're smart enough to realize that if the system can get us into trouble, we'd better use what we know about how that system works to avoid its pitfalls.
The first four chapters of my book, How Risky Is It, Really? Why Our Fears Don't Always Match the Facts, describe the Affective Risk Response System: how it works...what makes some risks scarier than others...why we're too afraid of some smaller risks and not afraid enough of some of the big ones, like those arising from our unsustainable ways. For those details I'm afraid you'll have to read the book, and I hope you do. But here, for free, are a few initial suggestions, summarized from Chapter 5, "Closing the Perception Gap," for how we can use what we've learned about the psychology of risk perception to think about risk a little more carefully, and hopefully make healthier choices.
1. Take your time! Our risk perception system makes up its mind subconsciously, and quickly, before we have all the facts. That "Blink" instinct may be good for avoiding simple and immediate dangers, but it's not the most thoughtful way to figure out what to do about complex future threats like climate change. So in the name of making healthier choices, don't automatically go with what feels right in the first instant. Keep an open mind and give yourself some time, even just a few minutes, to get more information and think things through. Give the ‘thinking' part of the process a little room to do its part.
2. Don't be a tribal ditto head! Research has found that we shape our opinions to agree with the tribes/groups with which we most strongly identify. That strengthens the tribe, and the tribe's acceptance of us as members in good standing, both important because as social animals we depend on our tribes literally for our survival. But when the issue is your health, do you want to have your own opinion, or just somebody else's? Don't just get your information from people or organizations with which you already agree. And apply a little healthy skepticism to any source of information. You may love Greenpeace, or conservative Senator James Inhofe, but neither is a reliably neutral source of information about climate change.
3. Beware optimism bias. We're overly optimistic about what lies down the road when the details are hazy. Try to envision things as if they are imminent. That will give you a more realistic feeling about the risk you're judging. (I bet the exciting SCUBA trip diving with sharks six months from now sounds a little scarier if you imagine yourself standing on the edge of the boat about to jump in, looking down at the fins in the water!)
4. Think about tradeoffs. Most choices involve both risks and benefits, but we usually put more emphasis on the risks. Which could be risky! If you worry so much about the risk of mercury that you choose to forego seafood, you lose the heart-healthy benefits of the fish. And don't forget risk-risk tradeoffs, when getting rid of one risk leaves us with another. Our fears of nuclear power made it more profitable for utilities to generate electricity with coal and oil. Neither is risk-free, but we traded one risk for a much bigger one.
5. Don't be fooled by how a risk feels. A risk that's natural feels less scary than one that's human-made, but solar radiation is riskier than radiation from nuclear power or cell phones or power lines. Risks feel safer if you have a sense of control, but driving is riskier than flying. A risk you choose to engage in feels less risky than a risk that's imposed on you, but you're at greater risk driving while using a cell phone than you are from nearby drivers doing the same thing.
When you get in a car you wear a seat belt, right? You know there is a risk out there on the road, so you use the tool provided to help reduce that risk. Our knowledge of the psychology of risk perception is like that seat belt. We can use it as a tool for reducing the risk that can arise when, because of our instinctive reactions to risk, we get risk ‘wrong'. There are a lot more suggestions for making healthier choices about the risks you face, or fear, in Chapter 5, "Closing the Perception Gap," of How Risky Is It, Really? Why Our Fears Don't Always Match the Facts. It's important stuff, for the choices we make about the immediate risks we face as individuals, and the long-term risks we face as a species. So in addition to these few quick suggestions, I have posted a long excerpt from the book, free, that goes into a lot more detail. I hope you find the excerpt useful.