The upside to researching is that you’re always learning something new. The downside is that, sometimes, you have to trash some of your most treasured beliefs and illusions. Both happened during the course of writing my new book, Mastering the Art of Quitting. My image of myself as a reasoning person, attentive to detail, and a thriving realist took a beating from which I’ve yet to recover.
How about you? Do you think you’re thinking when you make a decision? It’s the beginning of a new year and the likelihood is that you’re thinking about changing up parts of your life. Are your thought processes dependable? The science says not. The degree to which our thinking is unconscious, automatic, influenced by external cues we don’t even register as being there, and biased in one way or another may come as a shock to you, as it did to me. I’ve cast these thoughts about thinking into the first person for quick identification.
1. I base my decisions on facts.
A special shout-out to Nobel laureate Daniel Kahneman and the late Amos Tversky for giving us the “availability heuristic,” which explains the quick and ready answers and facts that pop into our heads which, alas, have nothing to do with thinking and are the ones we’re most likely to rely on when we make a choice or decision. The bottom line is that this mental shortcut—which highlights the most recent, most often repeated, and most mentally available “facts”—results in our overestimating the importance of certain bits of information and lures us into thinking that good things as well as bad ones are more probable than they actually are. This kind of fast “connect the dots” thinking was valuable in human evolutionary history, when dangers and perils—as well as rewards—were largely physical and required fast responses. In 2014, most of our choices are far more benign. In a hyper-connected world, though, we’re even more susceptible to the “available” facts as they pop up.
Quick quiz here: What animal is most likely to kill an American? The word “shark” has doubtless floated into your head, as it did into mine, but the real answer is “cow,” followed by “horse,” according to a study published by UC Santa Barbara. The reason you thought “shark” isn’t the frequency of shark attacks but the disproportionate amount of media attention they get; the availability heuristic guarantees that the most publicized danger will come to mind first. You’d be safer if you paid more attention to Elsie and Flicka, no matter how many times you’ve seen Jaws. This holds true for good things (your neighbor made a killing on the market and so will you) and the bad (the apparent ubiquity of sinkholes).
2. I weigh the pros and cons carefully.
Not true, either. Human beings, it turns out, are a conservative bunch and even though we love to think of ourselves as open to opportunity and able to take risks, the chances are excellent that, as you weigh your options, you’ve focused on what you’ve already got “sunk” into whatever decision you’re pondering. It’s called the “sunk-cost fallacy” and the reality is that you’re really going to be thinking about the time, energy, or money you’ve already got invested more than anything else. This particular fallacy is the engine for our propensity for hanging on to unsatisfying relationships, jobs, and everything else long past their expiration date. So much for a level playing field based on reasoning.
3. I think more logically than other people.
Probably not. Study after study has shown that Americans tend to think of themselves as “above average” in almost every domain! A famous study in the 1970s conducted by the College Board showed that 70% of those surveyed thought they had better-than-average leadership skills. Even more astonishing, when asked to rate their ability to get along with others, 60% put themselves in the top 10%, and 25% ranked themselves in the top 1%! (If that were true, where are all those people we have trouble dealing with coming from?)
It’s far more likely, actually, that you’re overestimating your ability to think, as well as your skills and chances at success. So you can scrap all those affirmations meant to boost your self-esteem and try taking a hard look at yourself, even though the likelihood is that you’re not going to come up with an assessment that’s actually realistic. That’s exactly what a paper by David Dunning, Chip Heath, and Jerry M. Suls suggests; it turns out that almost everything we think about ourselves is colored by one bias or another, including our ability to forecast how generous we’ll be, how quickly we’ll get something done (not surprisingly, people underestimate the time anything will take), and how we’ll act or react in some future situation.
Garrison Keillor had it right: “Welcome to Lake Wobegon, where all the women are strong, all the men are good-looking, and all the children are above average.”
4. I am an objective thinker.
Our collective lack of objectivity is, in part, a corollary of the “above average effect” and the general assortment of biases which accompany our “thinking.” Interestingly, while we’re all pretty adept at noticing the biases in other people’s thinking and arguments, we’ve actually got what researchers Emily Pronin, Daniel Lin, and Lee Ross have called a “bias blind spot” when it comes to ourselves. Surprise, surprise: I’m objective and you’re not! In fact, generally, when it comes to self-assessment, David Dunning and his colleagues summarize the situation with these words: “In general, people’s self-views hold only a tenuous relationship with their actual behavior and performance.” And they go on to explain why in somewhat horrifying detail for close to forty pages!
5. I’m good at anticipating my own reactions.
While it’s true enough that, according to Timothy Wilson and Daniel Gilbert, we’re pretty good at figuring out whether a future situation will make us feel happy or unhappy, good about ourselves or miserable—that is, when the choices are simply oppositional—human beings tend to oversimplify their vision of how they’ll respond, and are downright lousy at predicting when a situation is relatively complex or nuanced, or when there might be a certain amount of motivational conflict.
The best and perhaps most persuasive study I’ve read was conducted by Julia Woodzicka and Marianne LaFrance, who asked close to 200 women how they would react if a slightly older male asked them inappropriate questions—such as whether they had a boyfriend, whether men found them desirable, whether they thought women should wear bras to work—during an interview. Some 62% of the women imagined that they’d be proactive, telling the guy off in some way; 28% said they’d simply bail and walk out; 65% said they’d refuse to answer at least one question.
I know you’re all cheering and saying “Brava!” to those young women for standing up for themselves, but then the researchers had the participants go through what they believed was a real interview for a lab assistant position. Half of the women were asked the harassing questions by the male interviewer; the control group was asked weird and random questions that weren’t harassing. More than half of the women ignored the harassment. And while 36% asked why they were being asked these questions, four out of six only asked at the end of the interview. No one walked out.
The point is that when we imagine a future scenario, we do so in a vastly simplified way. Not only that but it’s our very best self—the one who is genuinely above-average—who shows up to deal. The more complicated a future situation turns out to be (you don’t like being harassed but you really need a job), the more unlikely it is that your prediction will be right.
6. I pay close attention to detail.
Not anywhere near as close as you think, but, in this case, your brain—and mine—is largely to blame. The brain is bombarded by so many stimuli that it has to take shortcuts so that you can form a picture, unconsciously filling in the blanks so that you achieve coherence; memory is not a video camera. You’re not aware that you’re doing this, of course, but it explains why eyewitness accounts are notoriously unreliable; they may be heartfelt, but they are not necessarily true.
Perhaps the most graphic and renowned experiment to show how limited our powers of perception are when we are concentrating—and it became a book, too—was conducted by Daniel Simons and Christopher Chabris; their article was called “Gorillas in Our Midst” (cute, no?). The experimenters had participants watch a video of a basketball game and count either the number of passes or the number of bounces among the players; seconds into the video, someone dressed in a gorilla costume walked into the middle of the players. The number of people who missed the gorilla? I know you’re thinking how could anyone miss the gorilla, but the truth is that more than half of the people did.
You are doubtless identifying yourself with the people who did see the gorilla—I know, I felt the same way—but that’s probably thanks to the above-average effect. The kind of “blindness” this experiment exposed is called “inattentional blindness,” but other experiments demonstrate what’s called “change blindness” as well, which refers to the inability to notice or register an alteration of an important detail. Studies have shown that people fail to notice that the heads in a photograph of two people have been switched (!!); similarly, an experiment conducted by Daniel Simons revealed that change blindness actually takes place in real-life situations. An experimenter posed as someone lost on a college campus, map in hand, and solicited the advice of passersby. But, as the experimenter and the pedestrian were talking, two men carrying a door passed between them, blocking the experimenter from view for a few moments. It was then that another experimenter took the first experimenter’s place, also holding a map. Once again, only half of the pedestrians noticed the switch!
The upshot of all this? You need to check your thinking if you’re about to make a change or a decision which depends on your ability to think. Descartes famously said Je pense donc je suis (I think therefore I am). It would appear that the truth is a tad more complicated than that.
Copyright ©2014 Peg Streep
VISIT ME ON FACEBOOK: www.Facebook.com/PegStreepAuthor
READ MY NEW BOOK: Mastering the Art of Quitting: Why It Matters in Life, Love, and Work
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus & Giroux, 2011.
Dunning, David, Chip Heath, and Jerry M. Suls, “Flawed Self-Assessment: Implications for Health, Education, and the Workplace,” Psychological Science in the Public Interest 5, no. 3 (December 2004): 69-106.
Pronin, Emily, Daniel Y. Lin, and Lee Ross, “The Bias Blind Spot: Perceptions of Bias in Self versus Others,” Personality and Social Psychology Bulletin 28, no. 3 (March 2002): 369-381.
Wilson, Timothy and Daniel Gilbert, “Affective Forecasting,” Advances in Experimental Social Psychology 35 (2003): 346-411.
Woodzicka, Julia and Marianne LaFrance, “Real versus Imagined Gender Harassment,” Journal of Social Issues 57, no. 1 (2001): 15-39.
Simons, Daniel J. and Christopher F. Chabris, “Gorillas in Our Midst: Sustained Inattentional Blindness for Dynamic Events,” Perception 28 (1999): 1059-1074.