How many times have you heard yourself or a friend say about an event, “I remember it as if it were yesterday. Every detail is etched in memory”? I found myself saying this last week in reference to 9/11, and I really meant it—even though I know, from a scientific point of view, it’s just not true.
How many times have you argued with someone close to you about the memory of a shared experience and been so absolutely positive that your recollection of the event was on the money, and that he or she was dead wrong?
We all like to think of our brains as cameras and our memories as snapshots carefully preserved for us to return to again and again. Similarly, we like to believe our thought processes—reflecting on a memory or on something that just happened—are reliable, rational, and conscious.
Nothing could be farther from the truth.
The brain is bombarded by stimuli—every day, every minute, every second—and has to make sense of all those unconnected dots by filling in the blanks for us without our conscious awareness. What’s stored in our memory is nothing more than a shorthand notation of a moment, and the shorthanding also takes place outside of our conscious awareness. When we recall something—what happened on 9/11, the day you met your beloved, the events that led up to a pivotal moment in your life—your brain isn’t retrieving the memory but, as Daniel Gilbert has put it, reweaving the experience. Similarly, unconscious biases shape how we think about events past, present, and future.
Do you find that disconcerting? I certainly do.
So let’s look at the magician behind the curtain—the brain—and see if, by gaining an understanding of the tricks it plays, we can gain some control.
1. Suggestibility and the alteration of memory.
How permeable memory is, and how the reweaving of recall can be reshaped, is shown not just by the unreliability of eyewitness accounts but also by the relative ease with which “memories” can be altered, as the work of Elizabeth Loftus and others has shown. In one experiment, subjects watched a series of slides depicting a car hitting a pedestrian. Half of the participants saw the car approach a stop sign; the other half saw a yield sign before the car hit the person. After the slide show, the subjects were asked 20 questions, one of which concerned the sign: half of the participants were asked about the sign they’d seen (either stop or yield) and half were asked about the sign they hadn’t. Fully 41% of the subjects who were asked about a sign they didn’t see later thought they’d seen it. Just asking a question was enough to infiltrate their memory!
Another experiment showed that entirely false memories can be induced without difficulty or fanfare. Participants watched four videos of crimes, each about four minutes long: a bank robbery, a liquor store holdup, a warehouse burglary, and a domestic dispute. A week later, the participants filled out a multiple-choice questionnaire that included 10 questions about each event they’d seen and 10 questions about a drug bust they hadn’t seen. The result: 64% reported remembering the drug bust video. Interestingly, when asked to provide an open-ended description of the drug bust, 75% of subjects imported details from the other four crimes—and 24% of the details they supplied were totally new.
Memories can be manipulated deliberately—by implanting a false memory—but they can also be altered unconsciously simply by imagining, as another experiment showed. Researchers had participants go through a checklist of events from their childhoods; one example was falling and putting your hand through a window. Two weeks later, they asked the participants to imagine the event in some detail for one minute—and just that one minute of imagining was enough to make 24% of participants confident that an event like that had happened to them, even though they’d reported earlier that it hadn’t.
2. The hindsight bias.
Our brains are in the business of making sense of the stimuli surrounding us and the events and encounters we experience. This happens automatically and unconsciously. As a result, events that seem surprising at first ultimately appear unsurprising or even inevitable. This process is known as the hindsight bias. Whether it’s the 2008 financial crisis, the winner of a football game, or the results of an election, we will find ourselves saying—despite what odds we were giving or what opinions on the outcome we held beforehand—“I knew it would roll out this way.” Once something has happened, our earlier memories of what we thought might happen disappear, and we are suddenly confident that we knew what would take place all along.
This particular bias makes events seem more predictable and explicable than they really are, and the world we live in appear more stable. Hence the Monday morning quarterbacking: “I knew our venture was doomed to failure because we hired the wrong marketing team” or “I knew she’d end up leaving me. She’s too impatient and unforgiving.”
The hindsight bias helps make the world we live in seem less chaotic and tends to soothe our upset. But it also amplifies our tendency to oversimplify situations and gets in the way of our actually thinking about what happened and figuring out what we can learn that might inform our actions in the future.
This is especially true when the hindsight bias kicks in after a failure, because it can cut off the counterfactual thinking that would have us consider the alternative actions we might have taken to ensure success.
Interestingly, the digital world—packed with visual information—has, according to the work of Neal J. Roese and Kathleen D. Vohs, actually made the hindsight bias even stronger. They found that watching an animation more than doubled the hindsight bias because, as they write, “Animations can whitewash the guesswork and assumptions that go into interpreting reconstructions. By creating a picture of one possibility, they make others seem less likely, even if they’re not.” Needless to say, that’s precisely why both prosecutors and defense lawyers love visual recreations.
3. The impact bias.
As discussed by Timothy Wilson and Daniel Gilbert, the impact bias refers to our inability to predict how a future event will make us feel, and for how long. This bias is true for the events that make us happy as well as those that make us sad; generally, people overestimate how powerfully an event will shape their lives and their emotions.
Why is that?
When we think about a future event, we have to create a mental picture of it. That sounds easy enough but when you’re imagining something you haven’t yet experienced, you’re likely to encounter what Wilson and Gilbert call the “misconstrual problem.” Put more simply, you’re bringing up the wrong picture. Say you are predicting how you’ll feel on your wedding day: You picture yourself walking down the aisle, resplendent in gown or tuxedo, transported into bliss. Every moment is perfect and you are happier than you’ve ever been. Of course, what you haven’t pictured is the sudden thunderstorm that forces the ceremony off the lawn and into the packed chapel, or the snide remarks your aunt keeps making about your new in-laws, or how horribly drunk two of your bridesmaids get.
When we imagine the future, we all oversimplify both the situation and our feelings about it, conveniently forgetting that most experiences are more textured and complicated than pure bliss.
We also make mistakes in how we frame situations when we’re making a decision. When we compare scenarios A and B, we tend to focus on what makes them different instead of the ways in which they are similar. Say you’re considering moving from New York City to San Diego because you want to change your life. You think about the weather (no winter in San Diego), the cost of living (cheaper, too), and how you’ll be able to drive rather than taking a crowded subway or bus. But what you’re not focused on is that it’s your social life in New York—or lack of it—that’s making you unhappy, and moving to a place where you’ll arrive knowing no one is not necessarily going to make that better.
The impact bias saves us when bad things happen, because we recover faster than we think we will but, alas, it also gets in the way of our decision-making and our ability to figure out what will really make us happy.
4. The confirmation bias.
Would you use the following words or phrases to describe yourself when you make a decision or take a position for or against something?
- Good judge of the facts
- Attentive to reason
- Skilled at evaluating an argument
Sadly, this list is pretty much science fiction because of the confirmation bias, one of the many shortcuts the brain takes which leaves us thinking “fast,” as Daniel Kahneman puts it, and pretty much automatically, rather than carefully processing. Research shows that instead of judging and weighing all the facts, we listen to and give credence to those facts and arguments that align with or reflect beliefs we already hold. I'm sure you're shaking your head and saying, “Not me!” as you read this. Sorry; there’s no point in your fooling yourself.
Once again, this bias is unconscious and needs to be distinguished from the kind of conscious cherry-picking we do when we’re pitching a point of view to a boss or spouse, or in a courtroom. That’s knowing behavior; the confirmation bias has us thinking that we’re thinking and evaluating rationally when we’re actually not.
The research on the confirmation bias is voluminous and explains why if you’re for something—gun control or gun rights, to cite one example—you’re not likely to be swayed by the opposite argument. It would appear, then, that the “undecided” voter is just someone who hasn’t untangled a candidate’s messages sufficiently to figure out how closely they match his or her own beliefs.
The mind can be forced out of fast and automatic thinking and the confirmation bias into a deeper process, however, as two experiments conducted by Ivan Hernandez and Jesse Lee Preston showed. The researchers chose two scenarios in which the confirmation bias is known to operate—discussions of capital punishment and juror decision-making. (Research shows that verdicts closely mirror the jurors’ initial impressions of the defendant.) What they found was that participants processed information differently depending on how much effort had to be put into reading it: Large, clear type resulted in fast processing and the confirmation bias; in contrast, a deliberately hard-to-read font, or pages that had been degraded through multiple rounds of photocopying, forced participants into a more analytical mode and disarmed the biases. The authors wrote that just as speed bumps cause you to drive more slowly, difficulty in reading prompts “a slower, more careful mindset when making judgments, even when one comes to the issue with existing biases.”
All in all, the brain is one Hell of a magician, don’t you think?
Copyright© Peg Streep 2014
VISIT ME ON FACEBOOK: www.Facebook.com/PegStreepAuthor
READ MY NEW BOOK: Mastering the Art of Quitting: Why It Matters in Life, Love, and Work
READ Mean Mothers: Overcoming the Legacy of Hurt
Gilbert, Daniel. Stumbling on Happiness. New York: Vintage Books, 2006.
Schacter, Daniel L., “The Seven Sins of Memory: Insights from Psychology and Cognitive Neuroscience,” American Psychologist (March 1999), vol. 54, no. 3, 182-203.
Loftus, Elizabeth F., David G. Miller, and Helen J. Burns, “Semantic Integration of Verbal Information into a Visual Memory,” Journal of Experimental Psychology: Human Learning and Memory (1978), vol. 4, no. 1, 19-31.
Loftus, Elizabeth F., “Illusions of Memory,” Proceedings of the American Philosophical Society (March 1998), vol. 142, no. 1, 60-72.
Roese, Neal J. and Kathleen D. Vohs, “The Visualization Trap,” Harvard Business Review (May 2010).
Wilson, Timothy D. and Daniel T. Gilbert, “Affective Forecasting,” Advances in Experimental Social Psychology (2003), vol. 35, 345-411.
Nickerson, Raymond S., “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises,” Review of General Psychology (1998), vol. 2, no. 2, 175-220.
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus & Giroux, 2011.
Hernandez, Ivan and Jesse Lee Preston, “Disfluency Disrupts the Confirmation Bias,” Journal of Experimental Social Psychology (2013), vol. 49, 178-182.