It was an experiment that rocked the world. People were asked to watch a video clip of six young college students passing basketballs around. Three ball players wore white, three wore black. The task assigned to viewers was to count the number of times the players dressed in white passed the ball.
Were the students good at counting? Who cares! The amazing thing about this experiment was what the students missed: A woman wearing a gorilla suit walked across the screen, stopped to face the camera, thumped her chest, and then sauntered away.
When Daniel Simons and Christopher Chabris tried out different versions of this experiment on 192 people, more than half failed to notice the weird event unfolding right before their eyes.
Follow-up studies have confirmed the effect, and if you’re thinking college students are a particularly unobservant group, think again. A recent study of senior citizens found that people over the age of 60 were far less likely than younger people to notice the gorilla.
The lesson is clear. Human beings have a limited ability to divide their attention. If you focus on one task, you’re apt to miss information in the environment that is irrelevant to that task. Even if that information is right under your nose.
That news is bad enough — particularly if you share the road with a driver who is talking on the phone. But what happens if you try to focus on more than one thing? What if you are a multitasker, coping with several different demands on your attention at once?
UCLA psychologist Patricia Greenfield is worried about more than your driving. She’s concerned about your ability to think critically, analyze, concentrate, and reflect. And she’s worried about the kids growing up today who spend far more time playing video games and using smartphones than they do reading a book.
Sure, people can think and learn while multitasking. But the quality of their thinking and learning is probably lower.
For instance, there’s the study that presented people with a weather prediction game. Participants were given cues and then asked to predict the weather. Some people were allowed to concentrate fully on the game. Others were asked to multitask. What happened?
The multitaskers performed no worse than the other people, but they seemed to be on “autopilot.” When asked, they had more trouble recalling which cues they had used to get the right answers. Overall, the multitaskers were less aware of their own thinking processes.
In another experiment, college students were asked to watch CNN Headline News. Some students watched the regular newscast, complete with all the distractions — the flashing icons, the stock quotes, and the news crawl that scrolls along the bottom of the screen. Other students watched the same newscast, but with the visual distractions edited out.
Students remembered fewer facts from the main news stories when they watched the visually-cluttered newscast.
And if you’ve ever wondered about the wisdom of letting kids surf the internet while they are listening to their teacher, here’s the study for you. Researchers gave college students laptops and asked them to attend a lecture. Half of the students were encouraged to use their laptops during the lecture. The other half were told to keep their laptops closed. In the pop quiz that followed, the students who surfed did considerably worse.
But wait a minute. Doesn’t habitual multitasking make you better at juggling your attention? Don’t video games — the kind that require you to track several objects at once — train you to multitask? Some people might argue that today’s kids are going to grow up with multitasking superpowers. But I’m very doubtful.
A few years ago, Eyal Ophir and his colleagues pitted the abilities of “chronically heavy media multitaskers” against those of people who tend to avoid high-tech multitasking.
What the researchers discovered was sobering. Not only were the chronic multitaskers no better than the one-job-at-a-time folks; they were actually worse.
When given the same tasks to perform, the chronic multitaskers had more trouble paying attention, selecting the right information to remember, and switching from one job to another.
And a more recent study suggests that frequent video gaming doesn’t train people to better cope with real-world multitasking. Video gamers are no better than anybody else at carrying on a phone conversation while driving a car.
So I'm more than a little concerned about the effects of high-tech multitasking. If a child divides his attention between iPods and email and smartphones and YouTube and television and computer games, he's not just spending less time reading or learning algebra. He is also becoming accustomed to the idea that fractured attention and superficial thinking are acceptable ways to get along. What happens when it's time to weigh the benefits and risks of a medical therapy? Or vote in an election? Or serve on a jury?
Studies show that people are easily manipulated by simple verbal tricks. If you tell them that 1 in 10 patients die after an operation, they view the operation as more dangerous than if you tell them that 9 out of 10 patients live. With reflection, people recognize such errors. But multitasking distracts us from reflection. Is multitasking making us dangerously stupid? It's time we gave this question some serious thought.
Greenfield PM. 2009. Technology and Informal Education: What Is Taught, What Is Learned. Science 323(5910): 69-71.
Portions of this article appeared in an earlier post, "Is technology making kids less thoughtful?" for the BabyCenter Blog. Text © 2012 Gwen Dewar, Ph.D., all rights reserved.