This Christmas morning, many youngsters will wake to find video games, gaming systems, handheld gaming consoles, smartphones, and laptops wrapped underneath their trees. Most of these children will unwrap their gifts with glee and spend much of their holiday break blasting zombies, slaying ogres, shooting enemy squadrons, riding virtual skateboards, tackling wide receiver avatars, and crafting pickaxes and mining ore.
Good times, for sure. But if you’re a parent, you can’t help but wonder about media reports that too much screen time is bad for kids. You’ve seen how a virtual blocky landscape can turn your active elementary-schooler into a couch potato. The glassy-eyed stare your middle-schooler gets after an afternoon of two-dimensional soccer play. And the way your high-schooler’s smartphone makes her oblivious to your attempts to call her for dinner. Is there any scientific evidence that digital devices are detrimental to child development?
Scientists know for sure that children spend a lot of time with technology. Children between the ages of eight and eighteen use high technology for an average of seven-and-a-half hours a day. That’s fifty-three hours a week—more than your full-time job.
Knowing the effects of frequent technology use is another story. Most of the experiments on the effects of videogaming and other sorts of screen-based entertainment have been done on adults. That means that scientists really have no idea what children’s frequent exposure to technology is doing to them. But several strands of research provide some hints.
Consider a study carried out by Gary Small, a neuroscientist at UCLA. When Small used functional magnetic resonance imaging (fMRI) to scan the brains of people reading pages from a book, he found no differences in brain activity between regular Internet users and Internet novices. But when both groups carried out a Google search, the frequent users showed twice as much signaling in a specific brain network that’s responsible for decision-making and complex reasoning. This suggests that technology use can make our brains more active.
Then Small did something really interesting. Over the next five days, he asked both groups to search the Internet for one hour a day. When they returned to the lab, Small found that the exact same neural circuitry became active in the Internet novices. Small had changed their brains. Five hours on the Internet and people’s brains had rewired themselves.
What do these findings mean for the typical child who spends hours a day in front of a screen? If five hours of Internet searching can change an adult’s brain, then what does an entire childhood heavy with technology do to a developing neural system? Imagine re-doing Small’s experiment with children—except replace the one hour a day for five days with seven-and-a-half hours a day for ten years. You’d expect a doozy of an effect.
Small’s findings really are no surprise. They fit with other studies showing that adult heavy technology users develop certain cognitive advantages over adults with limited technology experience: better short-term memory, faster reaction times, sharper peripheral vision, and superior hand-eye coordination. There’s even evidence that laparoscopic surgeons who are regular gamers make fewer operating room errors than their nongaming peers.
Do these studies mean that we should schedule time with the Mario Bros. and Kirby if we want our children to get a cognitive leg up on their peers? Not exactly. Decades of research have shown that these very same cognitive skills are fostered outside of videogaming, in real-world pursuits: pickup games, pretend play, backyard expeditions, and treehouse gossip sessions. An afternoon in the backyard making mud pies, building forts, and playing freeze tag always trumps an afternoon spent on the Barcalounger staring at a virtual angry bird.
Small’s findings demonstrating the brain-changing effects of technology complement other research showing that high doses of specific sorts of experiences can strengthen the supporting brain regions. Musicians have more grey matter in areas of the brain involved in finger movements. Athletes’ brains are meatier in regions responsible for hand-eye coordination.
Other work demonstrates that you don’t need to do anything at all to change your brain. When Harvard neuroscientist Alvaro Pascual-Leone asked volunteers to learn a five-finger piano piece, he found that the parts of the motor cortex devoted to the needed finger movements had overtaken surrounding areas. This finding was expected on the basis of the typical brain changes found in musicians and athletes. Then Pascual-Leone asked other volunteers to hold their hands still while imagining moving their fingers to play the music piece. He found that the very same portions of the motor cortex that had expanded in participants who actually played the piano also had grown in those who had merely imagined it.
These findings mean that imagination can change the brain. Amazing! But they also make me wonder: if something as innocuous as imagining a piano lesson can bring about a visible physical change in brain structure, and presumably some minor change in the way a player performs, what changes might long stints of imaginary warfare in violent videogames bring about? We don’t know. That research still needs to be done.
What scientists know for sure is that technology is changing children’s lifestyles. The more time children spend with technology, the less time they spend socializing with real people. Preschoolers sit quietly side-by-side on the couch staring at their handheld consoles, kindergarteners play soccer on the screen and not the field, and middle-schoolers text their friends to tell them about Emma’s status update and the picture she posted of her new bedazzled smartphone cover. The worry is that as children center their friendships on screens, they are missing out on the sorts of personal interactions that help them develop important social skills.
There isn’t much research yet to confirm such suspicions, but there’s plenty of evidence that heavy adult technology users often lack social skills. Add to that the fact that children who are socially awkward not only risk being unpopular in childhood; their lack of social skills also increases their tendency toward later academic failure, criminality, drug abuse, and emotional disorders. On the flip side, children who are interpersonally skilled—who have a high social intelligence—do well in just about everything, even in the cognitive and academic realm. Children who are socially intelligent pretty much stay that way throughout life and consequently tend to be very successful adults.
Whether technology will make for socially unaware children is conjecture at this point and will remain so until scientists do the right long-term studies. We can assume that our children’s neural networks differ in substantial ways from ours, whose basic wiring was done when technology was less pervasive. We have some idea of how adults’ heavy use of technology can affect their brains, but we shouldn’t automatically apply those findings to children. Small’s research suggests that parents should be able to direct some of this technological wiring by controlling the range of experiences, high-tech and otherwise, that they give their children. And if so, then there seems to be a way to reap the cognitive benefits of modern technology while preserving traditional social skills. Make sure your child makes time for both. Balance is likely the key.