The Sensory Revolution
Our senses are under constant threat from the stimuli, routines, and ailments of the modern world. Fortunately, neuroscience is inspiring remedies that not only restore sensory input but radically alter it.
By Matthew Hutson published May 1, 2018 - last reviewed on November 12, 2021
"It's one of those cases of scientific serendipity," says Ian Morgan, a cell biologist at Australian National University whose work on vision is shaping how entire nations structure their school days.
In the mid-1990s, he was studying how the neurotransmitter dopamine interacted with the neural circuitry of the retina. In his searches for relevant papers, research on myopia (nearsightedness) kept popping up. When he had some spare time, he decided to look into it.
"I discovered two things," Morgan says. "There was very strong evidence that the prevalence of myopia was going up dramatically in parts of the world. This didn't fit with what the textbooks said—that myopia was genetically determined. So I filed that away. And the other thing, which is not recognized even among some clinicians: When you've got a low level of myopia, it's just a matter of correcting it with glasses, contact lenses, or, ultimately, refractive surgery. But at higher levels, you get a whole set of other things happening." In these cases, myopia develops into biological changes that can cause uncorrectable vision loss, even legal blindness. "I thought, This is a fundamental scientific issue, but it also might be a really big health issue."
The answers Morgan eventually found have added to a growing swell of knowledge about how our senses work, what makes them go wrong, how the modern environment can mislead them, and new ways to deal with all of these problems. Often our sensory perceptions are vivid and all-consuming as we experience a guilt-inducing dessert, awe-inspiring vista, rejuvenating rock concert, romantic embrace, or the stabbing pain of injury. Other times, our senses operate imperceptibly, as when we walk a familiar route to work on autopilot, lost in thought.
Our senses shape our health, mental fitness, and very notions of reality. Understanding how we truly interface with the world can guide the actions we need to take, individually and collectively, for a fuller experience of the universe and ourselves.
A Second Look
Rates of myopia have been increasing since the early 1900s. In some parts of East and Southeast Asia, young adults already experience the condition at rates higher than 80 percent, and 10 to 20 percent have high myopia. There's a nearly universal association between myopia and increased schoolwork, but that wouldn't fully explain the regional phenomenon. Morgan notes, for example, that Australians are pretty well educated, yet have low myopia rates. "We had a vague hypothesis that it was because Australian lifestyles are very outdoors-oriented," he says. Indeed, he and his colleague, Kathryn Rose of the University of Technology, Sydney, found that kids who spend more time outside are less likely to develop nearsightedness. But the mechanism wasn't known. Some researchers argued that close reading lengthened the eyeball or that viewing long distances protected it. Others argued that the vitamin D produced from sun exposure helped the eyes.
From his own research, Morgan knew two things: Sunlight produces dopamine in the eye, and dopamine prevents myopia. Other studies confirmed that bright sunlight helps the eyes grow to their proper shape; controlling for one's amount of reading, time indoors predicted myopia. And animals exposed to bright light, even without the UV rays that produce vitamin D, experienced less myopia. Large-scale studies of schoolchildren have since tested interventions: In a three-year study in China, 40 more minutes of recess reduced myopia rates by 23 percent, and in a Taiwanese study, locking students out of classrooms during recess for a year reduced rates by 53 percent. Morgan estimates that from ages 3 to 25, we need two to three hours of outdoor sunlight a day for optimal vision protection.
Mirroring the approach taken to address a range of other sensory challenges, myopia researchers have also looked to new technology. Electronic light boxes show some promise. Rigid "orthokeratology" contact lenses worn while sleeping can help reshape the eyeball, and atropine eye drops have, in some cases, altered eye growth and slowed the development of myopia, but the long-term safety of both interventions is unknown. Morgan is also working on a greenhouse-like classroom, but its adoption would be expensive; it requires more air conditioning. "The cheapest way to get bright light exposure is just to get the kids outside," he says. A national committee has recommended that Taiwanese schools make students spend more time outside; the government is considering whether to begin including myopia among its school performance indicators.
The lighting habits of the modern world cause other problems. For starters, artificial light is keeping us up. A recent paper found that exposing mice to continuous light for 24 weeks threw off their brains' circadian pacemaker, caused inflammation, and weakened muscle tissue. Messing with our day-night cycles starts a biological domino effect.
As neuroscientists gain a greater understanding of how our sensory inputs communicate with the brain and the rest of the body, they are helping to develop new tools for those living with sensory deprivations. A million and a half people have lost sight due to retinitis pigmentosa, the degradation of the retina's photoreceptors—cells that convert light into signals to the brain. "Bionic eyes" are being developed by groups including Second Sight in the United States, Retina Implants in Germany, Pixium Vision in France, and Bionic Vision in Australia. In each formulation, a camera mounted on a pair of glasses sends a signal to a small computer, also worn by the patient, which then sends a signal to an array of electrodes on or behind the retina. When the electrodes are active, the wearer sees small flashes of light called phosphenes. A few dozen electrodes are not enough to enable vision for facial recognition or reading, but they can help people with basic navigational tasks.
Nigel Lovell, a biomedical engineer at the University of New South Wales Sydney, started developing his Bionic Vision device 20 years ago. The current generation of the device is undergoing preclinical testing. Lovell is also working on gene therapies that could be incorporated into bionic eyes. Normally in gene therapy, a specially designed virus sneaks genes into cells, but that approach raises regulatory concerns. So Lovell is experimenting with electroporation, a method of using pulses of electricity to make cells more receptive to outside bits of DNA, which would then encourage nerve cells to grow tendrils toward the implanted electrodes, enhancing brain-machine communication.
Gene therapy can also directly repair errant retinal cells. In December, the FDA approved the first gene therapy for a genetic disease: Luxturna, a drug developed by Spark Therapeutics and the Children's Hospital of Philadelphia. A virus inserts a functioning version of the RPE65 gene into photoreceptors. A teenager who received the treatment told the FDA he could see the night sky for the first time.
Most of the world's 36 million blind people lost their vision due to nonretinal problems, such as cataracts and glaucoma. For them, Second Sight is working on a device similar to the bionic eye but that bypasses the eye and the optic nerve completely. Instead, it activates an array of electrodes on the visual cortex, an area on the surface of the brain at the back of the head that processes input from the eyes. Another group, from the Illinois Institute of Technology, has received $12 million from the National Institutes of Health to develop a device that will wirelessly transmit to up to 1,000 electrodes on the visual cortex. And the government's Defense Advanced Research Projects Agency (DARPA) just launched its Neural Engineering System Design (NESD) program, funding six labs developing methods for communicating with a million neurons for the restoration of either vision or hearing.
Implanting hardware in the eyeball or brain is one way to get at sensory loss. Another intriguing approach substitutes one sense for another. In 1969, Paul Bach-y-Rita described a set of experiments in which six blind subjects sat in a repurposed dentist's chair studded with a grid of 400 sharp nubs. They controlled a mounted television camera that would send signals to the grid, and electromagnets would vibrate the tips against the subjects' back according to the brightness of each pixel. After a few hours of training, the subjects—five of them blind since birth—learned to recognize a set of objects (a telephone, a toy horse) as well as other people in the room. The subjects spontaneously remarked that what they were sensing was not on their backs but in front of them. They were learning to see—with their skin.
A dentist's chair with a mounted camera is not portable, but in 1998 Bach-y-Rita founded a company that now sells a sleeker version: the BrainPort. A small camera on a pair of glasses sends signals to a grid of 400 electrodes that the wearer places on the tongue, an area much more sensitive than the back. Guided from inside their mouths instead of their eyes, wearers can navigate rooms and locate familiar objects.
Another product, vOICe (for "Oh, I see"), turns images from a camera into one-second bursts of sound. Pixels are encoded by sound frequency (for vertical position), timing (for horizontal position, left to right), and volume (for brightness). With this device, people learn to recognize broad patterns as the sound sweeps across a scene.
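The mapping just described can be sketched in a few lines of Python. This is only an illustration of the encoding principle, not vOICe's actual signal processing; the frequency range, sample rate, and the `voice_encode` helper are all assumptions made for the example.

```python
import numpy as np

def voice_encode(image, duration=1.0, sr=8000, f_lo=500.0, f_hi=5000.0):
    """Sketch of a vOICe-style image-to-sound mapping.

    Columns are scanned left to right over `duration` seconds,
    each row is assigned a sine-wave frequency (top of the image
    = high pitch), and pixel brightness (0 to 1) sets how loudly
    that row's tone sounds while its column is being scanned.
    """
    rows, cols = image.shape
    freqs = np.linspace(f_hi, f_lo, rows)        # one pitch per row
    samples_per_col = int(sr * duration / cols)  # time slice per column
    t = np.arange(samples_per_col) / sr
    audio = []
    for c in range(cols):
        tones = np.sin(2 * np.pi * freqs[:, None] * t)        # (rows, samples)
        mix = (image[:, c][:, None] * tones).sum(axis=0)      # brightness-weighted
        audio.append(mix)
    return np.concatenate(audio)

# A bright diagonal line becomes a pitch sweep across the one-second burst.
img = np.eye(8)
signal = voice_encode(img)
```

With training, listeners learn to hear such sweeps as shapes: a rising pitch over time is a line slanting upward, a loud low drone is a bright object near the bottom of the scene.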
These interventions succeed because the brain is extremely flexible and regions can adjust to take advantage of whatever information they find useful. One study found that after just five days of being blindfolded and performing a series of touch-intensive tasks, such as reading Braille, touch began to activate the visual cortex of subjects' brains. In fact, given its dedication to function rather than to a particular sensory modality, some researchers have made a case that this part of the brain should instead be called the "spatial cortex."
The brain is so adept at interpreting inputs that engineers and neuroscientists are exploring prostheses for entirely new senses, enabling us to experience everything from magnetic fields (by providing an internal compass) to someone else's thoughts (transmitted electronically from electrodes in one brain to another—in other words, telepathy). Why stop at overcoming blindness when you can give people X-ray vision?
Touch can substitute not only for vision but also for hearing. Some of the 360 million people with debilitating hearing loss can be helped by a cochlear implant or hearing aids, but the cost of the surgery and hardware for an implant can reach six figures, and many people who own hearing aids are too embarrassed to wear them. Stanford neuroscientist David Eagleman is hoping to help these people through his startup, NeoSensory.
One product Eagleman is developing is VEST, a vest embedded with 32 vibratory motors connected to a microphone and computer. He believes that with training, wearers will come to understand speech with their torsos. The concept is being field-tested now by two young girls who are blind and deaf. (The vest also appears in the second season of HBO's Westworld, for which Eagleman serves as scientific advisor.)
NeoSensory is also developing a wristband, known as Buzz, with eight vibratory motors. The aim is to help deaf people sense environmental sounds such as smoke detectors, sirens, barking dogs, and crying babies. It would also help them hear music or a friend's laughter and augment lip reading. "One of our first users was born deaf. When he put it on, he cried. It was the first time that the world had come to him," Eagleman says. A different version will address the high-frequency hearing loss that affects a third of the population as they grow older. For them, the ear will do most of the work, but for sounds like s and t, the wristband will fine-tune listening. "It turns out that it's quite easy for people's brains to combine these two data sources," Eagleman says.
"Making sense of sound is one of the most complicated jobs we ask of our brain," says Nina Kraus, a neuroscientist at Northwestern University, "because it requires microsecond timing, much faster than visual information is processed in the brain." Sound is defined by its evolution over time. To understand language, we need to process up to 30 phonemes, or word elements, per second, any one of which can completely change the meaning of what we're hearing. We can also localize the source of a sound by detecting differences in its arrival time in each ear as narrow as 10 millionths of a second. By comparison, it takes a neuron 100 times as long to "spike," or initiate a signal to its neighbors.
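To put those timing figures in perspective, a back-of-the-envelope calculation shows how fine that 10-microsecond resolution is. The speed of sound and the ear spacing below are round assumed values, not measurements from the studies discussed.

```python
SPEED_OF_SOUND = 343.0   # m/s in air at room temperature (assumed round value)
EAR_SPACING = 0.22       # m, rough ear-to-ear distance (assumed round value)

# Maximum interaural time difference: sound arriving from directly to
# one side travels the full head width farther to reach the far ear.
max_itd = EAR_SPACING / SPEED_OF_SOUND      # ~641 microseconds

# A 10-microsecond timing difference corresponds to a path-length
# difference of only a few millimeters:
resolvable = 10e-6 * SPEED_OF_SOUND         # ~3.4 millimeters
```

So the brain resolves source directions from timing cues spanning well under a millisecond in total, even though, as Kraus notes, a single neuron needs a hundred times longer just to fire.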
"You can imagine that this system would be vulnerable to disruption," Kraus says. Teenagers who have been concussed are less able to distinguish sound from noise—for example, catching words among party chatter. Kraus has also found that children with autism have difficulty processing pitch and intonation in language, making it hard to differentiate statements from questions, among other things. It's possible that the social and cognitive deficits in autism lead to this auditory disruption, but it's also possible, she suggests, that mishearing language contributes to the social and cognitive problems.
Other people can hear fine except for one problem: They can't recognize distinct voices. A study published last year outlined the neural deficits in two patients, each with a different form of phonagnosia, or voice deafness. One subject had apperceptive phonagnosia and couldn't tell whether two voices belonged to the same person. When trying to recognize voices (but not spoken words), voice-sensitive regions of her right hemisphere were less active than they are in other people. The other patient had associative phonagnosia and couldn't name the famous voices he heard, even though they sounded familiar. When his brain tried to recognize voices (but not spoken words), there was less activity between voice-sensitive regions in the cortex and the amygdala, an area important for emotional processing, than there is in other people.
There is little that can be done to treat phonagnosia, but a better understanding of how we recognize voices might help people without the rare disorder. Hearing aids and cochlear implants aren't designed to amplify cues that help distinguish voices, says Darcy Kelley, a biologist who studies vocal communication at Columbia University. Grandmothers, for example, might pick up the phone and be unable to recognize a grandchild's voice. It's an underrecognized problem, Kelley says, but as the population ages, and as new techniques in artificial intelligence and signal processing are developed, it could get more attention. Neuroscientific studies of voice recognition will play an important role. Once we know what those auditory cues are, Kelley says, we could redesign both phones and hearing aids to take advantage of them.
In 2016, 11.5 million Americans misused prescription opioids, and more than 40,000 died from opioid overdoses. The economic cost of the crisis has been estimated at half a trillion dollars. It's clear that we need better ways of relieving pain. Unfortunately, there's no clear target in the brain for pain relief. Different aspects of pain—its location, its intensity, its context, and its emotional impact—are processed in different areas. The neural activity of someone experiencing pain in a limb, someone experiencing pain in a phantom limb, and someone experiencing a normally pain-provoking stimulus but who is genetically incapable of feeling pain, all look strikingly similar. Pain is a subjective experience, like beauty, says Allan Basbaum, a neuroscientist at the University of California, San Francisco. When two people look at a Mondrian painting, their retinas and visual cortices may respond in the same way, and yet "one individual is emotionally affected by the painting because they know about Mondrian, and they're willing to spend $15 million to buy the painting," Basbaum says, "and the other person sees it and just walks right by."
Basbaum believes we need to stop pain before injury signals can reach the brain. The difficulty is at least twofold—targeting only the source of the pain, and targeting only the pain, not any other sensation. Aspirin fails on the former count, novocaine on the latter. During one lecture on painkillers, Basbaum says, an audience member apologized to him for asking what the questioner thought was a dumb question: "How does the aspirin know where to go?" "I said, 'That's brilliant!'" Basbaum recalls. "'The aspirin has no clue where to go!'"
Basbaum points to four strategies being pursued by researchers. The first, gene therapy, is one he's personally exploring. In pain disorders, nerve cells generating pain signals often become uninhibited, causing pain with little or no provocation. Using viruses, one could insert genes into cells that express the particular proteins that inhibit these signals. In the second strategy, one could develop stem cells into new nerve cells to replace those that are malfunctioning. In the third, researchers have focused on blocking the so-called Nav1.7 voltage-gated sodium channel, a mechanism for enabling neurons to fire that is concentrated in pain-reception cells. Ideally, a drug could block this channel without interfering with other channels needed for touch perception. Finally, there's research using novel antibodies. These proteins are too large to pass the blood-brain barrier, so they won't affect brain functioning, but they can target the peptides that cause blood vessels to dilate in migraines or the nerve growth factors that contribute to the inflammation in osteoarthritis.
And there may be a fifth approach, Basbaum says—hypnosis. He cites cases of people correctly reporting an intense sensation after hypnosis without feeling the attendant pain. It's just more evidence of the odd ways pain is created in the brain.
Keeping in Touch
Precisely replacing lost sensations is at least as tricky as selectively dulling unwanted ones. Approximately two million Americans have lost a limb—most often because of diabetes, arterial disease, or trauma—and researchers are developing ways to replace the limbs with robotic appendages that can both be controlled and provide tactile feedback. Prosthetic limbs can be outfitted with mechanical sensors that detect pressure—on, say, a fingertip—or joint position, and there are several ways to communicate this information back to the wearer. Electrodes or vibrating motors can be placed on the skin near the prosthesis. Electrodes can wrap around or sit inside the remaining nerve in the limb, or be placed on the somatosensory cortex of the brain.
In his 2015 State of the Union Address, President Obama praised DARPA's efforts in "creating revolutionary prosthetics, so that a veteran who gave his arms for his country can play catch with his kids again." One DARPA project, Hand Proprioception and Touch Interfaces (HAPTIX), has funded research at Case Western Reserve University, where scientists are working with a man who lost his right hand in an industrial accident. Force sensors on three fingertips of his robotic prosthetic hand send signals to 20 electrodes cuffing three nerves in his forearm. With the feedback turned on, his fine motor skills are strong enough to pick up cherries without crushing them.
Another DARPA program, Revolutionizing Prosthetics, is funding a team at the University of Pittsburgh that has implanted two electrode arrays in the motor cortex of a quadriplegic so that he can control a stand-alone robot arm with his mind. Two more arrays in his somatosensory cortex enable him to feel what the arm feels. He can consistently report which finger is being touched and says that it feels as though the sensation is coming from his own hand.
Sometimes sensation makes its way to the brain but doesn't alter behavior because the brain's wiring fails, as in stroke or localized brain damage. Neuroscientists Marc Schieber and Kevin Mazurek, both at the University of Rochester, have demonstrated a method that might bypass these downed lines. They've trained two monkeys to perform four instructed actions, such as turning a knob or pressing a button. But that instruction takes the form of an electrical signal sent to electrodes in the monkeys' premotor cortex, an area between the sensory cortices and the motor cortex, which controls muscle movement. Even without any sensory instruction, the monkeys were nearly 100 percent accurate at interpreting the signal and performing the correct action.
Whether the monkeys had any kind of conscious sensory perception—or whether a human would with similar input—is an open question. Schieber points to the phenomenon of blindsight, in which people with certain types of brain damage say they're blind and can't describe anything in their visual field, yet can, for example, navigate obstacles as they walk. They act on input they can't consciously perceive. Mazurek notes that one could inject signals into the premotor cortex from a camera, a microphone, or another part of the brain whose communication has been cut off.
Of course, touch does more than alert us to bodily harm and guide our movements. It also helps us build our social world. We experience affection, aggression, correction, and cooperation through physical contact with others. When that primal interface goes awry, we may misapprehend higher notions of trust, authority, and propriety. Recent research suggests tactile deficits may not be just a peripheral feature of autistic spectrum disorders but a foundational cause of their social and cognitive deficits.
"Taste is probably the most unusual of all the sensory systems we have, in that it is the one that is most resistant to change or damage," says Paul Breslin, a psychologist at Rutgers University and the Monell Chemical Senses Center. Unlike cells in the retina, cochlea, or spinal cord, both taste and olfactory cells can regenerate. Smell tends to fade as we age. But even when radiation therapy for mouth or throat cancer completely ablates the sense of taste, most of it returns within a year.
That's a good thing, because scientists have no idea how to replace taste with a prosthesis. There's no gustatory map in the brain as there is for vision or somatosensation. "There's no spot that you can find in the brain that's the spot where sweet comes from," Breslin says. There's no map in the mouth either, despite what you may have learned in school about different tastes on different parts of the tongue. If we understood why taste buds regenerate so easily, we might be able to use that knowledge to regrow heart or brain tissue, or missing limbs. For now, Breslin can give only a teleological explanation for their renewal: "In the absence of taste, people really don't want to eat, and if you don't eat, you die."
Taste does more than just encourage us to eat food and discourage us from consuming rotten things. Subconscious aspects of taste prime digestive and metabolic processes. Sensors all along the digestive tract, from the mouth to the intestines, signal the body, alerting it to what's about to be absorbed. "When you eat a meal, you have to deal with this onslaught of nutrients," Breslin says. Merely rinsing one's mouth with sugar water can release insulin. And Ivan Pavlov found that dogs wouldn't digest meat placed directly into their stomachs unless he also dusted their tongues with meat powder. Expectations affect endocrine secretions, proteins, and muscle contractions.
Scientists are also finding that taste plays a role in the immune system. Recently researchers discovered that a certain bitter taste receptor, T2R38, occurs in the upper respiratory tract and can detect the secretions of some bacteria. In response, it prompts the body to produce mucus and clear out those bacteria. "We would have said just a few years ago that these are completely disconnected areas that don't communicate with each other," Breslin says of the role of taste and smell in metabolism and immunity. "In fact, they seem to be intimately connected, to such a degree that they might even be part of one network. I regard that as revolutionary."
Understanding these connections could inform personal health decisions and even public policy. More than a third of American adults are obese. Some experts blame the epidemic on an "evolutionary mismatch" between our ancient instincts and our modern environment: We evolved to consume calories where and when we could get them, leaving us vulnerable to the overabundance of nutrients at modern supermarkets and restaurants. That can't be the whole story behind the epidemic, though: Affluent nations differ in obesity levels, and poverty leads to increases, not decreases, in obesity. Some scientists also speak about the role of food insecurity, which leads us to indulge more when we feel anxious about our supply of resources. So one should expect more obesity wherever one finds more poverty and economic inequality, as long as people aren't starving. Researchers call this the "insurance hypothesis."
Olivia Petit, a marketing researcher at INSEEC Business School, in Bordeaux, France, thinks the proper response is not to seek self-control by ignoring all those sensory enticements around us. "Actually, if you integrate your sensation information in your self-regulation process, that would be healthier," she says, advocating what she calls "embodied self-regulation." One strategy is to savor the food you eat, whether or not it's healthy. Researchers found that when people held orangeade in their mouths twice as long as normal, they drank a third less. What happens in the mouth doesn't just stay in the mouth; as Breslin noted, it signals to the rest of the body what's coming down the pipe, altering hunger levels.
Imagination can help, too. People eat fewer M&M's if they first picture themselves eating a lot instead of a few, which partially sates their craving. Also, people typically eat more food reflexively when they are served large portions, but Petit has found that this bias disappears when people are first asked to imagine in detail the sensory experience of eating the food.
Food scientists now hope to engineer better eating experiences. Petit and colleagues, for example, published a paper in January on "multisensory technology for flavor augmentation," noting that the color or crunchiness of a food affects its taste. Senses, as we now well know, rarely act alone.
Losing by a Nose
Smell is our oldest sense. One of our earliest functions as simple organisms was to detect helpful or harmful molecules in our environment and then seek or avoid them. The brain's olfactory bulb still sits alongside regions processing emotion. As a result—although scientists aren't sure of the exact mechanism—dysfunctions of smell are closely associated with mood disorders. People who lose their sense of smell become more depressed, and people with depression have a worse sense of smell.
It should come as no surprise that anosmia (total loss of smell) and hyposmia (a diminished sense of smell) have been shown to increase depression. We use smell to detect hazards such as smoke or gas, and without that ability, people can become anxious about their safety. Loss of smell also reduces the pleasures of eating and socializing over food. It reduces sex drive. And it raises levels of concern about bodily hygiene. People with anosmia might shower several times a day, overload on fragrances, or avoid going out altogether.
The most common causes of smell loss have traditionally been disease, infection, and trauma, but olfaction faces a new threat—air pollution. Several studies have found that residents of smoggier cities have worse senses of smell than people living elsewhere. Compared with residents of Mexico City, people in the cleaner city of Tlaxcala required weaker aromas to detect coffee and orange drink and better distinguished between atole and horchata, two beverages that smell similar. Another study in Mexico City found accumulations of cells in the nasal passages indicating damaged DNA. Still others found damage all the way up to the brain tissue, with inflammation, plaque, and even particles of pollution embedded in the olfactory bulb.
Some types of smell loss respond to antihistamines or topical steroids. Researchers are also exploring gene therapies and stem-cell treatments to enhance the system's partial ability to repair itself. It turns out that we can also help people by using what are basically magic markers. In a recent study, researchers asked people who had lost some smell due to upper respiratory tract infections to use "Sniffin' Sticks"—felt-tip pens scented with weak or strong fragrances of rose, eucalyptus, lemon, and cloves—twice a day. In each session, they sniffed every pen, then repeated the process. After four months, 15 percent of subjects who had used low-concentration pens improved, and 25 percent of those who sniffed the high-concentration pens improved. Markers can't save us all, though: City planners need to grasp that pollution doesn't just cloud residents' views but permanently dulls their senses.
Science may help us replace or substitute for lost senses, but rarely can our engineering match that of the brain. We're better off nourishing the senses we have while we have them. "The general story about sensory perception—which is so interesting—is that it's all a construction in your brain," Eagleman says. "All it ever has are electrical spikes and chemical releases happening in the dark, and it constructs your whole sensory world."