When I was researching Brain Sense, I visited Mandayam Srinivasan, Director of MIT's Laboratory for Human and Machine Haptics, better known as the Touch Lab. Haptics, he explained to me, is a broad term that encompasses both the sense of touch and the manipulation of objects by humans, by machines, or by a combination of the two in real or virtual environments.
Srinivasan said that one of the major goals of haptic research is to develop devices that will assist people who have lost the use of a limb to accident or disease. In 2008, researchers at the University of Pittsburgh made progress toward that goal when two monkeys in their laboratory successfully used the power of their brains alone to control robotic arms and hands. Unlike similar, earlier experiments in which brainpower controlled the movement of a cursor on a screen, this robotic action was three-dimensional and personal. The monkeys picked up food, fed themselves, even licked their robotic fingers.
To achieve this level of brain-machine communication, the researchers first analyzed the collective firings of neurons in the animals' brains as the monkeys moved their natural arms. Then the investigators wrote software that translated the patterns of neuronal firing into signals that controlled the movements of the robotic limb. Although the monkeys received no tactile feedback per se, they were able to control their robotic arms in a natural way, and the feedback they did get (successful food retrieval and feeding) was sufficient to sustain the action. So precise was the control that the monkeys could adjust the grip of the robotic hand to accommodate varying sizes and thicknesses of food pieces. To the researchers' surprise, the monkeys learned that marshmallows and grapes would stick to their hands, so a firm grip wasn't necessary when those foods were on offer.
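For readers curious what "translating patterns of neuronal firing into control signals" can look like in practice, here is a toy sketch in the spirit of population-vector decoding, one classic approach: each neuron is assumed to have a preferred movement direction, and decoded velocity is the firing-rate-weighted sum of those directions. Every number and name below is invented for illustration; this is not the Pittsburgh team's actual software.

```python
import numpy as np

# Toy linear decoder: each neuron "votes" for its preferred direction,
# weighted by how far its firing rate deviates from baseline.
# All values are made up for illustration.

rng = np.random.default_rng(0)
n_neurons = 50

# Each neuron's preferred movement direction (unit vectors in 3-D).
preferred = rng.normal(size=(n_neurons, 3))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

def decode_velocity(firing_rates, baseline):
    """Map a vector of firing rates (spikes/s) to a 3-D velocity command."""
    modulation = firing_rates - baseline           # deviation from rest
    velocity = modulation @ preferred / n_neurons  # weighted vector sum
    return velocity

# Simulate neurons cosine-tuned to a true movement direction, then decode.
true_dir = np.array([1.0, 0.0, 0.0])
baseline = np.full(n_neurons, 20.0)
rates = baseline + 10.0 * (preferred @ true_dir)

print(decode_velocity(rates, baseline))  # points roughly along +x
```

With enough neurons, the random preferred directions average out and the decoded vector lines up with the intended movement, which is why recording from populations, rather than single cells, makes this kind of control feasible.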
Such achievements in haptic research are laudable, but the field still has a long way to go. This week, researchers at the University of Michigan moved one step closer to a brain-computer interface that will allow a paralyzed person to "think" a motion and move a disabled limb or prosthesis. Their invention, the BioBolt, uses the body's skin like a conductor to wirelessly transmit the brain's neural signals to a computer; eventually, those signals might be used to reactivate a paralyzed limb.
The BioBolt looks like a bolt and is about the circumference of a dime, with a thumbnail-sized film of microcircuits attached to the bottom. The BioBolt is implanted in the skull beneath the skin, and the film of microcircuits sits on the brain. The microcircuits act as microphones, "listening" to the overall pattern of firing neurons and associating those patterns with specific commands from the brain. The signals are amplified and filtered, then converted to digital form and transmitted through the skin to a computer, explains one of BioBolt's inventors, Euisik Yoon. BioBolt keeps power consumption low by using the skin as a conductor, or signal pathway.
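The signal chain Yoon describes (amplify, filter, digitize) can be sketched in a few lines of code. This is purely illustrative, not BioBolt firmware; the gain, filter constant, and ADC resolution are invented numbers chosen to make the stages visible.

```python
import numpy as np

# Illustrative sketch of an amplify -> filter -> digitize chain for a
# weak neural signal. All parameters are assumptions for the example.

fs = 1000.0                       # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)

# Fake "neural" signal: tens of microvolts plus broadband noise.
rng = np.random.default_rng(1)
raw_uV = 50e-6 * np.sin(2 * np.pi * 30 * t) + 20e-6 * rng.normal(size=t.size)

def amplify(x, gain=1000.0):
    """Boost the microvolt-level signal into a workable voltage range."""
    return x * gain

def lowpass(x, alpha=0.2):
    """One-pole low-pass filter to knock down high-frequency noise."""
    y = np.empty_like(x)
    acc = 0.0
    for i, v in enumerate(x):
        acc += alpha * (v - acc)
        y[i] = acc
    return y

def digitize(x, bits=10, v_range=0.1):
    """Quantize to signed integer codes, like a small on-chip ADC."""
    levels = 2 ** (bits - 1) - 1
    return np.clip(np.round(x / v_range * levels), -levels, levels).astype(int)

codes = digitize(lowpass(amplify(raw_uV)))
print(codes[:10])
```

Doing the amplification and digitization right at the implant, and then shipping only low-power digital codes through the skin, is what keeps a device like this frugal enough to run without a wired connection.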
BioBolt's developers hope that, eventually, they can find a way to transmit the signals through the skin to a device worn on the body, such as a watch or a pair of earrings, that would collect the signals and eliminate the need for an off-site computer to process them. But BioBolt has already cleared one hurdle that has tripped up researchers in the past. Most such devices previously required that the skull remain open while the neural implants were in the head, which made using them in a patient's daily life unrealistic. BioBolt does not penetrate the cortex (the brain's thin outer layer) and is completely covered by skin, so it greatly reduces the risk of infection.
The UM researchers believe that their BioBolt is a critical step toward a practical brain-computer interface that will allow a paralyzed person to move just by thinking about moving. The ultimate goal of brain-body interfaces is to transmit neural signals from the brain directly to the muscles of paralyzed limbs (circumventing the damaged nerves of the spinal cord). That technology is years away, but the BioBolt brings us one step closer.
For more information
Brain Sense, "Chapter 5: Nematodes, Haptics, and Brain-Machine Interfaces"
M. Velliste, S. Perel, M. C. Spalding, A. S. Whitford, and A. B. Schwartz, "Cortical Control of a Prosthetic Arm for Self-Feeding," Nature 453 (June 19, 2008): 1098-1101.
BioBolt Photo courtesy of Euisik Yoon of the University of Michigan
Monkey using robotic arm for feeding from "Brain-Machine Interfaces: Sci-Fi Concepts Make Clinical Inroads" by Brenda Patoine (for the Dana Foundation).