Neuroscience
BCI and AI Allow the Mind to Control Robot Arm
Stroke survivor uses his mind to control robotics with a BCI and AI machine learning.
Posted March 15, 2025 | Reviewed by Kaja Perina
Scientists at the University of California, San Francisco (UCSF) have published a new study in Cell demonstrating a brain-computer interface (BCI) powered by artificial intelligence (AI) machine learning that enabled a paralyzed man to control a robotic arm with his thoughts for seven months without requiring recalibration. According to a UCSF release, this represents a significant advance for brain-computer interfaces, as prior BCIs have required adjustment after only a day or two.
Why Brain-Computer Interfaces?
Brain-computer interfaces have the potential to be life-changing assistive technology with multiple purposes, including helping people with traumatic brain injuries, spinal cord injuries, paralysis, stroke, and neurological disorders.
BCIs may help those who have lost the ability to move or speak to control external devices with their thoughts. Other potential uses for BCIs include gaming, home device management, communications, and more. The trends driving BCI market growth include an aging population, increasing development of assistive technology for people with paralysis, and other factors. According to Grand View Research estimates, the BCI market is expected to grow at a compound annual growth rate of 18 percent, from USD 2.83 billion in 2025 to USD 6.52 billion by 2030, more than doubling in size.
How Brain-Computer Interfaces Work
BCI solutions consist of hardware and software components. The BCI hardware falls into three categories: 1) invasive, requiring neurosurgery or surgical robotics to implant; 2) partially invasive, such as a stent placed through endovascular surgery; and 3) noninvasive, such as functional magnetic resonance imaging (fMRI) or electroencephalography (EEG).
The electrodes of the BCI hardware capture recordings of neural activity. These brain recordings consist of complex, high-dimensional data. The software component of a BCI decodes the brain activity in order to predict the user’s intended action. Detecting patterns in massive amounts of complex, noisy data is a nearly impossible task for humans, but a good fit for artificial intelligence machine learning.
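As a minimal sketch of that decoding step, the snippet below trains a generic classifier to map multichannel neural features to an intended action. It is illustrative only: the channel count, action labels, synthetic data, and choice of model are assumptions, not the decoder used in the UCSF study.

```python
# Minimal sketch of the decoding idea described above: a classifier maps
# multichannel neural features to an intended action. This is NOT the UCSF
# decoder; the feature counts, labels, and model choice are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

n_trials, n_channels = 200, 128          # e.g., one feature per recording channel
actions = ["reach_left", "reach_right", "grasp", "release"]

# Synthetic stand-in for recorded neural features (real data would be
# band-power or high-gamma features extracted from the raw signals).
X = rng.normal(size=(n_trials, n_channels))
y = rng.choice(actions, size=n_trials)

decoder = LogisticRegression(max_iter=1000).fit(X, y)

# At run time, a new window of neural activity is mapped to a predicted intent.
new_window = rng.normal(size=(1, n_channels))
print(decoder.predict(new_window))       # e.g., ['grasp']
```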
A BCI Breakthrough
What sets this UCSF study apart is the increase in long-term stability from a few days to seven months, the ability to adjust accurately to daily shifts in brain activity through AI adaptive learning, and the demonstration that a paralyzed man could control a robotic arm simply by imagining the movement.
“Studies in animals have indicated that neural representations can experience drift—changes in the correlation between activity and behavior over time,” wrote corresponding author and neurologist Karunesh Ganguly, MD, PhD, along with co-authors Nikhilesh Natraj, Sarah Seko, Reza Abiri, Runfeng Miao, Hongyi Yan, Yasmin Graham, Adelyn Tu-Chan, and Edward Chang.
According to UCSF, the researchers were inspired by observations of brain plasticity in animals: the patterns of neural activity for particular motions change from day to day as learning takes place.
“Our nervous system needs to balance maintaining a stable neural representation of a large repertoire of well-rehearsed actions while also facilitating new learning,” the researchers wrote. “We use the term ‘representation’ to refer to the distribution of activity patterns during repeated performance of an action.”
The scientists hypothesized that BCI performance could be enhanced by applying AI machine learning to account for day-to-day neural activity changes due to learning.
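As a rough illustration of that hypothesis (and not the study’s actual algorithm), a decoder can be nudged with a small batch of each day’s data rather than being rebuilt from scratch. The model, feature sizes, labels, and session structure below are hypothetical.

```python
# Hypothetical sketch of incremental decoder updating across sessions.
# The model, feature sizes, and update schedule are illustrative, not the
# adaptive method used in the UCSF study.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(1)
n_channels = 128
classes = np.array(["hand", "finger", "thumb"])

decoder = SGDClassifier()

for day in range(7):
    # A small batch of that day's neural features and imagined-movement labels.
    X_day = rng.normal(loc=0.05 * day, size=(50, n_channels))  # slight daily drift
    y_day = rng.choice(classes, size=50)

    # partial_fit nudges the existing decoder toward today's statistics
    # instead of discarding what it learned on previous days.
    decoder.partial_fit(X_day, y_day, classes=classes)
```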
To test this hypothesis, the UCSF researchers recorded electrocorticography (ECoG) activity from the left sensorimotor cortex of a study participant. The participant was a 41-year-old man whose stroke years earlier had left him wheelchair-bound with complete lower-limb paralysis, little to no capability for upper-limb movement, severe muscle weakness of both legs and arms (tetraparesis), and loss of control of the muscles required to speak (anarthria).
An invasive BCI was implanted via neurosurgery: a 128-channel electrocorticography (ECoG) array from PMT Corporation was placed over the left sensorimotor cortex to record neural activity and was connected to a BCI system from Blackrock Neurotech. AI machine learning was then used to decode the complex neural activity.
“We use intracortical brain-computer interfaces (BCIs) based on mesoscale electrocorticography (ECoG) to understand principles of representational stability and plasticity,” the researchers wrote.
The sensors on the subject’s brain captured the neural activity during the mental exercise of imagining the movement of various parts of his body. The researchers found that the overall shape of the brain’s representations of these movements did not change, but their locations shifted slightly from day to day.
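One simple way to picture “same overall shape, slightly shifted location” is to re-center each day’s activity on a common reference point before it reaches the decoder. The sketch below illustrates that general idea with made-up numbers; it is not the alignment approach used in the study.

```python
# Illustrative only: if each day's activity patterns keep their shape but
# shift slightly, subtracting the daily mean and adding a reference mean
# removes the shift before the data reach the decoder.
import numpy as np

rng = np.random.default_rng(2)
n_samples, n_channels = 100, 128

reference_mean = rng.normal(size=n_channels)          # e.g., from calibration days

def recenter(day_activity: np.ndarray) -> np.ndarray:
    """Shift a day's neural features so their mean matches the reference."""
    return day_activity - day_activity.mean(axis=0) + reference_mean

# A new day's recordings, offset from the reference by a small drift.
day_activity = rng.normal(size=(n_samples, n_channels)) + 0.3
aligned = recenter(day_activity)
print(np.allclose(aligned.mean(axis=0), reference_mean))  # True
```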
To train the AI model, brain activity data was captured over two weeks while the participant imagined moving his hands, fingers, or thumbs. To build his ability to control a robotic arm with his thoughts, he first practiced on a virtual robot arm that provided feedback on performance accuracy. Once he had learned to control the virtual arm with his thoughts, he began practicing on an actual robot arm.
Practicing with the virtual robot arm accelerated the subject’s ability to use the actual robot arm to perform tasks such as reaching, grasping, transporting, and manipulating objects. The researchers plan to continue improving the AI model’s speed and motion fluidity, as well as to evaluate the BCI’s performance in a home setting, outside of the lab environment.
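As a loose sketch of the closed-loop practice described above (not the study’s software), a decoder can drive a simulated arm that reports how far each movement lands from its target, the same kind of accuracy feedback the virtual robot arm provided. The encoding model, decoder, and target here are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(3)
n_channels = 128

# Toy "encoding": each channel has a preferred 2-D direction, and imagining a
# movement modulates the channels along the intended direction. The decoder is
# simply the pseudoinverse of that encoding. Everything here is hypothetical.
encoding = rng.normal(size=(2, n_channels))
decoder = np.linalg.pinv(encoding)                 # shape (n_channels, 2)

position = np.zeros(2)                             # simulated arm endpoint
target = np.array([1.0, 0.5])                      # where the user wants to reach

for step in range(10):
    intent = target - position                     # imagined direction of movement
    neural_features = intent @ encoding + 0.1 * rng.normal(size=n_channels)
    velocity = 0.3 * (neural_features @ decoder)   # decoded movement command
    position = position + velocity
    error = np.linalg.norm(target - position)      # accuracy feedback, like the virtual arm gave
    print(f"step {step}: distance to target = {error:.3f}")
```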
AI machine learning is accelerating brain-computer interface performance, offering a glimmer of hope that life-changing assistive technology for people with neurological disorders, brain and spinal cord injuries, and other disabilities lies in the not-so-distant future.
Copyright © 2025 Cami Rosso All rights reserved.