
Can AI Predict Behavior from Brain Activity?

AI tool may accelerate future neuroscience and behavioral science research.

Source: Geralt/Pixabay

A new neuroscience study, funded by Wellcome and the European Research Council, demonstrates how a deep learning algorithm can predict behavior by decoding brain activity.

“The neural code provides a complex, non-linear representation of stimuli, behaviors, and cognitive states,” wrote scientists affiliated with the Kavli Institute for Systems Neuroscience, the Max Planck Institute for Human Cognitive and Brain Sciences, UCL, and other institutions in eLife. “Reading this code is one of the primary goals of neuroscience – promising to provide insights into the computations performed by neural circuits.”

Decoding brain data from imaging and neural recordings is a complex, time-consuming undertaking, one the study’s scientists characterize as “a non-trivial problem, requiring strong prior knowledge about the variables encoded and, crucially, the form in which they are represented.”

In an effort to decipher the neural code, the researchers created a convolutional neural network (CNN) that predicts behaviors or other co-recorded stimuli from minimally processed, wide-band neural data. The model first transforms the raw input into frequency space through a wavelet transformation.
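
To make that pipeline concrete, here is a minimal sketch in Python: a continuous wavelet transform turns a raw neural trace into a time-frequency image, and a small CNN regresses behavioral variables from it. The layer sizes, wavelet choice, and toy data are illustrative assumptions, not the authors’ published architecture.

```python
# Minimal sketch of the decoding idea described above: a raw neural trace is
# converted to a time-frequency representation with a wavelet transform, and
# a small CNN regresses behavioral variables from it. All shapes and
# hyperparameters here are assumptions for illustration.
import numpy as np
import pywt
import torch
import torch.nn as nn

def wavelet_features(trace, scales=np.arange(1, 33), wavelet="morl"):
    """Continuous wavelet transform of one raw 1-D neural trace.
    Returns an array of shape (n_scales, n_samples)."""
    coeffs, _freqs = pywt.cwt(trace, scales, wavelet)
    return np.abs(coeffs).astype(np.float32)  # power-like magnitude

class BehaviorDecoder(nn.Module):
    """Tiny CNN mapping a (1, n_scales, n_samples) spectrogram-like input
    to three behavioral outputs (e.g., location, speed, head direction)."""
    def __init__(self, n_outputs=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> (32, 1, 1)
            nn.Flatten(),
            nn.Linear(32, n_outputs),
        )

    def forward(self, x):
        return self.net(x)

# Toy usage: a 1-second trace at 1 kHz stands in for a raw recording.
trace = np.random.randn(1000)
features = wavelet_features(trace)              # (32, 1000)
batch = torch.from_numpy(features)[None, None]  # (1, 1, 32, 1000)
model = BehaviorDecoder()
prediction = model(batch)                       # (1, 3)
print(prediction.shape)
```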

They first trained the deep neural network with raw electrophysiological recordings from neurons in the CA1 pyramidal cell layer of the rodent hippocampus. The hippocampus is part of the brain’s limbic system and is associated with memory, learning, and emotion.
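
A rough sketch of what such a training step could look like follows, with synthetic data standing in for the real recordings; the stand-in model, loss, optimizer, and batch shapes are all assumptions rather than the authors’ setup.

```python
# Hedged sketch of supervised training for such a decoder: wavelet features
# in, simultaneously recorded behavior out, fit by mean-squared-error
# regression. The stand-in model and synthetic data are assumptions.
import torch
import torch.nn as nn

# Stand-in for the decoder sketched earlier: spectrogram-like input,
# three behavioral outputs (e.g., location, speed, head direction).
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 3),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Synthetic batch: 8 wavelet "images" paired with 3 behavioral targets each.
inputs = torch.randn(8, 1, 32, 1000)
targets = torch.randn(8, 3)

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```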

After training, the model was able to predict the rodent’s location, speed, and head direction. Next, the scientists tested the network on two-photon calcium imaging data from the auditory cortex, also recorded while rodents explored a virtual environment. In a final step, the team tested the algorithm on electrocorticography (ECoG) recordings from human participants as they moved their fingers.

“We show successful decoding of finger movement, auditory stimuli, and spatial behaviors – including a novel representation of head direction – from raw neural activity,” the researchers reported.

Additionally, according to the scientists, the model should perform well with the types of brain data used in most neuroscience research, such as functional magnetic resonance imaging (fMRI), magnetoencephalography (MEG), and electroencephalography (EEG).

“In sum, we believe deep-learning based frameworks such as this constitute a valuable tool for experimental neuroscientists, being able to provide a general overview as to whether a variable is encoded in time-series data and also providing detailed information about the nature of that encoding – when, where, and in what frequency bands it is present,” concluded the scientists.

Copyright © 2021 Cami Rosso All rights reserved.
