Scientists Bridge Neuroscience With Machine Learning
Researchers discover brain-inspired algorithms for faster AI learning.
Posted May 15, 2020
Deep learning in artificial intelligence (AI) is loosely inspired by the biological brain. Recently, researchers from Bar-Ilan University demonstrated that increasing the training frequency speeds up the neuronal adaptation process, effectively connecting experimental neuroscience with advanced AI learning algorithms.
The Bar-Ilan University study was led by Professor Ido Kanter in collaboration with Amir Goldental, Shiri Hodassman, Yael Tugendhaft, Yuval Meir, Roni Vardi, and Shira Sardi, and was published in Scientific Reports on April 23, 2020. The team carried out a novel type of experiment using in vitro neuronal cultures. Their goal was to test two hypotheses: that, contrary to commonly held views, the biological brain’s learning speed is not extremely slow, and that the brain may contain learning accelerators. To the scientists’ surprise, the experiments confirmed both hypotheses.
To run the tests, the researchers placed neuronal cultures treated with synaptic blockers on a multi-electrode array, then stimulated the neurons via their dendrites. To quantify the effect of neuronal adaptation, they compared the amplitudes of the intracellular responses to extracellular stimulation before and after the training process, examining dendritic adaptation at various stimulation frequencies.
The researchers measured the enhanced responses about a minute after training, then observed and compared dendritic adaptation at 1 Hz and 5 Hz. Adaptation was much faster with 5 Hz training than with 1 Hz training, and its effect became more pronounced over time. This suggests that adaptation may occur in under a minute.
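The article does not give the paper's quantitative adaptation model, but the intuition behind frequency-dependent speedup can be sketched with a toy calculation: if each stimulation pulse nudges the response amplitude by a fixed relative gain, then within the same wall-clock window a 5 Hz protocol delivers five times as many pulses as a 1 Hz protocol, and therefore adapts faster. The function name and all numbers below are illustrative assumptions, not values from the study.

```python
def adapted_amplitude(freq_hz, duration_s, gain_per_pulse=0.02):
    """Toy adaptation model (illustrative only): each stimulation
    pulse multiplies the response amplitude by (1 + gain), so a
    higher stimulation frequency packs more pulses into the same
    wall-clock window and adapts faster."""
    n_pulses = int(freq_hz * duration_s)
    return (1.0 + gain_per_pulse) ** n_pulses

# Relative response amplitude after one minute of training:
print(adapted_amplitude(1, 60))  # 1 Hz protocol
print(adapted_amplitude(5, 60))  # 5 Hz protocol, far larger
```

Under this crude model the 5 Hz condition reaches any given amplitude threshold in one-fifth the time, mirroring the qualitative finding that higher-frequency training accelerates adaptation.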
The team then turned their attention to the possibility of high-velocity adaptation, running a second experiment in which the neuron was stimulated with two different amplitudes. They found that the adaptation process was much faster when the training frequency was increased. The researchers then applied their brain-inspired findings to artificial neural networks.
The researchers tested an artificial neural network on a dataset of handwritten digits (MNIST) to understand the potential impact of time-dependent adaptation steps, comparing the biologically inspired accelerated learning with a conventional learning method on the MNIST database.
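The article does not describe the paper's training algorithm, so the following is only a hedged sketch of the general idea: comparing a constant learning rate against a time-dependent step size (large early updates that decay, loosely mimicking fast early adaptation) on a small training set. A tiny synthetic two-class problem stands in for MNIST; the schedules, data, and all constants are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for MNIST: two Gaussian "digit" classes in 64 dimensions,
# with a deliberately small training set, as in the article's setting.
n, d = 40, 64
X = np.vstack([rng.normal(-0.5, 1.0, (n, d)), rng.normal(0.5, 1.0, (n, d))])
y = np.array([0] * n + [1] * n)

def train(X, y, lr_schedule, epochs=30):
    """Logistic regression via SGD with a per-step learning rate."""
    w, b, step = np.zeros(X.shape[1]), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            step += 1
            z = np.clip(X[i] @ w + b, -30.0, 30.0)  # avoid exp overflow
            p = 1.0 / (1.0 + np.exp(-z))
            g = p - y[i]  # gradient of the log loss w.r.t. the logit
            lr = lr_schedule(step)
            w -= lr * g * X[i]
            b -= lr * g
    return w, b

def accuracy(w, b, X, y):
    return np.mean(((X @ w + b) > 0).astype(int) == y)

# Baseline: constant learning rate.
w1, b1 = train(X, y, lambda t: 0.01)
# Hypothetical "accelerated" variant: large early steps that decay over
# time, loosely analogous to fast early neuronal adaptation.
w2, b2 = train(X, y, lambda t: 0.5 / (1.0 + 0.1 * t))

print(accuracy(w1, b1, X, y), accuracy(w2, b2, X, y))
```

This is only meant to make the comparison concrete; the paper's actual brain-inspired mechanism and its MNIST results are not reproduced here.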
The results showed that the brain-inspired fast-learning method performed much better than existing machine learning strategies for small sets of training examples. These findings may help advance applied artificial intelligence wherever high-velocity decisions must be made from only a limited number of training examples. The team successfully bridged experimental neuroscience with machine learning, opening the possibility of further advances in areas such as network optimization, robotics, and decision-making.
Copyright ©2020 Cami Rosso. All rights reserved.