New Twist on AI Evolutionary Algorithms in Neuroscience
IBM’s biologically motivated machine learning accelerates brain research.
Posted May 24, 2019
At the intersection of neuroscience and artificial intelligence (AI) is an alternative approach to deep learning. Evolutionary algorithms (EAs) are a subset of evolutionary computation: algorithms that mimic biological evolution to solve complex problems. In a study published this week in Cell Reports, IBM researchers took an innovative approach, using evolutionary algorithms to create a state-of-the-art cloud-based neuroscience model for studying neurodegenerative disorders.
The origins of artificial intelligence go back to the 1950s. The recent global resurgence of AI from its hibernation is largely due to advances in machine learning pattern recognition, namely deep learning. Deep learning models, modeled loosely on the biological brain with layers of neural networks, tend to be single-purpose point solutions that require extensive training on massive data sets and are customized for a specific environment. In contrast, evolutionary algorithms solve a problem based on criteria set in a “fitness” function, and thus require little to no data.
The different classes of evolutionary algorithms include genetic algorithms, evolution strategies, differential evolution, and estimation of distribution algorithms. What these classes have in common is the process of evolution: generating, usually randomly, populations of search points (also called agents, chromosomes, candidate solutions, or individuals), then putting those populations through “variation” and “selection” operations over multiple generations. Variation is analogous to the biological processes of mutation and recombination.
The “fitness” of each search point is calculated after each iteration: the “strongest” search points (those with higher objective values) are kept, while the “weakest” (those with lower objective values) are removed from the population. The population of search points thus “evolves” over generations toward an optimal solution to the problem. The “fittest” variations survive.
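The generate–vary–select loop described above can be sketched in a few lines of Python. This is a minimal illustration, not the study’s code: the toy fitness function, population size, and mutation scale are all invented for the example.

```python
import random

def fitness(x):
    # Toy objective: negative squared distance from 3.0,
    # so "stronger" search points lie closer to 3.0.
    return -(x - 3.0) ** 2

def evolve(pop_size=20, generations=50, mutation_scale=0.5, seed=42):
    rng = random.Random(seed)
    # Generate an initial population of random search points.
    population = [rng.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        # Variation: each parent produces a mutated offspring.
        offspring = [p + rng.gauss(0.0, mutation_scale) for p in population]
        # Selection: keep only the fittest half of parents plus offspring.
        survivors = sorted(population + offspring, key=fitness, reverse=True)
        population = survivors[:pop_size]
    return max(population, key=fitness)

best = evolve()  # best ends up close to 3.0
```

Because selection keeps the strongest half of the combined parent and offspring pool, the best solution found so far is never lost, and the population drifts toward the optimum over generations.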
Evolutionary algorithms are distributed in nature, making them well matched to cloud-based or massively parallel multi-core processing. In this neuroscience study focused on Huntington’s disease, the researchers used a state-of-the-art non-dominated sorting differential evolution (NSDE) algorithm hosted on the IBM Cloud.
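The differential-evolution core of NSDE can be sketched as follows. This is the classic single-objective DE/rand/1/bin variant, not the multi-objective non-dominated sorting version used in the study; the objective, bounds, and control parameters are chosen just for illustration.

```python
import random

def differential_evolution(objective, bounds, pop_size=15, F=0.8, CR=0.9,
                           generations=100, seed=1):
    """Classic DE/rand/1/bin, minimizing a single objective."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [objective(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals other than i.
            a, b, c = rng.sample([x for j, x in enumerate(pop) if j != i], 3)
            mutant = [ai + F * (bi - ci) for ai, bi, ci in zip(a, b, c)]
            # Crossover: mix mutant and current individual per dimension,
            # guaranteeing at least one mutant component survives.
            j_rand = rng.randrange(dim)
            trial = [mutant[j] if (rng.random() < CR or j == j_rand)
                     else pop[i][j] for j in range(dim)]
            # Selection: the trial replaces its parent only if it is better.
            s = objective(trial)
            if s < scores[i]:
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

# Example: minimize the 2-D sphere function; the minimum is at the origin.
x, fx = differential_evolution(lambda v: sum(t * t for t in v),
                               bounds=[(-5, 5), (-5, 5)])
```

In the multi-objective setting of the study, the greedy one-to-one selection step is replaced by non-dominated sorting across all objectives.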
“We introduced a ‘soft thresholding’ of the error function coupled with a neighborhood penalty to prevent systematic bias due to targeting exact feature values,” said James R. Kozloski, neuroscientist and IBM Master Inventor, who worked on the research study.
“We used this modified error of the non-dominated sorting differential evolution (NSDE) framework and imposed a penalty based on a measure of ‘crowdedness’ in feature space of previously selected zero-error models, thus biasing the algorithm to evenly cover feature space in the zero-error region,” Kozloski explained. “This allowed us to create models of the full range of parameters possible to fit the data, rather than just a single model.”
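The article does not give the exact formulation of the soft-thresholded error or the crowding penalty, but the two ideas Kozloski describes might be sketched like this. The function names, the tolerance band, and the inverse-distance penalty are all hypothetical illustrations of the concept, not the study’s actual equations.

```python
import math

def soft_threshold_error(features, targets, tol):
    """Per-feature error is zero inside a tolerance band around each target,
    so any model whose features fall within the band scores a total of zero,
    rather than being forced toward exact target values."""
    return sum(max(0.0, abs(f - t) - w)
               for f, t, w in zip(features, targets, tol))

def crowding_penalty(features, archive):
    """Penalize candidates that sit close to already-selected zero-error
    models, biasing the search to spread evenly over the zero-error region
    of feature space."""
    if not archive:
        return 0.0
    nearest = min(math.dist(features, a) for a in archive)
    return 1.0 / (nearest + 1e-9)

# A model 0.5 inside the band on one feature and 1.0 inside on the other
# scores zero error; a model 1.0 outside the first band scores 1.0.
e_in = soft_threshold_error([10.5, -66.0], [10.0, -65.0], [1.0, 2.0])
e_out = soft_threshold_error([12.0, -65.0], [10.0, -65.0], [1.0, 2.0])
```

Combining a zero-error band with a crowding penalty rewards candidates that are both consistent with the data and far from models already found, which is what yields a population covering the full range of data-consistent parameters.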
“This ends up helping the models to generalize well, because we're encouraging the algorithm to find a region of parameter space that has the capability of producing models that respond like any of the neurons recorded during the experiments,” said Tim Rumbell, PhD, a computational neuroscientist at the IBM Thomas J. Watson Research Center and the lead algorithm designer for the research.
“Evolutionary algorithms (EAs) have been used quite often previously to search parameter space of neuron models, and I used NSDE in one of my previous publications,” said Rumbell.
Rumbell, along with researchers Danel Draguljić, Aniruddha Yadav, Patrick R. Hof, Jennifer I. Luebke and Christina M. Weaver, used evolutionary algorithms in a prior study, published in the Journal of Computational Neuroscience in August 2016, to model the ion channel conductances and kinetics of pyramidal neurons in monkeys.
Some open-source frameworks for evolutionary algorithms include Distributed Evolutionary Algorithms in Python (DEAP), Evolutionary Computation in Java (ECJ), Evolving Objects in C++, EvA2 in Java, HeuristicLab in C#, the MOEA Framework in Java, and Open BEAGLE in C++.
Neuroscience is an inherently complex field where artificial intelligence is accelerating breakthrough discoveries. Evolutionary algorithms present a flexible, adaptive alternative to deep learning, and are currently being used in computational neuroscience to accelerate scientific understanding of the human brain.
Copyright © 2019 Cami Rosso All rights reserved.
Octeau, J. Christopher, Gangwani, Mohitkumar R., Allam, Sushmita L., Tran, Duy, Huang, Shuhan, Hoang-Trong, Tuan M., Golshani, Peyman, Rumbell, Timothy H., Kozloski, James R., Khakh, Baljit S. “Transient, Consequential Increases in Extracellular Potassium Ions Accompany Channelrhodopsin2 Excitation.” Cell Reports. May 21, 2019.
Corne, David W., Lones, Michael A., “Evolutionary Algorithms.” arXiv. May 2018.
Rumbell, Timothy H., Draguljić, Danel, Yadav, Aniruddha, Hof, Patrick R., Luebke, Jennifer I., Weaver, Christina M. “Automated evolutionary optimization of ion channel conductances and kinetics in models of young and aged rhesus monkey pyramidal neurons.” Journal of Computational Neuroscience. August 2016.
Deb, K., Agrawal, S., Pratap, A., Meyarivan, T. “A Fast Elitist Non-dominated Sorting Genetic Algorithm for Multi-objective Optimization: NSGA-II.” Parallel Problem Solving from Nature PPSN VI, Lecture Notes in Computer Science, vol. 1917. Springer, Berlin, Heidelberg. 2000.
Sipper, Moshe, Olson, Randal S., Moore, Jason H. “Evolutionary computation: the next major transition of artificial intelligence?” BioData Mining. July 29, 2017.