
Verified by Psychology Today

New Synthetic Learning May Inspire Future Neuromorphic AI

Scientists mimic aspect of human intelligence using quantum synthetic matter.

Source: Hainguynrp/Pixabay

A new peer-reviewed study published this week in PNAS shows how learning, an important aspect of human intelligence, can be recreated in synthetic matter—a discovery that could lead to new forms of artificial intelligence (AI) and neuromorphic computing in the future.

“Habituation and sensitization (nonassociative learning) are among the most fundamental forms of learning and memory behavior present in organisms that enable adaptation and learning in dynamic environments,” wrote the study authors affiliated with Rutgers University, Purdue University, the University of Georgia, and the Argonne National Laboratory. “Emulating such features of intelligence found in nature in the solid state can serve as inspiration for algorithmic simulations in artificial neural networks and potential use in neuromorphic computing.”

Human intelligence and the biological brain have long inspired the architecture and design of AI machine learning. Neuromorphic computing, also referred to as neuromorphic engineering, is a growing field of study that seeks to reproduce aspects of human cognition in modern electronic devices such as computers. The aim of neuromorphic computing is to overcome the limitations of the von Neumann architecture with a solution that more closely mimics the biological brain.

The basis for most computing hardware today is known as the von Neumann architecture. In 1945 the Hungarian-born American mathematician John von Neumann (1903-1957) published a computer architecture design consisting of inputs and outputs, a memory unit, and a central processing unit (CPU) that contains a control unit (CU), an arithmetic and logic unit (ALU), and a variety of registers. The drawbacks of the von Neumann architecture are that it is difficult to integrate long-term memory storage and that moving data between the processing and memory units consumes a great deal of energy.

The von Neumann architecture is very different from how a biological brain works. In the brain, computation and memory are highly distributed: its roughly 80 billion neurons act as simple processing units, and memory is not centrally located but instead involves several areas of the brain. In neuroanatomy, the prefrontal cortex, amygdala, hippocampus, and cerebellum are among the principal brain regions associated with memory.

Machine learning is a method of enabling computers to “learn” without hard-coding or explicit programming. Deep learning is a subset of machine learning, and its artificial neural network architecture, with its network of artificial neurons (nodes), is an example of brain-inspired design. Because deep learning is inspired in part by the biological brain, running it on von Neumann hardware presents computational challenges.

To demonstrate learning in synthetic matter, the researchers used nickel oxide (NiO), a quantum material whose properties are not fully explained by classical physics. NiO is a Mott insulator, a class of materials that behave as insulators when measured even though their band structure suggests they should conduct electricity.

Using gases to stimulate the quantum material at and above room temperature, the scientists found that nickel oxide exhibited habituation and sensitization similar to those observed in Aplysia, a genus of medium-sized to giant sea slugs.

“Similar to biological species such as Aplysia, habituation and sensitization of NiO possess time-dependent plasticity relying on both strength and time interval between stimuli,” the researchers reported. “A combination of experimental approaches and first-principles calculations reveals that such learning behavior of NiO results from dynamic modulation of its defect and electronic structure.”
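To make these two forms of non-associative learning concrete, the toy model below sketches the qualitative dynamics the researchers describe: responses to repeated weak stimuli shrink (habituation), a strong stimulus boosts responsiveness (sensitization), and the effect partially recovers during the interval between stimuli. All names and parameters here are illustrative assumptions; this is not the NiO model from the study.

```python
# Toy model of non-associative learning (habituation and sensitization).
# Illustrative sketch only, not the model from the PNAS study.

def respond(stimuli, base=1.0, habit_rate=0.2, sens_boost=0.5,
            recovery=0.05, strong_threshold=2.0):
    """Return the response to each stimulus in a timed sequence.

    stimuli: list of (time, strength) pairs, times increasing.
    """
    gain = 1.0          # multiplicative response gain
    last_t = None
    responses = []
    for t, strength in stimuli:
        if last_t is not None:
            # partial recovery toward baseline during the interval
            gain += (1.0 - gain) * min(1.0, recovery * (t - last_t))
        if strength >= strong_threshold:
            gain += sens_boost          # sensitization: boosted response
        else:
            gain *= (1.0 - habit_rate)  # habituation: weakened response
        responses.append(base * strength * gain)
        last_t = t
    return responses

# Repeated weak stimuli: responses shrink over the train (habituation).
weak = respond([(t, 1.0) for t in range(5)])
# A strong stimulus at t=2 boosts later responses (sensitization).
mixed = respond([(0, 1.0), (1, 1.0), (2, 3.0), (3, 1.0)])
print(weak)
print(mixed)
```

Note how, as in the quoted result, the outcome depends on both the strength of each stimulus and the time interval between stimuli, since recovery accumulates during longer gaps.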

The scientists theorize that if a quantum material can recreate habituation and sensitization, then AI could potentially be built directly into hardware. This would lower energy costs while increasing overall computational efficiency and performance.

“An artificial neural network model inspired by such non-associative learning is simulated to show advantages for an unsupervised clustering task in accuracy and reducing catastrophic interference, which could help mitigate the stability–plasticity dilemma,” the researchers concluded. “Mott insulators can therefore serve as building blocks to examine learning behavior noted in biology and inspire new learning algorithms for artificial intelligence.”
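The stability-plasticity idea behind that result can be illustrated with a hypothetical toy: an online clustering loop in which each cluster center's learning rate habituates as it is repeatedly updated, so well-established clusters become stable while fresh ones stay plastic. The function and parameters below are invented for illustration and are not the authors' simulated network.

```python
# Hypothetical sketch: online k-means with habituating learning rates.
# Frequently updated centroids stabilize, protecting learned clusters
# from being overwritten by new data (mitigating interference).

def habituating_kmeans(points, k=2, base_lr=0.5, habit=0.9):
    centroids = [tuple(p) for p in points[:k]]  # seed from first k points
    lrs = [base_lr] * k                         # per-centroid learning rates
    for x, y in points:
        # assign the point to its nearest centroid
        i = min(range(k), key=lambda j: (centroids[j][0] - x) ** 2 +
                                        (centroids[j][1] - y) ** 2)
        cx, cy = centroids[i]
        # move the centroid toward the point, then habituate its rate
        centroids[i] = (cx + lrs[i] * (x - cx), cy + lrs[i] * (y - cy))
        lrs[i] *= habit
    return centroids, lrs

# Two hand-made blobs, interleaved in the input stream.
data = [(0.0, 0.0), (10.0, 10.0), (0.4, 0.2), (9.7, 10.1),
        (0.1, 0.5), (10.2, 9.8), (0.3, 0.1), (9.9, 10.0)]
cents, rates = habituating_kmeans(data, k=2)
print(cents)   # one centroid settles near (0, 0), the other near (10, 10)
print(rates)   # both rates have habituated below the base rate
```

The decaying learning rates play the role of habituation: the more often a cluster is reinforced, the less a single new input can move it, which is one simple way to ease the stability-plasticity trade-off the authors mention.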

Copyright © 2021 Cami Rosso All rights reserved.
