A biologically plausible technique for training a neural net

by Kurt Kleiner | News | Learning in Machines & Brains | 17.05.2017

Artificial intelligence researchers have made advances in recent years by designing neural networks based on the structure of the human brain. But their work isn't just leading to better computers; it is also leading to a better understanding of how the neural networks in our own brains work.

Learning in Machines & Brains Co-Director Yoshua Bengio and his student Benjamin Scellier, both of the Université de Montréal, invented a new way of training artificial neural networks that might help theoretical neuroscientists figure out how natural neural networks learn and correct errors.

They call the technique “equilibrium propagation,” and it is an alternative to a widely used technique for training neural networks called backpropagation.

To train an artificial neural network with backpropagation, you first present it with an input, propagate the signal forward through the network, then examine the output and compare it to the output you would ideally like to have gotten. The difference between the two is called the error. In the second step you "backpropagate" the error through the network, adjusting individual synaptic weights in an attempt to bring the output closer to the ideal. By repeating the process many times you gradually reduce the error, bringing the network's output closer to the one you want.
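The two steps described above can be sketched in a few lines of numpy. This is a minimal illustration, not code from the paper: the toy XOR task, network size, learning rate, and sigmoid nonlinearity are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: learn XOR with a tiny two-layer network.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])           # desired outputs

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(0, 1.0, (2, 4))                  # input -> hidden weights
W2 = rng.normal(0, 1.0, (4, 1))                  # hidden -> output weights

def forward(X):
    h = sigmoid(X @ W1)                          # step 1: propagate forward
    y = sigmoid(h @ W2)
    return h, y

def mse(y, t):
    return float(np.mean((y - t) ** 2))

_, y0 = forward(X)
loss_before = mse(y0, T)                         # error before training

lr = 2.0
for _ in range(2000):
    h, y = forward(X)
    err = y - T                                  # compare output to ideal
    # Step 2: backpropagate the error via the chain rule,
    # layer by layer, and adjust the weights.
    dy = err * y * (1 - y)                       # gradient at output layer
    dh = (dy @ W2.T) * h * (1 - h)               # gradient at hidden layer
    W2 -= lr * (h.T @ dy) / len(X)
    W1 -= lr * (X.T @ dh) / len(X)

_, y1 = forward(X)
loss_after = mse(y1, T)                          # error after training
```

Note that the backward pass (`dy`, `dh`) is a separate computation from the forward pass — the "special circuit" the article goes on to discuss.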

Illustration: Left, a network in which equilibrium propagation works, with many recurrent and symmetric connections among nodes. Right, a layered network of the kind in which backpropagation is useful.

But the way backpropagation works in artificial neural networks has never seemed biologically plausible, says Scellier. One reason among several: it requires a special computational circuit for backpropagating errors, which, based on current knowledge in neuroscience, seems unlikely to have arisen in an evolved organism.

The new equilibrium propagation technique is possible using a single circuit, and a single type of calculation, says Scellier. “Our model requires only one type of neuronal dynamics to perform both inference and backpropagation of errors. The computations executed in the network are based on a standard neuron model and a standard form of synaptic plasticity.” That means it’s more likely to be similar to an actual process that evolved in the brain, and could provide the beginnings of an answer to how biological neural circuits learn.
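The single-dynamics idea can be sketched as follows: the same neuronal update rule runs in a "free" phase (input clamped, network relaxing to a fixed point) and a "weakly clamped" phase (output gently nudged toward the target by a factor beta), and weights change in proportion to the difference in local activity correlations between the two phases. This is a minimal numpy sketch under assumptions of my own (toy one-example task, network sizes, step sizes, number of relaxation steps), not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hard-sigmoid neuron nonlinearity and its derivative.
def rho(u):  return np.clip(u, 0.0, 1.0)
def drho(u): return ((u >= 0.0) & (u <= 1.0)).astype(float)

x = np.array([1.0, 0.5])            # clamped input (toy example)
target = np.array([0.7])            # desired output

W1 = rng.normal(0, 0.1, (2, 3))     # symmetric input <-> hidden weights
W2 = rng.normal(0, 0.1, (3, 1))     # symmetric hidden <-> output weights

def relax(h, y, beta, steps=50, eps=0.2):
    """One neuronal dynamics for both phases: beta=0 is the free
    phase; beta>0 weakly nudges the output toward the target."""
    for _ in range(steps):
        dh = drho(h) * (x @ W1 + rho(y) @ W2.T) - h
        dy = drho(y) * (rho(h) @ W2) - y + beta * (target - y)
        h, y = h + eps * dh, y + eps * dy
    return h, y

def free_prediction():
    h, y = relax(np.zeros(3), np.zeros(1), beta=0.0)
    return y

err_before = float((free_prediction() - target) ** 2)

lr, beta = 0.2, 0.5
for _ in range(300):
    # Phase 1 (free): relax to a fixed point with only the input clamped.
    h0, y0 = relax(np.zeros(3), np.zeros(1), beta=0.0)
    # Phase 2 (weakly clamped): same dynamics, output nudged toward target.
    h1, y1 = relax(h0, y0, beta=beta, steps=20)
    # Contrastive Hebbian-style update: difference of local activity
    # correlations between the two phases, scaled by 1/beta.
    W2 += (lr / beta) * (np.outer(rho(h1), rho(y1)) - np.outer(rho(h0), rho(y0)))
    W1 += (lr / beta) * (np.outer(x, rho(h1)) - np.outer(x, rho(h0)))

err_after = float((free_prediction() - target) ** 2)
```

Unlike the backpropagation sketch, no separate error-propagating circuit appears here: each weight update uses only the activities of the two neurons it connects, measured in the two phases of the same dynamics.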

That’s one of the major goals of the work, Scellier says.

“Today, the gap between neuroscience and the neural networks used in artificial intelligence is pretty big. Our approach is to start from a model with good machine learning properties, and gradually add details that make the model more biologically realistic,” he says.

The next step will be for neuroscientists to design experiments to see if the brain itself uses similar techniques.

The paper “Equilibrium Propagation: Bridging the Gap Between Energy-Based Models and Backpropagation” was published in Frontiers in Computational Neuroscience.
