A biologically plausible technique for training a neural net

by Kurt Kleiner | News | Learning in Machines & Brains | 17.05.2017

Artificial intelligence researchers have made advances in recent years by designing neural networks that are based on the structure of the human brain. But their work isn’t just leading to better computers – it is also leading to better understanding about how the neural networks in our own brains work.

Learning in Machines & Brains Co-Director Yoshua Bengio and his student Benjamin Scellier, both of the Université de Montréal, invented a new way of training artificial neural networks that might help theoretical neuroscientists figure out how natural neural networks learn and correct errors.

They call the technique “equilibrium propagation,” and it is an alternative to a widely used technique for training neural networks called backpropagation.

To train an artificial neural network with backpropagation you first present it with an input, propagate the signal forward in the network, then examine the output and compare it to the output you would ideally like to have gotten. The difference between the two is called the error. In the second step you “backpropagate” the error through the network, making adjustments to individual neurons and synapses in an attempt to get the output closer to the ideal. By repeating the process many times you gradually reduce the error in the neural network, bringing it closer to the output you want.
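To make that loop concrete, here is a minimal sketch in Python/NumPy of training a one-hidden-layer network with backpropagation. The network size, learning rate, number of steps, and the toy XOR task are illustrative assumptions, not details from the article.

    # Minimal backpropagation sketch: forward pass, error, backward pass,
    # weight updates. Sizes, learning rate, and the XOR task are assumed.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
    T = np.array([[0.0], [1.0], [1.0], [0.0]])                   # ideal outputs

    W1 = rng.normal(0, 1, (2, 8))   # input -> hidden weights
    W2 = rng.normal(0, 1, (8, 1))   # hidden -> output weights
    lr = 0.5                        # learning rate

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for step in range(5000):
        # Forward pass: propagate the input signal through the network.
        H = sigmoid(X @ W1)          # hidden activations
        Y = sigmoid(H @ W2)          # network output

        # The error: difference between actual and ideal output.
        err = Y - T

        # Backward pass: send the error back through the network and
        # nudge every weight in the direction that reduces it.
        dY = err * Y * (1 - Y)            # error signal at the output layer
        dH = (dY @ W2.T) * H * (1 - H)    # error backpropagated to hidden layer
        W2 -= lr * H.T @ dY
        W1 -= lr * X.T @ dH

    print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 2))  # close to T after training

Each pass through the loop repeats the same two steps described above, and the error shrinks a little each time.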

These illustrations show, at left, the kind of network in which equilibrium propagation works, with many recurrent and symmetric connections among nodes. At right is a layered network of the kind in which backpropagation is useful.

But the way backpropagation works in artificial neural networks has never seemed biologically plausible, says Scellier. One reason among many is that it requires a dedicated computational circuit for propagating errors backward, which, based on current knowledge in neuroscience, seems unlikely to have arisen in an evolved organism.

The new equilibrium propagation technique is possible using a single circuit, and a single type of calculation, says Scellier. “Our model requires only one type of neuronal dynamics to perform both inference and backpropagation of errors. The computations executed in the network are based on a standard neuron model and a standard form of synaptic plasticity.” That means it’s more likely to be similar to an actual process that evolved in the brain, and could provide the beginnings of an answer to how biological neural circuits learn.
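For a concrete picture of how a single neuronal dynamics can do both jobs, here is a rough Python/NumPy sketch of the paper's two-phase scheme: a free phase in which the network relaxes to a fixed point with only the input clamped, a weakly clamped phase in which the same dynamics run while the output is gently nudged toward the target, and a contrastive, Hebbian-style weight update that compares neuron co-activities between the two fixed points. The relaxation dynamics here are simplified, and the layer sizes, nudging strength, step sizes, and toy task are illustrative assumptions rather than values from the paper.

    # Equilibrium propagation sketch with simplified fixed-point dynamics.
    # beta (nudging strength), eps (relaxation step size), layer sizes and
    # the XOR task are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0.0], [1.0], [1.0], [0.0]])

    n_h, n_y = 8, 1
    W1 = rng.normal(0, 0.5, (2, n_h))    # input <-> hidden (symmetric)
    W2 = rng.normal(0, 0.5, (n_h, n_y))  # hidden <-> output (symmetric)
    beta, lr, eps, n_relax = 0.5, 0.1, 0.2, 30

    def rho(s):
        return np.clip(s, 0.0, 1.0)      # hard-sigmoid firing rate

    def relax(x, h, y, target=None):
        # One and the same dynamics serve both phases; when a target is
        # given, the output units are weakly nudged toward it.
        for _ in range(n_relax):
            dh = -h + rho(x @ W1 + rho(y) @ W2.T)
            dy = -y + rho(rho(h) @ W2)
            if target is not None:
                dy += beta * (target - y)
            h, y = h + eps * dh, y + eps * dy
        return h, y

    for epoch in range(2000):
        h = np.zeros((len(X), n_h))
        y = np.zeros((len(X), n_y))
        # Free phase: settle with only the input clamped.
        h0, y0 = relax(X, h, y)
        # Weakly clamped phase: nudge the output toward the target; the
        # perturbation propagates backward through the same dynamics.
        hb, yb = relax(X, h0, y0, target=T)
        # Contrastive Hebbian-style update from the two fixed points.
        W1 += (lr / beta) * X.T @ (rho(hb) - rho(h0))
        W2 += (lr / beta) * (rho(hb).T @ rho(yb) - rho(h0).T @ rho(y0))

    _, y0 = relax(X, np.zeros((len(X), n_h)), np.zeros((len(X), n_y)))
    print(np.round(y0, 2))  # ideally close to T after training

Notice that the update rule uses only local quantities, the activities of the two neurons a synapse connects, which is what makes the scheme a candidate for something a biological circuit could implement.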

That’s one of the major goals of the work, Scellier says.

“Today, the gap between neuroscience and the neural networks used in artificial intelligence is pretty big. Our approach is to start from a model with good machine learning properties, and gradually add details that make the model more biologically realistic,” he says.

The next step will be for neuroscientists to design experiments to see if the brain itself uses similar techniques.

The paper “Equilibrium Propagation: Bridging the Gap Between Energy-Based Models and Backpropagation” was published in Frontiers in Computational Neuroscience.
