
The brain’s memory system points to more efficient computing methods

by Lindsay Jolivet | News | Learning in Machines & Brains | 27.01.2016

The brain could have 10 times as much storage capacity as previously thought, according to new findings about the structure of the connections between neurons. The research suggests our brains can hold a petabyte of information – roughly the amount of data you would need to stream Netflix for 114 years.

In addition to swelling our heads, the findings give us a better idea of the brain’s capabilities, and could show scientists how to build computers that have more power but use less energy.

The new study was led by CIFAR Advisor Terrence Sejnowski and Thomas Bartol at the Salk Institute and Kristen Harris at the University of Texas at Austin. It showed that synapses — the web of connections between neurons in the brain that allow us to form and access memories — can be fine-tuned with surprising accuracy.

Whether a signal will pass from one neuron to another depends in part on the size of the synapse involved. Previously, researchers had categorized synapses into only a few sizes, but the new research shows that synapses can adjust among at least 26 distinct sizes.

The findings suggest that synapses are constantly adjusting their sizes, shrinking or growing in response to the signals they have received before, sometimes as often as every two minutes, while maintaining a high degree of precision.

The researchers found that, rather than just a few sizes of synapses, there are actually 26 discrete sizes that can change over a span of a few minutes, meaning that the brain has a far greater capacity for storing information. Credit: Salk Institute

The study could help solve the mystery of the apparent inefficiency of synapses, most of which successfully transmit a signal only 10 to 20 per cent of the time. The new research suggests that because signals from thousands of input synapses converge on a neuron, the unreliability of all of the signals averages out into a reliable signal. Having synapses that are not always active could conserve a lot of energy.
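To see why the averaging works, consider a minimal simulation (an illustrative sketch, not part of the study, with the synapse count and firing probability chosen as assumptions): each of several thousand converging synapses transmits independently with a 10 to 20 per cent success rate, yet the summed input hardly varies from trial to trial.

```python
import random

# Sketch: thousands of unreliable synapses converge on one neuron.
# The numbers below are illustrative assumptions, not the study's.
N_SYNAPSES = 5000   # hypothetical number of converging inputs
P_TRANSMIT = 0.15   # per-synapse success rate (10-20% in the article)

def summed_input():
    """Count how many synapses successfully transmit on one trial."""
    return sum(random.random() < P_TRANSMIT for _ in range(N_SYNAPSES))

trials = [summed_input() for _ in range(10)]
print(trials)                                    # each trial lands near 750
print(f"expected {N_SYNAPSES * P_TRANSMIT:.0f}") # per-synapse noise averages out
```

Each individual synapse is wildly unreliable, but the total input fluctuates by only a few per cent, which is the statistical averaging the researchers describe.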

“The brain has to do all its computing on 20 watts of power — a dim light bulb,” says Sejnowski. Given that, he says it makes sense for each synapse to do little work, with a low probability of activating. “If you’re using probabilities, the savings you get are enormous.”

The researchers made the discovery, published in eLife, by creating a 3D computer model of a tiny section of the memory centre, the hippocampus, in a rat's brain. They made the most precise measurement yet of how much two synapses onto the same neuron, receiving the same inputs, could differ in size, and calculated that, given the number of distinguishable sizes, each synapse could store about 4.7 bits of information. Scaled up to the number of synapses in a human brain, that equals one petabyte, and one powerful biological machine.
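The 4.7-bit figure is straightforward information theory: a synapse that can sit in any of 26 distinguishable states carries log2(26) ≈ 4.7 bits. The scale-up below assumes a commonly cited rough count of 10^15 synapses for a human brain; the article does not state the exact number the researchers used.

```python
import math

bits_per_synapse = math.log2(26)       # 26 distinguishable sizes
print(f"{bits_per_synapse:.2f} bits")  # -> 4.70

# Rough scale-up; the synapse count is an assumed order-of-magnitude
# figure, not taken from the study itself.
synapses = 1e15
petabytes = bits_per_synapse * synapses / 8 / 1e15
print(f"~{petabytes:.1f} PB")          # on the order of a petabyte
```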

“Ultimately, nature evolved a very complex device, the brain, and it looks as if we may now understand how it’s able to function so well with such unreliable synapses,” Sejnowski says.

The study shows how the brain benefits from redundancy. Not every synapse needs to work every time for us to gather and store memories, and it seems this delicate balance makes the brain much more energy efficient.

Sejnowski says this understanding provides a path for building machine learning approaches that can handle huge amounts of data with less computer power and higher accuracy. “It’s something we’ve been searching for,” he says. “As chips add more and more transistors, they have more flaws.”

Currently, one misfire in a computer memory could lead the whole system to fail. If computers could incorporate the brain’s redundancies, with each artificial synapse essentially flipping a coin to decide if it will transmit a signal or not, we could greatly improve computing power. CIFAR fellows such as Roland Memisevic and Yoshua Bengio have already begun exploring the possibilities.
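Here is a sketch of what such a coin-flip synapse might look like in software, assuming a simple weighted-sum neuron. This mirrors existing stochastic techniques such as dropout; it is not the specific architecture the researchers propose.

```python
import random

def stochastic_neuron(inputs, weights, p_transmit=0.15):
    """Weighted sum in which each synapse transmits only with
    probability p_transmit; dividing by p_transmit preserves the
    expected output, so reliability emerges from the aggregate."""
    total = 0.0
    for x, w in zip(inputs, weights):
        if random.random() < p_transmit:   # the coin flip
            total += x * w / p_transmit
    return total

# Example: 1,000 inputs. Repeated calls give similar outputs even
# though most synapses stay silent on any single call.
xs = [1.0] * 1000
ws = [0.01] * 1000
print([round(stochastic_neuron(xs, ws), 2) for _ in range(5)])
```

The design choice echoes the biological story above: each connection does little work on its own, and accuracy comes from averaging over many cheap, unreliable parts rather than from making every part perfect.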

“This is a whole new computer architecture that will ultimately be translated into new chips and new operating systems that are based on probability rather than a perfect, deterministic digital computer operation,” Sejnowski says.

He adds that training artificial neural networks can inform how we study the brain. “It’s interesting that we’ve reached a point where brain theory is interacting very closely with computer theory.”

Image above: This computer image shows two points where two neurons have formed a connection. The translucent black thread is the axon of one neuron; the yellow belongs to another neuron. Credit: Salk Institute and UT Austin

