Geoffrey Hinton
Computer scientist

Geoffrey Hinton investigates how neural networks can be used for learning, memory, perception and symbol processing. He was one of the researchers who introduced the back-propagation algorithm that has been widely used for practical applications in deep learning. His other contributions to neural network research include Boltzmann machines, distributed representations, time-delay neural nets, mixtures of experts, Helmholtz machines and products of experts. His current main interest is in unsupervised learning procedures for neural networks with rich sensory input.
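The back-propagation algorithm mentioned above trains a network by passing error derivatives backward through its layers via the chain rule. The sketch below is a minimal illustration in Python/NumPy on the XOR problem; the network size, learning rate and iteration count are arbitrary choices for this example, not drawn from Hinton's published work.

```python
# Minimal back-propagation sketch: a 2-4-1 sigmoid network learning XOR.
# Illustrative only; all hyperparameters here are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialized weights and zero biases
W1 = rng.normal(size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Backward pass: propagate error derivatives layer by layer
    # using the chain rule (squared-error loss, sigmoid derivative).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # should approach [[0], [1], [1], [0]]
```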


Awards

IEEE Frank Rosenblatt Medal, 2014.

Killam Prize in Engineering, 2012.

Gerhard Herzberg Gold Medal for Science and Engineering, 2011.

IJCAI Award for Research Excellence, 2005.

David E. Rumelhart Prize, 2001.

Relevant Publications

D.E. Rumelhart et al., "Parallel distributed processing," IEEE, vol. 1, pp. 354-362, 1988.



Advisor, Learning in Machines & Brains


Google; University of Toronto, Department of Computer Science


PhD (Artificial Intelligence), University of Edinburgh

BA (Experimental Psychology), University of Cambridge



Ideas Related to Geoffrey Hinton


CIFAR names Geoffrey Hinton a Distinguished Fellow

CIFAR has awarded Prof. Geoffrey Hinton, the former Director of CIFAR’s Learning in Machines & Brains program (formerly known as Neural...

Learning in Machines & Brains | Recommended

CIFAR fellows, advisor sum up the state of deep learning in Nature

Deep learning has vastly enhanced computer recognition of speech, images and objects, and has become an important tool for genomics...

New CIFAR fellows and advisors

CIFAR welcomes new fellows joining the community in the Institutions, Organizations & Growth and Quantum Materials programs. Christopher Wiebe (University...

Learning in Machines & Brains | Feature

Deep thinking – Making machines better learners

The journey towards creating artificial intelligence has been slower and more difficult than early pioneers predicted. Modern computers are truly...