IT easy, IT is! | Eleventh: Geoffrey Hinton, the Godfather of Deep Learning

Have you heard of Geoffrey Hinton? This time we will look at artificial intelligence, which is becoming more popular these days, through the story of Geoffrey Hinton.

➤ Geoffrey Hinton, the Godfather of Deep Learning who saved artificial neural networks

Seymour Papert, a leading figure of symbolic artificial intelligence, together with Marvin Minsky, demonstrated mathematically in their 1969 book "Perceptrons" that Rosenblatt's simple perceptron cannot solve the XOR problem. As a result of this book, funding for neural network research, the flagship of connectionist artificial intelligence, was almost entirely cut off, and the first difficult period of artificial intelligence began.
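The XOR problem can be seen directly in code. Below is a minimal sketch (my own illustration, not from the article) of a Rosenblatt-style perceptron trained on the XOR truth table: because XOR is not linearly separable, the perceptron learning rule never reaches zero errors no matter how long it trains.

```python
# The XOR truth table: output is 1 exactly when the inputs differ.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 0]

# A single-layer perceptron: two weights, one bias, a hard threshold.
w = [0.0, 0.0]
b = 0.0
lr = 0.1  # learning rate

for epoch in range(100):
    errors = 0
    for (x1, x2), target in zip(X, y):
        out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = target - out
        if err != 0:  # perceptron learning rule: nudge the line
            errors += 1
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    if errors == 0:  # never happens for XOR: no single line separates it
        break

print("misclassified per epoch:", errors)  # stays above zero
```

Running this shows the weights cycling forever: no straight line through the input plane puts (0,1) and (1,0) on one side and (0,0) and (1,1) on the other.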

▲ Professor Geoffrey Hinton (Image source:

In fact, in a 1988 essay, Seymour Papert compared himself to the huntsman sent by the Queen to kill Snow White, wryly recalling the criticism of neural networks that he and Minsky had delivered in 1969 through the publication of Perceptrons.

Geoffrey Hinton is the one who rescued neural networks from the deep bog they had fallen into; he appeared as a problem-solver every time the field hit a crisis, and ultimately shaped today's deep learning. A descendant of George Boole, the founder of British symbolic logic, Hinton first became interested in the brain in high school, when a friend who was a better student told him about research on the rat brain. He later enrolled at King's College, Cambridge, studying first physiology and physics, then philosophy, and finally psychology. He then went to Professor Longuet-Higgins at the University of Edinburgh and earned his Ph.D. on artificial neural networks.

But Hinton could not find a proper position in Britain. He moved to the United States and continued his neural network research at the University of California, San Diego, and then at Carnegie Mellon University. In a recent interview with Andrew Ng, Hinton recalled that neural network research was not welcomed in Britain at the time, but that in California it was a great help to have the encouragement of Professor David Rumelhart.


➤ Geoffrey Hinton and Neural Network Research

▲ John Hopfield (Image source:

While working in the United States, Geoffrey Hinton met two fateful figures: Professor John Hopfield, who was opening a new horizon in neural network research by introducing the concept of energy into neural networks for the first time, and Professor David Rumelhart. In 1984, Hinton, together with Terrence Sejnowski, a student of Hopfield, proposed the "Boltzmann machine", which built on the Hopfield network. In 1986 he also had the opportunity to take part in Professor Rumelhart's landmark paper, which revived the "back-propagation algorithm". Since then, Geoffrey Hinton has been a central figure in neural network research.
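The back-propagation algorithm that the Rumelhart paper popularized is, at its core, the chain rule applied backwards from the output. Here is a minimal sketch (my own illustration, not code from the article) for a single sigmoid neuron with one weight:

```python
import math

# One sigmoid neuron: one weight, one bias, one training example.
w, b = 0.5, 0.0          # initial parameters
x, t = 1.0, 1.0          # input and target output
lr = 0.1                 # learning rate

# Forward pass.
z = w * x + b
a = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation
loss = 0.5 * (a - t) ** 2        # squared-error loss

# Backward pass (chain rule):
# dloss/da = (a - t);  da/dz = a * (1 - a);  dz/dw = x;  dz/db = 1
grad_w = (a - t) * a * (1 - a) * x
grad_b = (a - t) * a * (1 - a)

# Gradient-descent update. Since a < t here, grad_w is negative,
# so w increases, pushing the output toward the target.
w -= lr * grad_w
b -= lr * grad_b
```

In a multi-layer network the same chain rule is applied layer by layer, each layer passing its error derivative back to the one before it.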

▲ Artificial neural networks

Artificial neural networks are groups of connected nodes, similar to the vast network of neurons in the brain. In the figure above, each circular node represents an artificial neuron, and each arrow represents a connection from the output of one neuron to the input of another.
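A hidden layer of such nodes is exactly what the simple perceptron was missing. As a hand-crafted sketch (weights chosen by hand for illustration, not learned), two hidden neurons computing OR and NAND, combined by an AND output neuron, implement XOR:

```python
def step(z):
    """Threshold activation: fires (1) when the weighted input is positive."""
    return 1 if z > 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)     # hidden neuron 1: OR of the inputs
    h2 = step(-x1 - x2 + 1.5)    # hidden neuron 2: NAND of the inputs
    return step(h1 + h2 - 1.5)   # output neuron: AND of h1 and h2

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor_net(a, b))
# prints the XOR truth table: 0, 1, 1, 0
```

The hidden layer re-maps the four input points so that the output neuron faces a linearly separable problem; back-propagation is what lets such hidden-layer weights be learned rather than hand-picked.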


▲ Formation of the research group with Yann LeCun and Yoshua Bengio (Image source:

At the time, however, the Reagan administration in the United States was intent on expanding the Cold War arms race, and Hinton, who did not want artificial neural network research to be used for military purposes, accepted an invitation from the Canadian Institute for Advanced Research and moved to the University of Toronto in Canada. While at the University of Toronto, he formed a deep learning research group with Yann LeCun (New York University) and Yoshua Bengio (University of Montreal). As the godfather of deep learning research, he has cultivated many students and contributed to many achievements.


Writer: Kwon Kun-woo

