As expected: Jürgen Schmidhuber has once again come out against Hinton's Nobel Prize, and Nature has likewise criticized the nomination process for being opaque.
Editors: Dan Jiang, Jia Qi.
Engaging in a public debate with Schmidhuber over academic credit is not advisable: it only feeds his ego, and he is willing to spend a great deal of time and energy discrediting anyone he sees as a rival. He even resorts to tactics like editing Wikipedia under multiple aliases to make it look as if others share his views. His webpage about Alan Turing is a good example of how he tries to diminish the contributions of others.
Against my better judgment, I feel I cannot completely ignore his accusations, so I will respond just this once. I never claimed that I invented backpropagation. David Rumelhart invented it independently, long after people in other fields had already invented it. Indeed, when we first published our paper we were not aware of this history, so we did not cite the earlier inventors. What I have claimed is that I was the one who clearly demonstrated that backpropagation can learn interesting internal representations, which is why it became so popular. I did this by forcing neural networks to learn vector representations of words, so that the network could predict the next word in a sequence from the vector representations of the preceding words. It was this example that convinced the reviewers at Nature to publish the paper in 1986.
Indeed, many people in the media have said that I invented backpropagation, and I have spent a lot of time correcting them. The following is an excerpt from the book "Architects of Intelligence" by Martin Ford, published in 2018:
Before David Rumelhart, many different people invented different versions of backpropagation. These were largely independent inventions, and I feel I have been given too much credit for it. I have seen people in the media claim that I invented backpropagation, which is completely wrong. This is one of those rare cases where a scholar feels he has received too much credit for something! My main contribution was to show how backpropagation can be used to learn distributed representations, so I want to set that record straight.
Perhaps Jürgen would like to clarify who invented the LSTM?
Early bird sale | "Development and Application Practice of Large AI Models at the Edge" Technical Forum · Shanghai
In the breakout year for edge AI, how can companies seize the opportunity of edge-side large models, explore industry applications, and achieve business growth? How can engineers upgrade their skills to bring large models to endpoint devices such as PCs, smartphones, home appliances, smart wearables, robots, and educational hardware?
On October 26, this forum will bring together representatives from leading companies in edge-side large-model technology and applications. Through technical talks, application case studies, hands-on exercises, and other activities, it will take you from theory to deployment, helping you fully grasp how to deploy and apply large models at the edge and begin upgrading your AI skills.