Quick Impression on deeplearning.ai’s “Heroes of Deep Learning”. This time it is the interview with Prof. Yoshua Bengio. As always, don’t post any copyrighted material here at the forum!
* Out of the ‘Canadian Mafia’, Prof. Bengio is perhaps the least known of the three. Prof. Hinton and Prof. LeCun have their own courses, and as you know they work for Google and Facebook respectively. While Prof. Bengio does work with Microsoft, his role is more that of a consultant.
* You may know him as one of the coauthors of the book “Deep Learning”. But then again, who really understands that book, especially Part III?
* Whereas Prof. Hinton strikes me as an eccentric polymath, Prof. Bengio is more of a conventional scholar. He was influenced by Hinton early in his study of AI, which at the time was mostly expert-system based.
* That perhaps explains why everyone seems to leave his interview out, which I found very interesting.
* He named several of his group’s contributions, most of them fundamental results: Glorot and Bengio 2010, now widely known as Xavier initialization; attention in machine translation; his early work on neural-network language models; and of course the GAN from Goodfellow. All are more technical results, but once you think about these ideas, they are about understanding, rather than trying to beat the current records.
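Not from the interview itself, but a minimal sketch of what the Glorot and Bengio 2010 scheme does, assuming the common uniform variant: weights are drawn from a range scaled by fan-in and fan-out so that activation variance stays roughly constant across layers. The function name and layer sizes are my own illustration, not anything from the paper.

```python
import numpy as np

def xavier_init(n_in, n_out, seed=0):
    """Uniform Xavier/Glorot initialization (illustrative sketch).

    Draws weights from U[-limit, +limit] with
    limit = sqrt(6 / (n_in + n_out)), which keeps the variance of
    activations and gradients roughly balanced between layers.
    """
    rng = np.random.default_rng(seed)
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

# Example: a 256 -> 128 fully connected layer.
W = xavier_init(256, 128)
print(W.shape)  # (256, 128)
# Every weight stays inside the computed limit.
print(bool(np.abs(W).max() <= np.sqrt(6.0 / (256 + 128))))  # True
```

The variance of U[-a, a] is a²/3, so this choice gives Var(W) = 2/(n_in + n_out), which is the balance the paper argues for.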
* Then he says a few things about early deep learning research that surprised me. First is depth: as it turns out, the benefit of depth was not so clear in the early 2000s. That’s why when I graduated from my Master’s (2003), I had never heard of the revival of neural networks.
* And then there was the doubt about using ReLU, which is the current-day staple of convnets. But the reason makes so much sense: ReLU is not differentiable at every point of R (it has a kink at 0). Would that cause a problem? Anyone who knows some calculus would rationally have that doubt.
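To make the doubt concrete, here is a small sketch (my own illustration, not from the interview): ReLU is max(0, x), so its derivative is 0 for x < 0 and 1 for x > 0, but undefined exactly at x = 0. In practice frameworks simply pick a subgradient there, here 0.

```python
import numpy as np

def relu(x):
    # max(0, x), elementwise
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative is 0 for x < 0 and 1 for x > 0. At x == 0 the
    # derivative does not exist; we follow the common convention
    # of using the subgradient 0 there.
    return (x > 0).astype(float)

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))       # [0. 0. 3.]
print(relu_grad(x))  # [0. 0. 1.]
```

The single non-differentiable point turns out to be harmless in practice, since gradient descent essentially never lands exactly on it.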
* His idea on learning deep learning is also quite on point: he believes you can learn DL in 5–6 months if you have the right training, i.e. a good computer science and math education. Then you can just pick up DL by taking courses and reading proceedings from ICML.
* Finally, there is his current research on the fusion of neural networks and neuroscience. I found this part fascinating. Is backprop really used in the brain as well?
That’s what I have. Hope you enjoy!