Deep Learning
Jürgen Schmidhuber, Director of the Swiss AI Lab IDSIA
We're very excited to have one of the world's experts in this field at our first meetup. The recent resurgence of multi-layer neural networks is generating a lot of interest, with deep learning appearing on the front page of the New York Times and big companies like Google and Facebook hunting for experts in the field. Jürgen's talk will shed more light on how deep learning methods work, and why they work.
Talk Outline:
- The history of backpropagation, 1960-1981 and beyond
- The fundamental Deep Learning (DL) problem of gradient-based neural networks (NNs) (1991)
- A deep unsupervised stack of recurrent NNs (RNNs) to overcome the DL problem (History Compressor, 1991)
- Purely supervised deep Long Short-Term Memory RNNs (LSTM) since 1995
- How LSTM set standards in speech & connected handwriting recognition in the new millennium
- How (in 2010) deep GPU-based backprop (3-5 decades old) + training pattern deformations (2 decades old) broke the MNIST benchmark record
- The history of feedforward max-pooling convolutional nets (MPCNNs, 1979, 1989, 1999, 2007, 2011, ...)
- How GPU-based MPCNNs (since 2011) have won many contests where feedforward NNs are applicable: image recognition and segmentation, object detection ...
- How NN-based planning robots won the RoboCup in the fast league (2004)
- Deep Reinforcement Learning through Compressed NN Search applied to RNN controllers that learn to process raw video input (2013)
1st Züri Machine Learning Meetup, 25th February 2014, ETH Zurich, Switzerland