Jürgen Schmidhuber at AGI-2011: Fast Deep/Recurrent Nets for AGI Vision





Uploaded on Oct 17, 2011

The Fourth Conference on Artificial General Intelligence
Mountain View, California, USA
August 3-6, 2011

Jürgen Schmidhuber's short talk on fast deep neural networks at AGI 2011 at Google Headquarters, CA.
Co-authors: Dan Ciresan, Ueli Meier, Jonathan Masci, Alex Graves.

The deep / recurrent neural networks of Schmidhuber's team keep winning important visual pattern recognition competitions, and are starting to achieve human-competitive results:

9. August 2011: IJCNN 2011 on-site Traffic Sign Recognition Competition (0.56% error rate, nearly three times better than the 2nd-best algorithm; the only method outperforming humans)
8. June 2011: ICDAR 2011 offline Chinese Handwriting Recognition Competition (1st & 2nd rank)
7. MNIST Handwritten Digit Recognition Benchmark (perhaps the most famous machine learning benchmark). New record (0.35% error rate) in 2010, improved to 0.31% in March 2011, then 0.27% for ICDAR 2011
6. NORB Object Recognition Benchmark. New record (2.53% error rate) in 2011
5. CIFAR-10 Object Recognition Benchmark. New records in 2011, now down to 12% error rate
4. January 2011: Online German Traffic Sign Recognition Contest (1st & 2nd rank; 1.02% error rate)
3. ICDAR 2009 Arabic Connected Handwriting Competition, won, like the two competitions below, by LSTM recurrent nets (deep by nature).
2. ICDAR 2009 Handwritten Farsi/Arabic Character Recognition Competition
1. ICDAR 2009 French Connected Handwriting Competition based on data from the RIMES campaign
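The handwriting competitions above were won by LSTM recurrent nets, whose gated cell state lets them carry information across long sequences. As a minimal sketch (not the authors' GPU implementation), here is one update step of a scalar LSTM cell; the weight values are purely illustrative:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, w):
    """One update of a toy scalar LSTM cell.

    w maps each gate name to (input_weight, recurrent_weight, bias);
    all weight values here are illustrative, not trained.
    """
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])    # input gate
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])    # forget gate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])    # output gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2])  # candidate value
    c = f * c_prev + i * g          # cell state: long-term memory path
    h = o * math.tanh(c)            # hidden state: the cell's output
    return h, c

# Run a short input sequence through the cell.
w = {k: (0.5, 0.5, 0.0) for k in ("i", "f", "o", "g")}
h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:
    h, c = lstm_step(x, h, c, w)
```

The additive update of the cell state c is what lets gradients flow over many time steps without vanishing, which is why these nets handle long connected-handwriting strokes well.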

Overview sites with more information and scientific papers:

Computer vision with fast deep / recurrent neural networks:
Handwriting recognition with fast deep / recurrent neural nets:
Formal Theory of Fun & Creativity & Intrinsic Motivation:
Artificial curiosity - how to build artificial scientists and artists:
Optimal Universal Artificial Intelligence:
Self-referential Gödel Machines as universal problem solvers:
Artificial Evolution:
Unsupervised Learning:
Hierarchical Learning:
Reinforcement Learning:
Robot Learning:
Source code of machine learning algorithms at Pybrain:
Home page:
What's new:

