Alex Rubinsteyn: Python Libraries for Deep Learning with Sequences

Published on Dec 4, 2015

PyData NYC 2015

Recurrent Neural Networks extend the applicability of deep learning to many problems involving sequential data, such as translation, captioning, summarization, and time-series prediction/classification. This talk will give a brief overview of how Recurrent Neural Networks work before showing you how to create and train them in Python.

Recurrent Neural Networks are a powerful class of statistical models that allow neural networks to deal with sequential data. They have recently become a key building block in deep learning systems achieving state-of-the-art performance on tasks such as captioning, translation, and summarization. This talk will provide a brief introduction to the terminology of recurrent neural networks and then focus on how to create and train them in Python. I will show non-trivial network implementations using the most popular Python deep learning libraries (Keras, Lasagne, Blocks, Chainer) and compare their performance and extensibility.
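
As a taste of what "creating and training" a recurrent network looks like, here is a minimal sketch using Keras, one of the libraries compared in the talk. It is not code from the presentation: the vocabulary size, layer sizes, and random stand-in data are illustrative assumptions, and the other libraries (Lasagne, Blocks, Chainer) would express the same model differently.

import numpy as np
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

vocab_size = 10000  # assumed vocabulary size (illustrative, not from the talk)
seq_len = 50        # assumed fixed sequence length after padding (illustrative)

model = Sequential([
    Embedding(vocab_size, 128),      # map integer token ids to 128-dim vectors
    LSTM(64),                        # recurrent layer summarizing the sequence
    Dense(1, activation="sigmoid"),  # sequence-level binary prediction
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Random stand-in data, only so the example runs end to end.
X = np.random.randint(0, vocab_size, size=(256, seq_len))
y = np.random.randint(0, 2, size=(256, 1))
model.fit(X, y, epochs=2, batch_size=32)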

Slides available here: http://www.slideshare.net/hawflake/py...
