Recurrent Neural Networks

author: Yoshua Bengio, Department of Computer Science and Operations Research, University of Montreal
published: Aug. 23, 2016,   recorded: August 2016,   views: 42276



This lecture will cover recurrent neural networks, the key ingredient in the deep learning toolbox for handling sequential computation and modelling sequences. It will start by explaining how gradients can be computed (by considering the time-unfolded graph) and how different architectures can be designed to summarize a sequence, generate a sequence by ancestral sampling in a fully-observed directed model, or learn to map a vector to a sequence, a sequence to a sequence (of the same or different length), or a sequence to a vector. The issue of long-term dependencies, why it arises, and what has been proposed to alleviate it will be a core subject of discussion in this lecture. This includes changes in the architecture and initialization, as well as how to properly characterize the architecture in terms of recurrent or feedforward depth and its ability to create shortcuts or fast propagation of gradients in the unfolded graph. Open questions regarding the limitations of training by maximum likelihood (teacher forcing) and ideas towards making learning online (not requiring backprop through time) will also be discussed.
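The two ideas the abstract opens with — computing gradients on the time-unfolded graph, and the shrinking gradients behind the long-term dependency problem — can be sketched in a few lines of NumPy. This is an illustrative example, not material from the lecture: the network sizes, weight scale, and the toy loss (squared norm of the final hidden state) are all assumptions chosen to make the effect visible.

```python
import numpy as np

# Vanilla RNN: h_t = tanh(W_xh x_t + W_hh h_{t-1}).
# Toy loss: L = 0.5 * ||h_T||^2 (illustrative choice).
rng = np.random.default_rng(0)
T, n_in, n_h = 20, 3, 5
W_xh = rng.normal(scale=0.3, size=(n_h, n_in))
W_hh = rng.normal(scale=0.3, size=(n_h, n_h))
xs = rng.normal(size=(T, n_in))

# Forward pass over the time-unfolded graph.
hs = [np.zeros(n_h)]
for t in range(T):
    hs.append(np.tanh(W_xh @ xs[t] + W_hh @ hs[-1]))

# Backward pass (backprop through time): push dL/dh_t back step by step.
dW_hh = np.zeros_like(W_hh)
dh = hs[-1]                            # dL/dh_T for L = 0.5 * ||h_T||^2
grad_norms = []
for t in reversed(range(T)):
    da = dh * (1.0 - hs[t + 1] ** 2)   # backprop through tanh
    dW_hh += np.outer(da, hs[t])       # recurrent weights get a term per step
    dh = W_hh.T @ da                   # gradient flowing back to h_{t-1}
    grad_norms.append(np.linalg.norm(dh))

# With small recurrent weights, the gradient norm typically shrinks as it
# travels back in time -- the vanishing-gradient face of the long-term
# dependency problem the lecture discusses.
print(grad_norms[0], grad_norms[-1])
```

Repeated multiplication by `W_hh.T` and the tanh derivative (both contractions here) is exactly why gradients from distant time steps contribute almost nothing, motivating the architectural fixes (gated units, shortcuts in the unfolded graph) mentioned in the abstract.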

See Also:

Download slides: deeplearning2016_bengio_neural_networks_01.pdf (23.4 MB)


Reviews and comments:

Comment 1, Kamal Ali, August 28, 2016 at 1:20 a.m.:

Prof Bengio,
next year please ask the audiovisual people to point the camera at all the great stuff on your slides rather than slavishly videotaping you! I would have loved to have seen the actual slides. Or maybe a split view, where you are in an inset and the slides are also visible.
