Deep NLP Recurrent Neural Networks

Published on Sep 13, 2015 (11,793 views)

Chapter list

Overview: Today (00:00)
Language Models (00:53)
Recurrent Neural Networks! (10:10)
Recurrent Neural Network language model - 2 (12:13)
Recurrent Neural Network language model - 3 (12:13)
Objective function for language models (12:13)
Why is the vanishing gradient a problem? (15:33)
The vanishing gradient problem for language models (15:51)
Trick for exploding gradient: clipping trick (16:54; sketched after this list)
Gradient clipping intuition (17:34)
For vanishing gradients: Initialization + ReLUs (20:47)
Perplexity Results (25:54)
Problem: Softmax is huge and slow (26:19)
Our last implementation trick (27:38)
Training RNNs is hard (27:45)
The vanishing/exploding gradient problem (27:54)
Recurrent Neural Network language model - 1 (28:57)
Sequence modeling for other tasks (30:14)
Opinion Mining with Deep Recurrent Nets (31:37)
Example Annotation (32:16)
Approach: Recurrent Neural Network (36:33)
Bidirectional RNNs (36:52)
Deep Bidirectional RNNs (36:57)
Data (41:10)
Evaluation (41:29)
Machine Translation (MT) (43:27)
Deep learning to the rescue!? (44:24)
MT with RNNs - Simplest Model (45:14)
RNN Translation Model Extensions - 1 (46:24)
Different picture, same idea (48:06)
Main Improvement: Better Units (51:08)
GRUs - 1 (56:27)
GRUs - 2 (56:32)
Attempt at a clean illustration (01:00:26)
GRU intuition - 1 (01:01:26)
GRU intuition - 2 (01:04:01)
RNN Translation Model Extensions - 3 (01:09:42)
Long Short-Term Memories (LSTMs) (01:10:54)
Illustrations a bit overwhelming (01:12:56)
LSTMs are currently very hip (01:13:54)
Deep LSTMs don't outperform traditional MT yet (01:14:42)
RNN Translation Model Extensions - 2 (01:15:48)
Deep LSTM for Machine Translation (01:16:05)
Further Improvements: More Gates! (01:19:36)
Summary (01:20:46)
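
The "clipping trick" for exploding gradients named in the chapter list rescales the gradient whenever its norm exceeds a fixed threshold, keeping its direction but capping its magnitude. As a rough illustration only (not code from the lecture, and the threshold value of 5 is an arbitrary assumption), a minimal NumPy sketch of gradient-norm clipping could look like this:

    import numpy as np

    def clip_gradient(grad, threshold=5.0):
        # Gradient-norm clipping: if the L2 norm of the gradient exceeds
        # `threshold`, rescale it so the direction is preserved but the
        # magnitude is capped at `threshold`. (Illustrative sketch; the
        # threshold value is an assumption, not from the lecture.)
        norm = np.linalg.norm(grad)
        if norm > threshold:
            grad = grad * (threshold / norm)
        return grad

    # Example: a gradient with L2 norm 50 is scaled back to norm 5.
    g = np.full(100, 5.0)
    print(np.linalg.norm(clip_gradient(g)))  # prints approximately 5.0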