
Memory, Reading, and Comprehension

Published on Sep 13, 2015 (7,218 views)


Chapter list

Memory, Reading, and Comprehension (00:00)
Deep Learning and NLP: Question Answer Selection - 1 (04:33)
Deep Learning and NLP: Question Answer Selection - 2 (06:18)
Deep Learning for NLP: Machine Translation - 1 (06:35)
Deep Learning for NLP: Machine Translation - 2 (06:50)
NLP at Google DeepMind (07:46)
Outline - 1 (10:04)
Transduction and RNNs - 1 (10:32)
Transduction and RNNs - 2 (12:42)
Transduction and RNNs - 3 (13:32)
Transduction and RNNs - 4 (13:58)
Learning to Execute (14:18)
Unbounded Neural Memory (14:50)
Example: A Continuous Stack - 1 (16:40)
Example: A Continuous Stack - 2 (17:43)
Example: A Continuous Stack - 3 (18:31)
Example: A Continuous Stack - 4 (19:45)
Example: Controlling a Neural Stack (20:28)
Synthetic Transduction Tasks (21:23)
Synthetic ITG Transduction Tasks (22:48)
Coarse and Fine Grained Accuracy (24:38)
Results (25:04)
Rapid Convergence (27:51)
Outline (29:18)
Supervised Reading Comprehension (29:59)
Supervised Reading Comprehension: MCTest (33:09)
Supervised Reading Comprehension: FB Synthetic (34:30)
Supervised Reading Comprehension - 1 (36:04)
Supervised Reading Comprehension - 2 (39:12)
Supervised Reading Comprehension - 3 (40:04)
Supervised Reading Comprehension - 4 (41:03)
Supervised Reading Comprehension - 5 (41:57)
Data Set Statistics (44:28)
Question difficulty (48:22)
Frequency baselines (Accuracy) (52:25)
Frame semantic matching - 1 (53:36)
Frame semantic matching - 2 (55:31)
Word distance benchmark - 1 (56:17)
Word distance benchmark - 2 (57:18)
Word distance benchmark - 3 (57:50)
Reading via Encoding (58:18)
Deep LSTM Reader - 1 (59:26)
Deep LSTM Reader - 2 (01:00:56)
Deep LSTM Reader - 3 (01:01:13)
Deep LSTM Reader - 4 (01:02:22)
The Attentive Reader - 1 (01:05:26)
The Attentive Reader - 2 (01:05:46)
The Attentive Reader - 3 (01:05:47)
Attentive Reader Training (01:06:31)
The Attentive Reader: Predicted: ent49, Correct: ent49 (01:11:39)
The Attentive Reader: Predicted: ent27, Correct: ent27 (01:13:39)
The Attentive Reader: Predicted: ent85, Correct: ent37 (01:13:46)
The Attentive Reader: Predicted: ent24, Correct: ent2 (01:14:56)
The Impatient Reader - 1 (01:15:30)
The Impatient Reader - 2 (01:16:29)
The Impatient Reader - 3 (01:16:47)
Attention Models Precision@Recall (01:17:25)
Conclusion (01:20:13)
Google DeepMind and Oxford University (01:22:30)