
Learning multi-scale temporal dynamics with recurrent neural networks

Published on Mar 07, 2016 · 1945 views

The last three years have seen an explosion of activity studying recurrent neural networks (RNNs), a generalization of feedforward neural networks which can map sequences to sequences. Training RNNs …
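As a minimal sketch of the sequence-to-sequence mapping the abstract describes, the standard (vanilla) RNN update is h_t = tanh(W_xh x_t + W_hh h_{t-1} + b); the weights and sizes below are illustrative assumptions, not details from the talk:

```python
import numpy as np

# Vanilla RNN step: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b).
# Sizes and random weights are placeholders for illustration only.
rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4

W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1
b = np.zeros(hidden_size)

def rnn_step(x, h_prev):
    """One recurrent update: new hidden state from input and previous state."""
    return np.tanh(W_xh @ x + W_hh @ h_prev + b)

# Map an input sequence to a sequence of hidden states.
h = np.zeros(hidden_size)
sequence = [rng.standard_normal(input_size) for _ in range(5)]
states = []
for x in sequence:
    h = rnn_step(x, h)
    states.append(h)

print(len(states), states[0].shape)  # one hidden state per input step
```

Because the same weights are reused at every step, the network can process sequences of any length — the generality the talk highlights, and also the source of the training difficulties (vanishing/exploding gradients) that architectures like the LSTM and Clockwork RNN in the chapter list address.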

Chapter list

00:00 Learning multi-scale temporal dynamics with recurrent neural networks
00:18 Collaborators
00:53 Deep Learning Successes
02:00 Untitled
03:31 Recurrent Neural Networks - 1
03:44 Recurrent Neural Networks - 2
04:04 Recurrent Neural Networks - 3
04:09 Recurrent Neural Networks - 4
04:23 Recurrent Neural Networks - 5
04:41 Recurrent Neural Networks - 6
05:15 RNN (Update)
05:35 RNNs: The Promise
06:15 Generality
07:44 RNNs: The Reality
08:39 Solutions: Architectural - 1
09:00 Solutions: Architectural - 2
09:30 LSTM
12:09 Solutions: Architectural - 3
13:11 Solutions: Architectural - 4
13:39 Solutions: Optimization
15:19 Clockwork RNN (CW-RNN)
16:04 CW-RNN
17:25 CW-RNN (Update)
18:55 CW-RNN (Example)
20:45 CW-RNN: Problems
21:45 RNNs: Shift-invariance
23:40 Dense CW-RNN
25:10 Dense CW-RNN (Example) - 1
25:17 Dense CW-RNN (Example) - 2
26:45 DCW-RNN: Shift-invariance
27:16 Application
28:00 Project Abacus (Google ATAP)
28:59 Constraint: Embedded Processing
29:18 Possible Solutions - 1
29:34 Possible Solutions - 2
29:52 Dynamic Data Representations
30:15 Results: Project Abacus Data - 1
30:48 Results: Project Abacus Data - 2
32:11 Conclusion
33:02 Thank You