An Overview of Deep Learning and Its Challenges for Technical Computing
Published on Oct 13, 2014 · 7584 Views
Chapter list
An overview of deep learning (and its challenges for technical computing) (00:00)
Roadmap - 1 (00:40)
Roadmap - 2 (00:41)
Roadmap - 3 (01:17)
Roadmap - 4 (01:27)
Roadmap - 5 (01:41)
Learning Representations (01:58)
Deep Learning (03:00)
Deep Learning Timeline - 1 (05:01)
Deep Learning Timeline - 2 (05:09)
Deep Learning Timeline - 3 (05:53)
Deep Learning Timeline - 4 (06:23)
Deep Learning Timeline - 5 (08:25)
Example: Visual recognition - 1 (08:59)
Example: Visual recognition - 2 (09:06)
Example: Visual recognition - 3 (09:09)
Example: Visual recognition - 4 (09:27)
Feature Engineering (09:55)
What Limits Performance? (10:11)
Mid-level Representations (12:11)
Learning a Feature Hierarchy - 1 (13:24)
Learning a Feature Hierarchy - 2 (14:13)
Feature Hierarchies. So what? (15:33)
Feature Learning Paradigms (17:09)
Neural Networks (Introduction) (17:40)
Neural Networks for Supervised Learning (18:41)
Forward Propagation - 1 (19:52)
Forward Propagation - 2 (20:45)
Forward Propagation - 3 (21:06)
Alternative Graphical Representations (21:19)
How Good is a Network? (22:21)
Training (24:21)
Learning by Perturbing Weights (25:01)
The Idea behind Backpropagation (26:44)
Derivative w.r.t. Input of Softmax (28:09)
Backward Propagation - 2 (29:27)
Backward Propagation - 1 (29:44)
Backward Propagation - 3 (29:47)
Technical Challenge: Composition (30:05)
Tools for Building Neural Networks - 1 (30:49)
Tools for Building Neural Networks - 2 (31:25)
Caffe Example (32:04)
Caffe: Each layer defines… - 1 (32:54)
Caffe: Each layer defines… - 2 (33:01)
Caffe: Definition of a Net (33:32)
What about big nets? (34:45)
Technical Challenge: Computing Gradients (35:40)
Theano: teaser (40:40)
Technical Challenge: Optimization (40:41)
Technical Challenge: Hyperparameter Optimization (41:46)
Hyperparameter Optimization (42:07)
Hyperopt (42:48)
Spearmint (44:21)
Fully-Connected Layer (45:03)
Locally-Connected Layer - 1 (46:17)
Locally-Connected Layer - 2 (46:59)
Convolutional Layer - 1 (47:20)
Convolutional Layer - 2 (48:07)
Convolutional Layer - 3 (48:54)
Convolutional Layer - 4 (49:27)
Convolutional Layer - 5 (50:10)
Convolutional Layer - 6 (50:26)
Convolutional Net - Recap (51:04)
Pooling Layer - 1 (51:46)
Pooling Layer - 2 (52:26)
Types of Pooling (52:41)
Local Contrast Normalization - 1 (53:11)
Local Contrast Normalization - 2 (53:46)
Local Contrast Normalization - 3 (53:54)
Convnets: Single Stage (54:29)
Convnets: Typical Architecture (55:16)
Convnets: Training (56:04)
Convnets: Testing (56:39)
Convnets: today (57:11)
Technical Challenge: Scalability (58:38)
Convnets: why so successful now? - 1 (59:38)
Convnets: why so successful now? - 2 (01:00:26)
Tools: Scalability (01:00:44)
Motivation (01:02:46)
An Interesting Historical Fact (01:03:49)
Why Unsupervised Learning? - 1 (01:04:52)
Why Unsupervised Learning? - 2 (01:05:27)
Why Unsupervised Learning? - 3 (01:06:01)
Why Unsupervised Learning? - 4 (01:07:52)
Why Unsupervised Learning? - 5 (01:09:08)
Supervised Learning of Representations (01:10:03)
Unsupervised Learning of Representations - 1 (01:10:20)
Unsupervised Learning of Representations - 2 (01:10:38)
Principal Components Analysis (01:12:18)
An inefficient way to fit PCA (01:13:41)
Why fit PCA inefficiently? (01:14:55)
Auto-encoder - 1 (01:16:09)
Auto-encoder - 2 (01:16:30)
Regularized Auto-encoders (01:16:32)
Simple? (01:16:46)
Sparse Auto-encoders (01:17:44)
Denoising Auto-encoders (01:18:43)
Contractive Auto-encoders (01:19:16)
Stacking to Build Deep Models (01:20:12)
Stacking RBMs: Procedure - 1 (01:20:53)
Stacking RBMs: Procedure - 2 (01:21:09)
Stacking RBMs: Procedure - 3 (01:21:27)
Stacking RBMs: Procedure - 4 (01:21:32)
Deep Belief Networks (01:21:52)
Stacking RBMs: Intuition (01:23:00)
Conclusions and Challenges (01:24:24)