Theoretical neuroscience and deep learning theory

Published on Aug 23, 2016 · 9,497 views

Both neuroscience and machine learning are experiencing a renaissance in which fundamental technological changes are driving qualitatively new phases of conceptual progress. In neuroscience, new methods…

Chapter list

00:00 Theoretical Neuroscience and Deep Learning Theory
00:15 Theoretical neuroscience in the disciplinary landscape
01:25 Neural circuits and behavior: theory, computation and experiment
08:49 Statistical mechanics of high dimensional data analysis/1
13:16 Statistical mechanics of high dimensional data analysis/2
13:40 Motivations for an alliance between theoretical neuroscience and theoretical machine learning
16:32 The shape of things to come… on monkeys and models/1
19:23 The shape of things to come… on monkeys and models/2
21:28 Some of the theoretical puzzles of deep learning
23:25 A Mathematical Theory of Semantic Development
23:53 What is “semantic cognition”?
24:50 Psychophysical tasks that probe semantic cognition
26:49 The project that really keeps me up at night
27:16 Semantic Cognition Phenomena
28:08 A Network for Semantic Cognition
30:23 Categorical representations in human and monkey/2
30:42 Categorical representations in human and monkey/1
30:57 Evolution of internal representations
31:04 Evolution of internal representations
31:20 Theoretical questions
32:57 Learning dynamics/1
34:58 Decomposing input-output correlations
35:08 Analytical learning trajectory
36:26 Learning dynamics/2
36:45 Problem formulation
37:31 Origin of the rapid learning transition: saddle point dynamics in synaptic weight space
38:19 Take home messages
38:46 Learning hierarchical structure
39:16 Connecting hierarchical generative models and neural network learning
40:11 A hierarchical branching diffusion process
42:35 Singular values
42:53 Progressive differentiation/1
43:19 Progressive differentiation/2
43:31 Progressive differentiation/3
43:51 Conclusion
44:46 Other work
45:21 Why are some properties distinctive, or learned faster?
45:25 Why are some items more typical members of a category?
45:27 How is inductive generalization achieved by neural networks? Inferring familiar properties of a novel item.
45:29 How is inductive generalization achieved by neural networks? Inferring which familiar objects have a novel property.
45:31 What is a category and what makes it “coherent”?/1
46:28 What is a category and what makes it “coherent”?/2
47:25 What is a category and what makes it “coherent”?/4
47:31 What is a category and what makes it “coherent”?/5
52:16 What is a category and what makes it “coherent”?/3
54:12 Object analyzer vectors
54:24 What is a category and what makes it “coherent”?/6
54:46 What is a category and what makes it “coherent”?/7
55:54 What is a category and what makes it “coherent”?/8
56:01 Towards a theory of deep learning dynamics
57:10 Nontrivial learning dynamics
57:25 Depth-independent training time
58:15 Random vs orthogonal
59:02 Extensive Criticality yields Dynamical Isometry in nonlinear nets
59:51 Dynamic Isometry Initialization
01:00:17 Some of the theoretical puzzles of deep learning
01:00:26 High dimensional nonconvex optimization
01:05:05 General properties of error landscapes in high dimensions
01:05:10 Properties of Error Landscapes on the Synaptic Weight Space of a Deep Neural Net
01:05:39 How to descend saddle points
01:08:30 Performance of saddle free Newton in learning deep neural networks
01:08:48 Some of the theoretical puzzles of deep learning
01:09:28 A theory of deep neural expressivity through transient chaos
01:09:41 Seminal works on the expressive power of depth/1
01:10:52 Seminal works on the expressive power of depth/2
01:11:19 Seminal works on the expressive power of depth/3
01:11:56 Questions
01:12:55 Limitations of prior work
01:13:38 Another perspective on the advantage of depth: disentangling
01:15:53 A maximum entropy ensemble of deep random networks
01:17:15 Emergent, deterministic signal propagation in random neural networks
01:17:37 Propagation of a manifold through a deep network/2
01:17:48 Propagation of a manifold through a deep network/1
01:19:17 Riemannian geometry I: Euclidean length
01:20:44 Riemannian geometry II: Extrinsic Gaussian Curvature
01:21:17 Theory of curvature propagation in deep networks
01:22:39 Curvature propagation: theory and experiment
01:23:07 Propagation of a manifold through a deep network/3
01:23:32 Boundary disentangling: theory
01:24:50 Boundary disentangling: experiment
01:25:12 References
01:25:22 There are more things in heaven and earth…/1
01:28:04 There are more things in heaven and earth…/2
01:29:57 There are more things in heaven and earth…/3
01:31:25 Memory capacity with scalar analog synapses
01:31:43 A frontier beyond whose bourn no curve can cross
01:32:09 The dividends of understanding synaptic complexity
01:32:15 A potential route to cognitive enhancement?
01:32:17 Acknowledgements
01:32:20 Summary - Thanks!