NLP and Deep Learning 2: Compositional Deep Learning

Published on Sep 13, 2015 · 8982 Views

Chapter list

00:00 NLP and Deep Learning 2: Compositional Deep Learning
00:05 Compositionality
00:23 Artificial Intelligence...
02:11 We need more than word embeddings!
02:37 Representing Phrases as Vectors
03:17 How should we map phrases into a vector space?
04:29 Equations
04:57 Pictures
05:30 Can we build meaning composition functions in deep learning systems?
05:45 Conjecture
06:17 A “generic” hierarchy on natural language doesn’t make sense?
08:42 Strong priors? Universals of language?
10:27 Universal 18
19:26 What we want
19:44 Experimental Results I
19:45 Where does the tree structure come from?
20:59 Transition-based dependency parsers
22:50 Deep Learning Dependency Parser - 1
23:15 Deep Learning Dependency Parser - 2
24:53 Five attempts at meaning composition
25:43 Tree Recursive Neural Networks
28:01 Version 1: Simple concatenation Tree RNN
29:23 Semantic similarity: nearest neighbors
29:24 Version 1 Limitations
31:29 Version 2: PCFG + Syntactically-Untied RNN
33:32 Experiments
34:36 SU-RNN / CVG - 1
35:44 SU-RNN / CVG - 2
38:13 Version 3: Matrix-vector RNNs - 1
40:36 Classification of Semantic Relationships - 1
42:15 Version 3: Matrix-vector RNNs - 2
43:45 Classification of Semantic Relationships - 2
44:01 Version 4: Recursive Neural Tensor Network
44:13 Beyond the bag of words: Sentiment detection
47:34 Stanford Sentiment Treebank
48:26 Better Dataset Helped All Models
49:39 Version 4: Recursive Neural Tensor Network
50:15 Recursive Neural Tensor Network - 1
50:45 Recursive Neural Tensor Network - 2
50:55 Recursive Neural Tensor Network - 3-4
51:16 Positive/Negative Results on Treebank - 1
51:36 Positive/Negative Results on Treebank - 2
52:09 Experimental Results on Treebank
52:44 Negation Results
54:16 A disappointment
59:56 Deep Recursive Neural Networks for Compositionality in Language
01:01:01 Version 5: Improving Deep Learning Semantic Representations using a TreeLSTM
01:01:48 Long Short-Term Memory (LSTM) Units for Sequential Composition
01:03:44 Tree-Structured Long Short-Term Memory Networks - 1
01:03:55 Tree-Structured Long Short-Term Memory Networks - 2
01:04:39 Tree-structured LSTM - 1
01:04:49 Tree-structured LSTM - 2
01:08:43 Results: Sentiment Analysis: Stanford Sentiment Treebank
01:15:29 Results: Semantic Relatedness SICK 2014
01:15:31 Forget Gates: Selective State Preservation
01:16:25 Tree structure helps
01:24:31 Envoi
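As a rough illustration of the "Version 1: Simple concatenation Tree RNN" named in the chapter list: a parent vector is computed from its two children as p = tanh(W [c1; c2] + b), applied recursively bottom-up over a binary parse tree. This is only a minimal sketch of that idea, not the lecture's exact model; the dimension, initialization, and the toy tree below are all assumptions for illustration.

```python
import numpy as np

np.random.seed(0)
d = 4                                # embedding dimension (assumed for the sketch)
W = np.random.randn(d, 2 * d) * 0.1  # single shared composition matrix
b = np.zeros(d)

def compose(left, right):
    """Version-1-style composition: p = tanh(W [left; right] + b)."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

def encode(tree, embeddings):
    """Recursively encode a binary tree given as nested (left, right) tuples
    with word strings at the leaves; returns one d-dimensional vector."""
    if isinstance(tree, str):
        return embeddings[tree]
    left, right = tree
    return compose(encode(left, embeddings),
                   encode(right, embeddings))

# Toy vocabulary and parse of "the movie was great" (hypothetical example)
embeddings = {w: np.random.randn(d) * 0.1
              for w in ["the", "movie", "was", "great"]}
vec = encode((("the", "movie"), ("was", "great")), embeddings)
```

Because the same `W` composes every pair of nodes regardless of their syntactic categories, the phrase vector lives in the same space as the word vectors; the later chapter-list versions (SU-RNN, matrix-vector RNN, RNTN, TreeLSTM) each relax this single-matrix assumption in a different way.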