Should Model Architecture Reflect Linguistic Structure?
Published on May 27, 2016 · 7,930 views
Sequential recurrent neural networks (RNNs) over finite alphabets are remarkably effective models of natural language. RNNs now obtain language modeling results that substantially improve over long-standing state-of-the-art baselines.
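To ground the abstract, here is a minimal sketch of the kind of model it describes: a sequential RNN language model over a finite alphabet of characters. This is illustrative only, not the speaker's code; the PyTorch framing, hyperparameters, and toy corpus are all assumptions.

```python
# Minimal character-level RNN language model sketch (illustrative, not from the talk).
import torch
import torch.nn as nn

class CharRNNLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, state=None):
        # x: (batch, time) integer character ids
        h, state = self.rnn(self.embed(x), state)
        return self.out(h), state  # logits over the next character

# Toy usage: fit next-character prediction on a tiny corpus.
text = "language is hierarchical "
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
ids = torch.tensor([[stoi[c] for c in text]])

model = CharRNNLM(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()
for step in range(100):
    logits, _ = model(ids[:, :-1])  # predict each next character
    loss = loss_fn(logits.reshape(-1, len(chars)), ids[:, 1:].reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the alphabet is finite and small, such a model never encounters an out-of-vocabulary symbol, which is the property the open-vocabulary chapters below build on.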
Chapter list
Should Neural Network Architecture Reflect Linguistic Structure? (00:00)
Learning language - 1 (00:34)
Learning language - 2 (01:48)
Learning language - 3 (02:24)
Learning language - 4 (02:34)
Learning language - 5 (02:57)
Learning language - 6 (03:38)
Compositional words - 1 (05:25)
Compositional words - 2 (06:17)
Example: Dependency parsing (07:16)
Dependency parsing CharLSTM > Word Lookup - 1 (08:43)
Dependency parsing CharLSTM > Word Lookup - 2 (09:52)
Language modeling: Word similarities (11:57)
Character vs. word modeling: Summary (13:20)
Structure-aware words - 1 (13:54)
Structure-aware words - 2 (14:02)
Open Vocabulary LMs (14:37)
Open Vocabulary LMs: Turkish morphology (15:06)
Input word representation (16:50)
Open Vocabulary LM (17:25)
Character vs. word modeling: Summary (17:54)
Modeling syntax (18:57)
Language is hierarchical (20:36)
One theory of hierarchy (24:15)
Terminals, Stack, Action (26:28; see the RNNG sketch after this chapter list)
Syntactic Composition - 1 (28:31)
Syntactic Composition - 2 (29:42)
Recursion - 1 (30:11)
Recursion - 2 (30:15)
Syntactic Composition - 3 (30:28)
Implementing RNNGs: Parameter Estimation (30:42)
Implementing RNNGs: Inference (31:51)
English PTB (Parsing) (32:47)
English PTB (LM) and Chinese CTB (LM) (33:29)
This Talk, In a Nutshell (33:59)
Thanks (34:23)
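The "Terminals, Stack, Action" and "Syntactic Composition" chapters cover recurrent neural network grammars (RNNGs), which build a parse with a stack and a small action inventory: NT(X) opens a nonterminal, GEN(w) generates a terminal, and REDUCE pops the finished children and composes them into a single constituent. The sketch below implements only that non-neural stack discipline, under assumed data structures; in the actual model, REDUCE applies a learned composition function (a bidirectional LSTM over the child representations) rather than building a tuple.

```python
# Stack/action discipline behind RNNGs (sketch; data structures are assumptions).
def run_rnng_actions(actions):
    stack = []
    for act in actions:
        if act.startswith("NT("):
            stack.append(("OPEN", act[3:-1]))   # open a nonterminal, e.g. NT(NP)
        elif act.startswith("GEN("):
            stack.append(act[4:-1])             # generate a terminal word
        elif act == "REDUCE":
            children = []
            while not (isinstance(stack[-1], tuple) and stack[-1][0] == "OPEN"):
                children.append(stack.pop())
            label = stack.pop()[1]
            # The neural model would compose the children into one vector here.
            stack.append((label, list(reversed(children))))
    return stack

# One plausible action sequence for "The hungry cat meows" (illustrative):
tree = run_rnng_actions([
    "NT(S)", "NT(NP)", "GEN(The)", "GEN(hungry)", "GEN(cat)", "REDUCE",
    "NT(VP)", "GEN(meows)", "REDUCE", "REDUCE",
])
print(tree)  # [('S', [('NP', ['The', 'hungry', 'cat']), ('VP', ['meows'])])]
```

Recursion falls out of this discipline for free: a REDUCE can leave a completed constituent on the stack as a child of a still-open nonterminal, exactly as in the nested output above.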