Learning Structural Correspondences Across Different Linguistic Domains with Synchronous Neural Language Models

Published on Jan 11, 2013 · 3048 views

We introduce a novel framework for learning structural correspondences between two linguistic domains, based on training synchronous neural language models with co-regularization on both domains simultaneously.
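To make the idea concrete, here is a minimal sketch of one plausible form of such an objective: two log-bilinear language-model losses (one per domain) plus a co-regularizer that pulls the embeddings of corresponding words together. The specific log-bilinear parameterization, the squared-distance penalty, the weight `lam`, and all names in the code are illustrative assumptions, not details taken from the talk.

```python
# Hypothetical sketch of a co-regularized objective over two log-bilinear
# language models, one per domain. Forms and names are assumptions.
import numpy as np

rng = np.random.default_rng(0)
V, D, CTX = 50, 16, 2  # vocab size, embedding dim, context length


def init_lbl(vocab, dim, ctx):
    """One log-bilinear LM: word embeddings R, context matrices C_i, biases b."""
    return {
        "R": rng.normal(scale=0.1, size=(vocab, dim)),
        "C": rng.normal(scale=0.1, size=(ctx, dim, dim)),
        "b": np.zeros(vocab),
    }


def lbl_nll(model, context_ids, target_id):
    """Negative log-likelihood of one target word given its context."""
    # Predicted representation: sum of linearly transformed context embeddings.
    r_hat = sum(model["C"][i] @ model["R"][w] for i, w in enumerate(context_ids))
    scores = model["R"] @ r_hat + model["b"]  # score for every vocabulary word
    scores -= scores.max()                    # numerical stability
    log_probs = scores - np.log(np.exp(scores).sum())
    return -log_probs[target_id]


def augmented_cost(model_a, model_b, data_a, data_b, aligned_pairs, lam=1.0):
    """LM loss on both domains plus a co-regularizer tying the embeddings of
    aligned word pairs (w_a, w_b); a squared-distance penalty is assumed."""
    loss = sum(lbl_nll(model_a, ctx, tgt) for ctx, tgt in data_a)
    loss += sum(lbl_nll(model_b, ctx, tgt) for ctx, tgt in data_b)
    for wa, wb in aligned_pairs:
        diff = model_a["R"][wa] - model_b["R"][wb]
        loss += lam * diff @ diff
    return loss


# Toy usage: random "sentences" in two domains and a few aligned word pairs.
model_a, model_b = init_lbl(V, D, CTX), init_lbl(V, D, CTX)
data_a = [(rng.integers(V, size=CTX), rng.integers(V)) for _ in range(5)]
data_b = [(rng.integers(V, size=CTX), rng.integers(V)) for _ in range(5)]
pairs = [(3, 7), (10, 10)]
print("augmented cost:", augmented_cost(model_a, model_b, data_a, data_b, pairs))
```

Minimizing a cost of this shape over both corpora at once is what would tie the two embedding spaces together; the talk's "Augmented cost function" and "Constraining the Embeddings" slides cover the actual formulation.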


Chapter list

Learning Structural Correspondences Using Synchronous Neural Language Models - 00:00
Transfer across domains.. (1) - 00:16
Transfer across domains.. (2) - 00:49
Domain adaptation (1) - 00:59
Domain adaptation (2) - 01:20
Domain adaptation (3) - 01:35
Part-of-Speech Tagging (1) - 01:46
Part-of-Speech Tagging (2) - 02:27
Part-of-Speech Tagging (3) - 02:29
Part-of-Speech Tagging (4) - 02:37
Part-of-Speech Tagging (5) - 02:38
Part-of-Speech Tagging (6) - 02:41
Structural Correspondence Learning - 03:00
This Work: Deep Structural Correspondence Learning (1) - 03:41
This Work: Deep Structural Correspondence Learning (2) - 04:27
This Work: Deep Structural Correspondence Learning (3) - 04:39
This Work: Deep Structural Correspondence Learning (4) - 05:01
The Log Bilinear NLM - 05:39
Augmented cost function (1) - 06:31
Augmented cost function (2) - 06:48
Augmented cost function (3) - 06:59
Constraining the Embeddings - 07:01
Augmented cost function - 07:22
Constraining the learned Functions - 07:35
Experiments I: Synthetic Data (1) - 07:44
Experiments I: Synthetic Data (2) - 08:23
Experiments II: English-French (1) - 09:27
Experiments II: English-French (2) - 10:44
Experiments II: English-French (3) - 11:17
Conclusion - 11:36
Thank you! - 12:29