Distributional semantics and topic modeling: theory and application
Published on Nov 18, 2019 · 28 views
In recent years, methods of text analysis based on the paradigm of Distributional Semantics have become hugely popular, not only in the Digital Humanities. This workshop will first introduce the participants …
Chapter list
Distributional Semantics and Topic Modeling: Theory and Application (00:00)
About this workshop - 1 (00:25)
About this workshop - 2 (01:20)
About this workshop - 3 (01:53)
About this workshop - 4 (02:36)
About myself (03:03)
About you - 1 (04:16)
About you - 2 (04:36)
About you - 3 (04:43)
About you - 4 (04:52)
About you - 5 (05:02)
About you - 6 (05:08)
About you - 7 (05:18)
About you - 8 (05:32)
Overview (08:22)
Distributional Semantics: Principles and Methods (10:58)
Basic intuition about distributional semantics - 1 (11:01)
Basic intuition about distributional semantics - 2 (11:32)
Basic intuition about distributional semantics - 3 (11:36)
Basic intuition about distributional semantics - 4 (11:44)
Basic intuition about distributional semantics - 5 (11:56)
What does this example tell us? - 1 (13:10)
Basic intuition about distributional semantics - 6 (13:34)
What does this example tell us? (14:32)
Basic idea - 1 (14:40)
Basic idea - 2 (15:26)
Two applications of this idea (15:30)
What are Word Embeddings? (15:37)
Information Retrieval: Vector Space Model - 1 (15:59)
Information Retrieval: Vector Space Model - 2 (16:20)
Information Retrieval: Vector Space Model - 3 (16:54)
Words in vector space (19:35)
Example: French Wikipedia model (21:31)
Similar Words Query (23:44)
Similarity Query (25:32)
Evaluation - 1 (26:40)
Evaluation - 2 (26:51)
Evaluation - 3 (27:04)
Evaluation - 4 (27:56)
Axes of meaning - 2 (29:04)
Axes of meaning - 3 (29:18)
Axes of meaning - 1 (30:19)
Axis query (30:30)
Time for questions! - 1 (31:34)
What is Topic modeling? (35:08)
Topic modeling: basic idea - 1 (35:11)
Topic modeling: basic idea - 2 (35:35)
Topic modeling: basic idea - 3 (36:43)
Topic modeling: basic idea - 4 (38:32)
Usage scenarios - 1 (38:53)
Usage scenarios - 2 (39:04)
Usage scenarios - 3 (39:12)
Usage scenarios - 4 (39:30)
Usage scenarios - 5 (39:52)
Explorative Visualization - 1 (40:16)
Explorative Visualization - 3 (41:18)
Explorative Visualization - 4 (41:20)
Explorative Visualization - 5 (41:56)
Explorative Visualization - 6 (42:30)
Explorative Visualization - 7 (44:38)
Explorative Visualization - 8 (44:44)
Explorative Visualization - 2 (45:01)
Existing Studies - 1 (45:28)
Existing Studies - 2 (46:08)
Existing Studies - 3 (46:14)
Existing Studies - 4 (46:21)
Existing Studies - 5 (46:54)
Existing Studies - 6 (47:02)
Existing Studies - 7 (47:19)
A topic model for French crime fiction (47:24)
Text collection (47:31)
Crime fiction - 1 (48:04)
Crime fiction - 2 (48:08)
Crime fiction - 3 (48:15)
Crime fiction - 4 (48:23)
Crime fiction - 5 (48:39)
Crime fiction - 6 (48:45)
Topic and subgenre - 1 (50:52)
Topic and subgenre - 2 (51:24)
Topic and subgenre - 3 (51:37)
Topic and subgenre - 4 (52:50)
Topic over text segments - 2 (55:16)
Overall results - 1 (56:18)
Overall results - 2 (56:26)
Overall results - 3 (56:34)
Overall results - 4 (58:24)
Topic over text segments - 1 (01:00:41)
Time for questions! - 2 (01:01:20)
Topic Modeling: Theory (01:03:03)
What does a topic model look like? (01:03:08)
On a practical level - 1 (01:03:14)
On a practical level - 2 (01:04:04)
On a practical level - 3 (01:04:22)
On a practical level - 4 (01:05:04)
On a practical level - 5 (01:05:15)
On a practical level - 6 (01:05:44)
On a practical level - 7 (01:06:57)
Dirichlet distributions (01:08:04)
Words in topic distribution (01:12:20)
Topics in document distribution (01:13:14)
How is a Topic Model created? (01:13:51)
Some relevant ideas - 1 (01:13:57)
Some relevant ideas - 2 (01:14:11)
Some relevant ideas - 3 (01:14:37)
Some relevant ideas - 4 (01:15:21)
Some relevant ideas - 5 (01:15:25)
Some relevant ideas - 6 (01:15:46)
Some relevant ideas - 7 (01:15:47)
Generative, inverted, iterative (01:15:58)
Inference problem: observed data (01:17:15)
Inferred, latent model (01:17:19)
The starting point of LDA - 1 (01:20:15)
The starting point of LDA - 2 (01:20:40)
The starting point of LDA - 3 (01:20:58)
The starting point of LDA - 4 (01:22:05)
The generative model behind LDA (01:22:07)
Random initialization - 1 (01:22:11)
Random initialization - 2 (01:22:17)
Random initialization - 3 (01:22:21)
Random initialization - 4 (01:22:44)
Inference: iterative approximation - 1 (01:24:08)
Inference: iterative approximation - 2 (01:24:22)
Inference: iterative approximation - 3 (01:24:28)
How it works exactly, clearly explained (01:25:00)
Time for questions - 3 (01:26:10)
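The embedding demos named in the chapter list above (Example: French Wikipedia model, Similar Words Query, Similarity Query, Axis query) are not reproduced on this page, but queries of that kind can be approximated in a few lines of gensim. This is a minimal sketch, not the presenter's actual code: the model file frwiki_vectors.kv is a placeholder, and the axis query is implemented here simply as a difference of cosine similarities between two pole words.

```python
# Sketch of the embedding queries shown in the workshop, assuming a
# pre-trained French word-vector model saved as gensim KeyedVectors.
from gensim.models import KeyedVectors

wv = KeyedVectors.load("frwiki_vectors.kv")  # placeholder path, not the workshop's file

# Similar Words Query: nearest neighbours of a single word
print(wv.most_similar("roman", topn=10))

# Similarity Query: cosine similarity between two words
print(wv.similarity("roman", "poème"))

# Axis query (one possible implementation): project words onto an
# "axis of meaning" spanned by two pole words by comparing similarities.
axis_pos, axis_neg = "homme", "femme"
for word in ["roi", "reine", "acteur", "actrice"]:
    score = wv.similarity(word, axis_pos) - wv.similarity(word, axis_neg)
    print(f"{word}: {score:+.3f}")
```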
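The topic-modeling chapters (On a practical level, Dirichlet distributions, Words in topic distribution, Topics in document distribution, The generative model behind LDA) can likewise be illustrated with a small LDA example. Again a hedged sketch: the workshop's own corpus is French crime fiction and its tooling is not specified on this page, so the toy documents and the use of gensim's LdaModel below are assumptions, chosen only to show the two distributions a trained model exposes.

```python
# Sketch of training and inspecting an LDA topic model with gensim.
from gensim import corpora
from gensim.models import LdaModel

# Toy corpus (illustrative only, not the workshop's crime-fiction collection)
docs = [
    ["detective", "murder", "police", "night", "city"],
    ["love", "letter", "paris", "night", "cafe"],
    ["police", "inspector", "crime", "suspect", "city"],
]

dictionary = corpora.Dictionary(docs)            # word <-> id mapping
corpus = [dictionary.doc2bow(d) for d in docs]   # bag-of-words counts per document

lda = LdaModel(corpus=corpus, id2word=dictionary,
               num_topics=2, passes=50, random_state=1,
               alpha="auto")                      # Dirichlet prior over topics per document

# Words-in-topic distribution: the most probable words of each topic
for topic_id, words in lda.show_topics(num_topics=2, num_words=5, formatted=False):
    print(topic_id, [(w, round(p, 3)) for w, p in words])

# Topics-in-document distribution: topic proportions of the first document
print(lda.get_document_topics(corpus[0]))
```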