
Reducing the Sampling Complexity of Topic Models

Published on Oct 07, 2014 · 3100 views

Inference in topic models typically involves a sampling step to associate latent variables with observations. Unfortunately the generative model loses sparsity as the amount of data increases, requiring O(k) operations per word for k topics.
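The per-word sampling step referred to above is, in its naive form, a linear scan over all k topic weights (standard inverse-CDF categorical sampling). A minimal sketch of that O(k) baseline, for illustration only (the function name and setup are not from the talk):

```python
import random

def sample_topic_naive(weights):
    """Naive per-word topic draw: O(k) to total the (unnormalized)
    topic weights plus an O(k) scan to invert the CDF. This linear
    cost per word is what the talk sets out to reduce."""
    total = sum(weights)              # O(k) normalization pass
    u = random.random() * total
    acc = 0.0
    for t, w in enumerate(weights):   # O(k) inverse-CDF scan
        acc += w
        if acc >= u:
            return t
    return len(weights) - 1           # guard against rounding
```

With millions of words and thousands of topics, this scan dominates inference time, which motivates the O(1)-per-draw machinery covered later in the talk.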

Chapter list

Reducing the Sampling Complexity of Topic Models (00:00)
Outline (00:14)
Topic Models (00:42)
Clustering & Topic Models (00:45)
Topics in text (00:50)
Collapsed Gibbs Sampler - 1 (01:01)
Collapsed Gibbs Sampler - 2 (01:21)
Exploiting Sparsity - 1 (01:49)
Exploiting Sparsity - 2 (02:03)
More Models - 1 (02:21)
More Models - 2 (02:28)
Key Idea of the Paper (02:41)
Metropolis Hastings Sampler (03:23)
Lazy decomposition - 1 (03:32)
Lazy decomposition - 2 (03:50)
Lazy decomposition - 3 (04:01)
Metropolis Hastings with stationary proposal distribution (04:34)
Application to Topic Models (05:10)
In a nutshell (05:27)
Alias Sampling (05:42)
Walker’s Alias Method (05:48)
Probability distribution - 1 (06:18)
Probability distribution - 2 (06:25)
Probability distribution - 3 (06:30)
Probability distribution - 4 (06:45)
Probability distribution - 5 (06:51)
Metropolis-Hastings-Walker (07:19)
Experiments (08:06)
LDA: Varying the number of topics (4k) (08:25)
LDA: Varying data size (09:13)
HDP & PDP (09:25)
Perplexity (09:55)
Summary (10:09)
And now in parallel (10:59)
Saving Nuclear Power Plants - 1 (11:15)
Saving Nuclear Power Plants - 2 (11:30)
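The "Walker’s Alias Method" chapter (05:48) refers to a classic technique for drawing from a fixed discrete distribution in O(1) time after O(k) preprocessing. A minimal sketch of the method, assuming the standard two-list construction (function names are illustrative, not taken from the talk):

```python
import random

def build_alias_table(probs):
    """Walker's alias method: O(k) preprocessing of a discrete
    distribution so that each later draw costs O(1)."""
    k = len(probs)
    scaled = [p * k for p in probs]          # rescale so the mean is 1
    prob = [0.0] * k
    alias = [0] * k
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s] = scaled[s]                  # bucket s keeps its own mass
        alias[s] = l                         # and borrows the rest from l
        scaled[l] -= 1.0 - scaled[s]
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:                  # leftovers are full buckets
        prob[i] = 1.0
    return prob, alias

def alias_draw(prob, alias):
    """O(1) draw: uniform bucket choice plus one biased coin flip."""
    i = random.randrange(len(prob))
    return i if random.random() < prob[i] else alias[i]
```

The table is only valid for a fixed distribution, which is why (per the chapter list) it is paired with a Metropolis-Hastings correction step when the underlying topic distribution changes during sampling.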