BlackOut: Speeding up Recurrent Neural Network Language Models With Very Large Vocabularies
Published on May 27, 2016 · 2115 views
We propose BlackOut, an approximation algorithm to efficiently train massive recurrent neural network language models (RNNLMs) with million-word vocabularies. BlackOut is motivated by using a discriminative loss together with a weighted sampling strategy that significantly reduces computation while improving stability, sample efficiency, and rate of convergence.
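To make the idea concrete, below is a minimal sketch of a BlackOut-style discriminative loss for a single training position, assuming the weighted-softmax form the talk describes: the target word plus K negatives sampled from a proposal distribution q are scored, each score is importance-weighted by 1/q, and the loss combines a positive log-likelihood term with log(1 − p) terms for the negatives. All names here (blackout_loss, hidden, output_emb, q) are illustrative, not the authors' code.

```python
import numpy as np

def blackout_loss(hidden, output_emb, target, q, k=5, rng=None):
    """BlackOut-style loss sketch for one context vector.

    hidden:     (d,) RNN hidden state at the current position
    output_emb: (V, d) output word embeddings
    target:     index of the true next word
    q:          (V,) proposal distribution (e.g. unigram**alpha, normalized)
    k:          number of negative samples
    """
    rng = rng or np.random.default_rng()
    # Sample K negatives from the proposal; drop the target if it was drawn.
    negatives = rng.choice(len(q), size=k + 1, replace=False, p=q)
    negatives = negatives[negatives != target][:k]
    idx = np.concatenate(([target], negatives))

    # Score only the target + sampled words: this is the step that avoids
    # the full V-way softmax.
    logits = output_emb[idx] @ hidden            # (1 + K,)
    weights = 1.0 / q[idx]                       # importance weights
    scores = weights * np.exp(logits - logits.max())
    p = scores / scores.sum()                    # weighted softmax

    # Discriminative objective: reward the target, penalize the negatives.
    eps = 1e-12
    return -(np.log(p[0] + eps) + np.log(1.0 - p[1:] + eps).sum())
```

Under these assumptions, the per-position cost is O(K·d) instead of O(V·d), which is what makes training with million-word vocabularies tractable.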
Chapter list
BlackOut: Speeding up RNNLMs with Very Large Vocabularies (00:00)
Prevalence of Large Softmax Output Layers (00:22)
Case Study: RNNLM (01:35)
System Optimization (04:50)
Strategies to Speed up Softmax (05:55)
BlackOut Training (06:51)
Connection to Importance Sampling (09:28)
Connection to Noise Contrastive Estimation (NCE) (10:45)
Comparison to Dropout (12:04)
Experiments on Small Datasets - 1 (13:34)
Experiments on Small Datasets - 2 (14:34)
Experiments on the 1-Billion-Word Benchmark (14:41)
Comparison to State-of-the-Art (15:15)
Conclusion (16:33)