1 Billion Instances, 1 Thousand Machines and 3.5 Hours

Published on Jan 19, 2010. 4100 views

Training conditional maximum entropy models on massive data sets requires significant computational resources, but by distributing the computation, training time can be significantly reduced. Recent the
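The abstract does not spell out which distribution strategy the talk uses, but a common approach for conditional maximum entropy (logistic regression) models is parameter mixing: each machine trains independently on its own data shard, and the resulting weight vectors are averaged. The sketch below is a minimal single-process simulation of that idea; the function names, shard count, and gradient-descent hyperparameters are illustrative assumptions, not details from the lecture.

```python
import numpy as np

def train_maxent_shard(X, y, epochs=50, lr=0.1):
    """Train a binary conditional maxent (logistic regression) model
    on one data shard with plain batch gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # model probabilities
        w += lr * X.T @ (y - p) / len(y)      # gradient of the log-likelihood
    return w

def distributed_train(X, y, n_machines=4):
    """Simulate parameter mixing: split the data across 'machines',
    train each shard independently, then average the weight vectors."""
    shards = zip(np.array_split(X, n_machines), np.array_split(y, n_machines))
    weights = [train_maxent_shard(Xs, ys) for Xs, ys in shards]
    return np.mean(weights, axis=0)

# Toy data: the label depends only on the sign of the first feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 3))
y = (X[:, 0] > 0).astype(float)

w = distributed_train(X, y)
preds = (X @ w > 0).astype(float)
print("accuracy:", (preds == y).mean())
```

In a real cluster each shard would live on a separate machine and only the final (or periodic) weight vectors would be communicated, which is what makes the wall-clock training time drop roughly with the number of machines.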
