
Unbiased Offline Evaluation of Contextual-bandit-based News Article Recommendation Algorithms

Published on Aug 09, 2011 · 3765 Views

Contextual bandit algorithms have become popular for online recommendation systems such as Digg, Yahoo! Buzz, and news recommendation in general. Offline evaluation of the effectiveness of new algorithms …
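The talk is about evaluating a contextual-bandit policy offline, from logged interaction data, rather than by running it on live traffic. A minimal sketch of the replay-style evaluation idea is below, assuming the logged events were collected while arms (articles) were displayed uniformly at random: the candidate policy is run over the log, events where its choice disagrees with the logged arm are skipped, and the average reward over matching events estimates the policy's online per-trial reward. All names here (`replay_evaluate`, `EpsilonGreedy`, `select_arm`, `update`) are illustrative, not taken from the paper.

```python
import random


def replay_evaluate(policy, logged_events):
    """Replay-style offline evaluator for a contextual bandit policy.

    `logged_events` is a list of (context, displayed_arm, reward) tuples,
    assumed to have been logged while arms were chosen uniformly at random.
    Events whose displayed arm differs from the policy's choice are skipped;
    the retained events estimate the policy's per-trial reward.
    """
    total_reward, matched = 0.0, 0
    for context, displayed_arm, reward in logged_events:
        arm = policy.select_arm(context)
        if arm == displayed_arm:
            total_reward += reward
            matched += 1
            policy.update(context, arm, reward)  # policy learns as it would online
        # non-matching events are discarded entirely
    return total_reward / matched if matched else 0.0


class EpsilonGreedy:
    """Toy context-free epsilon-greedy policy, used only to exercise the evaluator."""

    def __init__(self, num_arms, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = [0] * num_arms
        self.values = [0.0] * num_arms

    def select_arm(self, context):
        if random.random() < self.epsilon:
            return random.randrange(len(self.counts))
        return max(range(len(self.counts)), key=lambda a: self.values[a])

    def update(self, context, arm, reward):
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]


if __name__ == "__main__":
    random.seed(0)
    num_arms = 5
    # Simulated uniformly random log: arm 2 yields a click 30% of the time, others 10%.
    log = []
    for _ in range(100_000):
        arm = random.randrange(num_arms)
        reward = 1 if random.random() < (0.3 if arm == 2 else 0.1) else 0
        log.append((None, arm, reward))  # context unused by this toy policy
    print(f"estimated per-trial reward: {replay_evaluate(EpsilonGreedy(num_arms), log):.3f}")
```

In this toy run the estimate should land well above the 0.1 baseline, since the epsilon-greedy policy concentrates on the better arm; the key point is that the estimate comes entirely from logged data, with no live experiment.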
