Unbiased Offline Evaluation of Contextual-bandit-based News Article Recommendation Algorithms
Published on Aug 09, 2011
Contextual bandit algorithms have become popular for online recommendation systems such as Digg, Yahoo! Buzz, and news recommendation in general. Offline evaluation of the effectiveness of new algorithms …
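The offline evaluation the abstract refers to is commonly done with a replay-style estimator: logged interaction events are scanned in order, and an event counts toward the evaluated policy's reward only when the policy would have chosen the same arm the logger did. The sketch below is a minimal, hedged illustration of that idea, assuming the log was collected by a uniformly random logging policy (the names `replay_evaluate`, `choose`, and `update` are illustrative, not the paper's API):

```python
def replay_evaluate(policy, logged_events):
    """Replay-style offline evaluation of a contextual bandit policy.

    logged_events: iterable of (context, arm, reward) triples collected
    by a uniformly random logging policy. Events where the evaluated
    policy disagrees with the logged arm are discarded; the retained
    events estimate the policy's average per-step reward.
    """
    total_reward = 0.0
    matched = 0
    for context, logged_arm, reward in logged_events:
        chosen = policy.choose(context)
        if chosen == logged_arm:  # keep only matching events
            total_reward += reward
            matched += 1
            # the policy only learns from events it would have generated
            policy.update(context, logged_arm, reward)
    return total_reward / matched if matched else 0.0


class FixedArmPolicy:
    """Toy policy for demonstration: always picks the same arm."""

    def __init__(self, arm):
        self.arm = arm

    def choose(self, context):
        return self.arm

    def update(self, context, arm, reward):
        pass  # a real policy would update its model here
```

Because the logging policy is uniform over arms, each retained event is an unbiased sample of what the evaluated policy would have observed online; the fraction of retained events shrinks with the number of arms, which is the method's main cost.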