The Dynamics of AdaBoost
published: Feb. 25, 2007, recorded: May 2005, views: 24684
One of the most successful and popular learning algorithms is AdaBoost, a classification algorithm designed to construct a "strong" classifier from a "weak" learning algorithm. Shortly after AdaBoost was introduced nine years ago, researchers derived margin-based generalization bounds to explain its unexpectedly good performance. These bounds predict that AdaBoost yields the best possible performance if it always achieves a "maximum margin" solution. But does AdaBoost actually achieve a maximum margin solution? Empirical and theoretical studies from this period conjectured the answer to be "yes". To answer this question, we look to AdaBoost's dynamics: we simplify AdaBoost to reveal a nonlinear iterated map, and we analyze its convergence in cases where cyclic behavior is found; this cyclic behavior provides the key to determining whether AdaBoost always maximizes the margin. The answer turns out to be the opposite of what was thought to be true! In this talk, I will introduce AdaBoost, describe our analysis of AdaBoost viewed as a dynamical system, briefly mention a new boosting algorithm that always maximizes the margin with a fast convergence rate, and, if time permits, reveal a surprising new result about AdaBoost and the problem of bipartite ranking.
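For readers unfamiliar with the algorithm under discussion, here is a minimal sketch of the standard AdaBoost loop: maintain a weight distribution over training points, fit a weak classifier to the weighted data each round, and reweight so that misclassified points gain influence. The `weak_learn` interface and the small epsilon-clipping guard are assumptions for illustration, not part of the lecture itself.

```python
import numpy as np

def adaboost(X, y, weak_learn, T):
    """Standard AdaBoost.

    y has labels in {-1, +1}; weak_learn(X, y, w) is any routine that
    returns a classifier h with h(X) in {-1, +1}, trained to respect
    the weight distribution w. (This interface is illustrative.)
    """
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform weights
    hs, alphas = [], []
    for _ in range(T):
        h = weak_learn(X, y, w)
        pred = h(X)
        eps = w[pred != y].sum()         # weighted training error
        if eps >= 0.5:                   # weak-learning assumption violated
            break
        eps = np.clip(eps, 1e-10, 1 - 1e-10)  # guard against log(0)
        alpha = 0.5 * np.log((1 - eps) / eps)
        # Misclassified points (y * pred == -1) gain weight; renormalize.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        hs.append(h)
        alphas.append(alpha)
    # The "strong" classifier is the sign of the weighted vote.
    return lambda Xq: np.sign(sum(a * h(Xq) for a, h in zip(alphas, hs)))
```

The margin of a training point under this combined classifier is its (normalized) weighted vote, `y * F(x)`; the question raised in the abstract is whether this loop drives the minimum margin over the training set to its maximum possible value.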