
The Dynamics of AdaBoost

Published on Feb 25, 2007 · 24,719 views

One of the most successful and popular learning algorithms is AdaBoost, which is a classification algorithm designed to construct a "strong" classifier from a "weak" learning algorithm. Just after the …
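The abstract above describes AdaBoost only at a high level. For readers who want the mechanics, here is a minimal sketch of the standard AdaBoost loop with decision stumps as the weak learner. It is illustrative only: the stump search, the toy data, and the names (train_stump, adaboost) are assumptions made for this sketch, not code from the talk.

```python
# Minimal AdaBoost sketch with decision stumps as the "weak" learner.
# Illustrative only: the stump learner and toy data are stand-ins.
import numpy as np

def train_stump(X, y, d):
    """Pick the threshold stump with the smallest weighted error under weights d."""
    best = None
    for j in range(X.shape[1]):
        for thresh in np.unique(X[:, j]):
            for sign in (+1, -1):
                pred = np.where(X[:, j] <= thresh, sign, -sign)
                err = d @ (pred != y)
                if best is None or err < best[0]:
                    best = (err, j, thresh, sign)
    return best  # (weighted error, feature index, threshold, sign)

def stump_predict(stump, X):
    _, j, thresh, sign = stump
    return np.where(X[:, j] <= thresh, sign, -sign)

def adaboost(X, y, T=20):
    m = len(y)
    d = np.full(m, 1.0 / m)           # example weights, kept on the simplex
    ensemble = []                     # list of (alpha_t, stump_t)
    for _ in range(T):
        stump = train_stump(X, y, d)
        err = max(stump[0], 1e-12)    # weighted error of the chosen weak classifier
        if err >= 0.5:
            break                     # no edge over random guessing; stop
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(stump, X)
        d = d * np.exp(-alpha * y * pred)
        d /= d.sum()                  # renormalize so d stays a distribution
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    score = sum(a * stump_predict(s, X) for a, s in ensemble)
    return np.sign(score)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)   # toy labels
    ens = adaboost(X, y, T=10)
    print("training accuracy:", (predict(ens, X) == y).mean())
```

Each round the update multiplies the weight of correctly classified examples by exp(-alpha_t) and of misclassified ones by exp(+alpha_t); this reweighting is exactly the dynamical system the talk analyzes.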

Chapter list

Dynamics of AdaBoost (00:02)
A Story about AdaBoost (00:12)
The question remained (until recently): Does AdaBoost maximize the margin? (02:08)
The question remained (until recently): Does AdaBoost maximize the margin? (02:34)
The question remained (until recently): Does AdaBoost maximize the margin? (03:00)
The question remained (until recently): Does AdaBoost maximize the margin? (03:29)
The question remained (until recently): Does AdaBoost maximize the margin? (04:17)
The question remained (until recently): Does AdaBoost maximize the margin? (04:42)
The question remained (until recently): Does AdaBoost maximize the margin? (05:19)
Overview of Talk (06:39)
A Sample Problem (07:32)
Say you have a database of news articles… (07:34)
Examples of Classification Tasks (08:30)
Examples of classification algorithms (08:55)
Training Data: {(x_i, y_i)}_{i=1..m}, where (x_i, y_i) is chosen i.i.d. from an unknown probability distribution on X × {-1, 1} (09:15)
How do we construct a classifier? (09:31)
Say we have a “weak” learning algorithm (09:43)
Boosting algorithms combine weak classifiers in a meaningful way (Schapire ’89) (10:03)
AdaBoost (Freund and Schapire ’96) (11:14)
AdaBoost (12:57)
AdaBoost (14:45)
AdaBoost (15:06)
AdaBoost (15:33)
AdaBoost (17:33)
Does AdaBoost choose λ_final so that the margin µ(f) is maximized? That is, does AdaBoost maximize the margin? No! (18:17)
The question remained (until recently): Does AdaBoost maximize the margin? (19:08)
About the proof… (20:11)
Analyzing AdaBoost using Dynamical Systems (20:28)
Smallest Non-Trivial Case (21:13)
TITLE (22:24)
Smallest Non-Trivial Case (23:44)
Smallest Non-Trivial Case (24:05)
Smallest Non-Trivial Case (24:09)
Smallest Non-Trivial Case (24:11)
Smallest Non-Trivial Case (24:14)
Smallest Non-Trivial Case (24:24)
Smallest Non-Trivial Case (24:44)
Two possible stable cycles! (24:57)
Generalization of smallest non-trivial case (27:11)
Generalization of smallest non-trivial case (28:00)
Empirically Observed Cycles (29:08)
Empirically Observed Cycles (29:10)
Empirically Observed Cycles (31:12)
Empirically Observed Cycles (31:32)
Empirically Observed Cycles (31:36)
Empirically Observed Cycles (31:39)
Empirically Observed Cycles (31:41)
If AdaBoost cycles, we can calculate the margin it will asymptotically converge to in terms of the edge values (31:43)
The question remained (until recently): Does AdaBoost maximize the margin? (32:16)
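The "Smallest Non-Trivial Case" and "Empirically Observed Cycles" chapters above study AdaBoost's example-weight dynamics on tiny problems. The sketch below is an assumed reconstruction of the simplest such setup (three training examples and three weak classifiers, each misclassifying exactly one example); it is not the talk's own code, but running it and watching the printed weight vector shows the weights settling into a cycle rather than converging to a fixed point.

```python
# Simulation sketch of AdaBoost's example-weight dynamics on a tiny problem.
# Assumed setup: 3 examples, 3 weak classifiers; classifier j misclassifies
# only example j. M[i, j] = +1 if classifier j is correct on example i, else -1.
import numpy as np

M = np.array([[-1,  1,  1],
              [ 1, -1,  1],
              [ 1,  1, -1]], dtype=float)

d = np.array([0.2, 0.3, 0.5])          # slightly asymmetric starting weights
for t in range(12):
    edges = M.T @ d                    # edge r_j = sum_i d_i * M[i, j]
    j = int(np.argmax(edges))          # pick the weak classifier with the largest edge
    r = edges[j]
    alpha = 0.5 * np.log((1 + r) / (1 - r))
    d = d * np.exp(-alpha * M[:, j])   # upweight the misclassified example
    d /= d.sum()                       # renormalize to a distribution
    print(f"t={t:2d}  chose h_{j}  weights={np.round(d, 4)}")
```

The argmax step is the "optimal" choice of weak classifier (largest edge), and the asymmetric starting weights avoid the symmetric tie at (1/3, 1/3, 1/3); after each update the just-misclassified example carries weight 1/2, and the printed weight vectors repeat with period 3.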