The Rate of Convergence of AdaBoost
Published on Aug 02, 2011 · 4676 Views
The AdaBoost algorithm was designed to combine many "weak" hypotheses that perform slightly better than random guessing into a "strong" hypothesis that has very low error. We study the rate at which AdaBoost iteratively converges to the minimum of the "exponential loss."
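To make the setting concrete, here is a minimal sketch of the boosting loop the talk analyzes, with decision stumps as the weak hypotheses. This is illustrative background only, not code from the talk; the stump learner and the toy dataset are assumptions made for the example.

```python
# A minimal AdaBoost sketch with decision stumps as weak learners.
# Illustrative only; the stump learner and toy data are assumptions.
import numpy as np

def train_stump(X, y, w):
    """Find the threshold stump minimizing weighted 0-1 error.
    Returns (feature, threshold, polarity, weighted_error)."""
    best = (0, 0.0, 1, np.inf)
    for j in range(X.shape[1]):
        for thresh in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = polarity * np.where(X[:, j] <= thresh, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, thresh, polarity, err)
    return best

def adaboost(X, y, rounds=20):
    """Run AdaBoost (Freund and Schapire 97) for a fixed number of rounds."""
    m = len(y)
    w = np.full(m, 1.0 / m)      # start with a uniform distribution
    ensemble = []                # list of (alpha, stump) pairs
    for _ in range(rounds):
        j, thresh, pol, err = train_stump(X, y, w)
        err = max(err, 1e-12)    # guard against a perfect stump
        alpha = 0.5 * np.log((1.0 - err) / err)   # weak-hypothesis weight
        pred = pol * np.where(X[:, j] <= thresh, 1, -1)
        # Reweight: misclassified examples gain weight, correct ones lose it.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, (j, thresh, pol)))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote of all weak hypotheses."""
    score = np.zeros(len(X))
    for alpha, (j, thresh, pol) in ensemble:
        score += alpha * pol * np.where(X[:, j] <= thresh, 1, -1)
    return np.sign(score)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)   # toy labeling rule
    model = adaboost(X, y, rounds=20)
    print("training error:", np.mean(predict(model, X) != y))
```

Each round fits a weak hypothesis against the current example weights, then exponentially reweights the data so that misclassified examples count more in the next round. A short background note on the exponential loss that this procedure greedily minimizes, and that the talk's theorems concern, follows the chapter list below.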
Chapter list
00:00  The Rate of Convergence of AdaBoost
00:16  AdaBoost (Freund and Schapire 97) - 1
00:31  AdaBoost (Freund and Schapire 97) - 2
01:00  Basic properties of AdaBoost’s convergence are still not fully understood
01:47  AdaBoost is known for its ability to combine “weak classifiers” into a “strong” classifier - 1
02:46  AdaBoost is known for its ability to combine “weak classifiers” into a “strong” classifier - 2
03:03  AdaBoost is known for its ability to combine “weak classifiers” into a “strong” classifier - 3
03:38  AdaBoost is known for its ability to combine “weak classifiers” into a “strong” classifier - 4
04:04  Known:
05:43  Outline - 1
06:35  Main Messages
07:33  Convergence Rate 1
07:43  At iteration t ... - 1
08:21  At iteration t ... - 2
08:33  At iteration t ... - 3
08:47  At iteration t ... - 4
09:00  At iteration t ... - 5
09:11  At iteration t ... - 6
09:18  At iteration t ... - 7
09:25  Theorem 1 - 1
10:15  Theorem 1 - 2
10:53  Theorem 1 - 3
11:18  Theorem 1 - 4
11:34  Theorem 1 - 5
12:01  Rate on a Simple Dataset (Log scale)
12:31  Outline - 2
12:36  Theorem 2 - 1
13:17  Theorem 2 - 2
13:37  Theorem 2 - 3
13:40  Theorem 2 - 4
13:56  Theorem 2 - 5
14:36  Decomposition Lemma - 1
15:31  Decomposition Lemma - 2
15:50  Decomposition Lemma - 3
16:09  Decomposition Lemma - 4
16:15  Decomposition Lemma - 5
16:20  Decomposition Lemma - 6
16:24  Decomposition Lemma - 7
16:33  Lemma
17:29  To summarize
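As promised above, a brief background note: AdaBoost's reweighting amounts to greedy coordinate descent on the exponential loss, and the classical Freund and Schapire (1997) analysis, referenced in the chapters above, bounds the training error in terms of the weak hypotheses' edges. The statements below are only these standard background facts, not the talk's new rate results (Theorems 1 and 2).

```latex
% Standard AdaBoost background (Freund and Schapire 97).
% Combined hypothesis after T rounds: F_T(x) = \sum_{t=1}^T \alpha_t h_t(x).
\[
  L(F) \;=\; \frac{1}{m}\sum_{i=1}^{m} e^{-y_i F(x_i)}
  \qquad \text{(the exponential loss AdaBoost minimizes)}
\]
\[
  \frac{1}{m}\sum_{i=1}^{m} \mathbf{1}\bigl[\operatorname{sign}(F_T(x_i)) \ne y_i\bigr]
  \;\le\; \prod_{t=1}^{T} 2\sqrt{\epsilon_t(1-\epsilon_t)}
  \;=\; \prod_{t=1}^{T} \sqrt{1-4\gamma_t^2}
  \;\le\; e^{-2\sum_{t=1}^{T}\gamma_t^2},
\]
where $\epsilon_t = \tfrac{1}{2} - \gamma_t$ is the weighted error of the weak
hypothesis $h_t$ chosen at round $t$, and $\gamma_t$ is its edge over random guessing.
```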