The First-Order View of Boosting Methods: Computational Complexity and Connections to Regularization

Published on Aug 26, 2013 · 3244 Views

Incremental Forward Stagewise Regression (FSε) is a statistical algorithm that produces sparse coefficient profiles for linear regression. Using the tools of first-order methods in convex optimization…
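
As a concrete illustration of the algorithm named in the abstract, here is a minimal Python sketch of FSε. The step size eps, the iteration count, and the assumption of standardized predictor columns are illustrative choices, not details taken from the talk.

```python
import numpy as np

def forward_stagewise(X, y, eps=0.01, n_steps=2000):
    """Minimal sketch of incremental forward stagewise regression (FS-epsilon).

    Repeatedly find the predictor most correlated with the current residual
    and nudge its coefficient by a fixed small amount eps in the direction
    of that correlation. With small eps, coefficients enter one at a time,
    producing the sparse coefficient profiles the abstract describes.
    Assumes the columns of X are centered/standardized.
    """
    n_samples, n_features = X.shape
    beta = np.zeros(n_features)
    residual = y.astype(float).copy()
    path = []                             # coefficient profile across iterations
    for _ in range(n_steps):
        corr = X.T @ residual             # inner product of each predictor with residual
        j = int(np.argmax(np.abs(corr)))  # most correlated predictor
        delta = eps * np.sign(corr[j])    # small fixed step toward reducing the residual
        beta[j] += delta
        residual -= delta * X[:, j]
        path.append(beta.copy())
    return beta, np.array(path)
```

Plotting the rows of path against the iteration count traces out the coefficient profiles; the chapter list below suggests the talk analyzes these through Frank-Wolfe applied to the LASSO problem.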

Chapter list

00:00  The First-Order View of Boosting Methods
00:37  Motivation (1)
01:27  Motivation (2)
02:04  Overview
03:03  Mirror Descent for Minmax Optimization (1)
03:44  Mirror Descent for Minmax Optimization (2)
04:09  Mirror Descent for Minmax Optimization (3)
04:46  Mirror Descent for Minmax Optimization (4)
05:39  Mirror Descent for Minmax Optimization (5)
06:30  Mirror Descent for Minmax Optimization (6)
07:21  Boosting and AdaBoost (1)
08:03  Boosting and AdaBoost (2)
08:23  Some Loss Functions for Boosting
09:21  The Margin Maximization Problem
10:04  AdaBoost Algorithm Description
10:59  Optimization Perspectives on AdaBoost
11:38  AdaBoost is Mirror Descent
12:09  Interpretations
13:01  Complexity - general case
13:42  Complexity - separable case
14:23  Complexity - non-separable case
15:19  Conditional Gradient
16:02  Structure (1)
16:40  Structure (2)
16:58  Complete
17:37  Complexity of CG-Boost
18:19  Incremental Forward Stagewise Regression (1)
19:22  Incremental Forward Stagewise Regression (2)
20:05  FSε (1)
20:23  FSε (2)
20:38  FSε (3)
21:10  FSε (4)
21:26  FSε (5)
21:44  Frank-Wolfe
22:02  Subproblem solving
22:20  Complete Frank-Wolfe
22:47  Properties - FW-LASSO
23:02  Complexity - FW-LASSO
23:43  Conclusions
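
To make the "AdaBoost is Mirror Descent" chapter concrete, here is a minimal AdaBoost sketch over a fixed pool of weak learners. The precomputed prediction matrix h_values is an assumption made to keep the sketch self-contained; the exponential-reweighting-and-renormalize step on the example weights is the update that the mirror descent interpretation in the chapter list points to.

```python
import numpy as np

def adaboost(h_values, y, T=100):
    """Minimal AdaBoost sketch over a fixed pool of weak learners.

    h_values: (n_samples, n_learners) matrix of +/-1 predictions; a practical
              implementation would fit weak learners inside the loop instead.
    y:        (n_samples,) array of +/-1 labels.
    Returns lam, the weights of the combined classifier sign(h_values @ lam).
    """
    n, m = h_values.shape
    margins = y[:, None] * h_values          # +1 where learner j is correct on example i
    w = np.full(n, 1.0 / n)                  # distribution over training examples
    lam = np.zeros(m)
    for _ in range(T):
        edge = margins.T @ w                 # weighted edge of every weak learner
        j = int(np.argmax(edge))             # pick the learner with the largest edge
        r = np.clip(edge[j], -0.999999, 0.999999)
        alpha = 0.5 * np.log((1 + r) / (1 - r))   # standard AdaBoost step size
        lam[j] += alpha
        w *= np.exp(-alpha * margins[:, j])  # exponential reweighting of examples
        w /= w.sum()                         # renormalize: the entropy-prox update
    return lam
```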