Frequency-aware Truncated methods for Sparse Online Learning

Published on Nov 29, 2011 · 2806 Views

Online supervised learning with L1-regularization has gained attention recently because it generally requires less computational time and lower space complexity than batch-type learning methods.
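The setting the abstract refers to is commonly built on FOBOS-style updates: each round takes an unregularized gradient step on the incoming example and then applies a closed-form soft-thresholding step for the L1 term, which truncates small weights to exactly zero. As a rough illustration of that generic setting only, not of the FT-FOBOS method presented in this talk, the Python sketch below assumes squared loss and a synthetic sparse regression stream; all names and parameter values are illustrative.

import numpy as np

def fobos_l1_step(w, x, y, eta, lam):
    """One online round of FOBOS with L1 regularization.

    Illustrative sketch of the generic setting, not FT-FOBOS.
    w    : current weight vector
    x, y : feature vector and target of the incoming example
    eta  : learning rate for this round
    lam  : L1 regularization strength
    """
    # Gradient step on the instantaneous squared loss
    grad = (w @ x - y) * x
    w_half = w - eta * grad
    # Soft-thresholding (proximal) step for the L1 term;
    # weights with magnitude below eta * lam become exactly zero
    return np.sign(w_half) * np.maximum(np.abs(w_half) - eta * lam, 0.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 20
    w_true = np.zeros(d)
    w_true[:3] = [2.0, -1.5, 1.0]          # sparse ground truth
    w = np.zeros(d)
    for t in range(1, 2001):
        x = rng.normal(size=d)
        y = w_true @ x + 0.01 * rng.normal()
        w = fobos_l1_step(w, x, y, eta=0.1 / np.sqrt(t), lam=0.1)
    print("non-zero weights:", np.count_nonzero(w), "of", d)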

Chapter list

Frequency-aware Truncated methods for Sparse Online Learning 00:00
Problem Setting 00:26
Notation 01:05
Optimization Problem 02:04
Online Learning 02:53
Loss function 03:59
Regularized term 04:52
Additional Property of Lasso 05:41
Previous Work 06:39
Disadvantage of Previous Work 07:29
Proposed method: Intention 08:30
Proposed method (FT-FOBOS) 09:18
Proposed method - 1 10:09
Proposed method - 2 10:36
Proposed method - 3 11:02
Algorithm of FT-FOBOS - 1 11:37
Algorithm of FT-FOBOS - 2 12:37
Algorithm of FT-FOBOS - 3 13:34
Theoretical Evaluation 14:30
FT-FOBOS's Regret 15:42
Experimental Evaluations 16:12
Experimental Results among FT-FOBOS 16:35
Disparity when change 16:49
Experimental Results among FT-FOBOS 17:07
Experimental Results of all algorithms - 1 17:19
Experimental Results of all algorithms - 2 18:05
Summary 18:42