Stochastic optimization with non-i.i.d. noise

Published on Jan 25, 2012 · 3977 Views

We study the convergence of a class of stable online algorithms for stochastic convex optimization in settings where we do not receive independent samples from the distribution over which we optimize.
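
To make the setting concrete, here is a minimal sketch of stochastic gradient descent in which successive samples come from a slowly mixing two-state Markov chain instead of i.i.d. draws. The chain, the quadratic per-sample loss, and the 1/sqrt(t) step size are all illustrative assumptions; this is not the algorithm analyzed in the talk.

```python
import numpy as np

# Illustrative sketch of the non-i.i.d. setting: SGD driven by a Markov chain.
# The chain, loss, and step size are assumptions, not taken from the talk.

rng = np.random.default_rng(0)

# Two data points visited by a slowly mixing two-state Markov chain.
# The stationary distribution is uniform, but consecutive samples are correlated.
samples = np.array([[1.0, 0.0], [0.0, 1.0]])
transition = np.array([[0.9, 0.1],
                       [0.1, 0.9]])

def grad(x, s):
    # Gradient of the per-sample loss f(x; s) = 0.5 * ||x - s||^2.
    return x - s

x = np.zeros(2)          # iterate
avg = np.zeros(2)        # running average of the iterates
state = 0
T = 10_000
for t in range(1, T + 1):
    x -= grad(x, samples[state]) / np.sqrt(t)   # step size ~ 1/sqrt(t)
    avg += (x - avg) / t
    state = rng.choice(2, p=transition[state])  # next (dependent) sample

# Despite the correlated noise, the averaged iterate approaches the minimizer
# of E[f(x; s)] under the stationary distribution, i.e. roughly (0.5, 0.5).
print(avg)
```

The only point of the sketch is the sampling model: consecutive gradients are correlated, yet the chain mixes geometrically, which is the regime addressed by the chapters on stable online algorithms and specialization to geometric mixing below.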

Chapter list

Stochastic optimization with non-i.i.d. noise (00:00)
Basic setup - 01 (00:12)
Basic setup - 02 (01:12)
Basic setup - 03 (01:24)
Basic setup - 04 (01:56)
Online Convex Optimization - 01 (02:28)
Online Convex Optimization - 02 (03:40)
I.I.D. sampling not always possible (04:14)
Formal setup - 01 (05:04)
Formal setup - 02 (06:12)
Example 1 - 01 (07:27)
Example 1 - 02 (08:09)
Example 1 - 03 (08:47)
Example 2 (09:19)
Stable online algorithms (10:26)
Convergence rate for convex losses - 01 (12:09)
Convergence rate for convex losses - 02 (13:28)
Specialization to particular algorithms (14:02)
Specialization to geometric mixing (15:12)
Simulation (15:59)
Example - 01 (16:30)
Example - 02 (16:34)
Example - 03 (16:36)
Example - 04 (16:37)
Convergence rate of MIGD (16:44)
Convergence plot for MIGD (17:29)
Strongly convex loss functions (17:51)
Better guarantees for strongly convex losses - 01 (18:20)
Better guarantees for strongly convex losses - 02 (18:53)
Fast rates for linear prediction - 01 (19:18)
Fast rates for linear prediction - 02 (19:35)
Conclusions (20:19)
Extensions (20:57)
References (21:19)