Online gradient descent for LS regression: Non-asymptotic bounds and application to bandits
Published on Nov 07, 2013 · 2424 Views
We propose a stochastic gradient descent-based method with randomization of samples for solving least squares regression. We consider a "big data" regime where both the dimension, d, of the data and …
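The slides spell out the exact step sizes, iterate averaging, and regularization used in the analysis; as a rough illustration of the idea stated in the abstract (stochastic gradient descent for least squares with uniformly randomized samples), here is a minimal Python sketch. The function name, the 1/sqrt(t) step-size schedule, and the running average are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def online_gd_least_squares(X, y, step_size=0.1, n_iters=None, seed=0):
    """Minimal sketch: SGD for least squares with uniform random sampling.

    At each iteration one sample (x_i, y_i) is drawn uniformly at random and
    the iterate moves along the negative gradient of the squared error on
    that single sample. Returns a running average of the iterates.
    """
    rng = np.random.default_rng(seed)
    T, d = X.shape
    n_iters = n_iters or T
    theta = np.zeros(d)
    theta_avg = np.zeros(d)
    for t in range(1, n_iters + 1):
        i = rng.integers(T)                       # randomization of samples
        grad = (X[i] @ theta - y[i]) * X[i]       # gradient of 0.5*(x_i^T theta - y_i)^2
        theta -= (step_size / np.sqrt(t)) * grad  # decaying step size (illustrative choice)
        theta_avg += (theta - theta_avg) / t      # running iterate average
    return theta_avg

# Toy usage on synthetic data
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(1000, 5))
    theta_true = rng.normal(size=5)
    y = X @ theta_true + 0.1 * rng.normal(size=1000)
    print(online_gd_least_squares(X, y, n_iters=5000))
```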
Chapter list
Online gradient descent for least squares regression: Non-asymptotic bounds and application to bandits (00:00)
Motivation [1] (00:13)
Motivation [2] (01:11)
Outline (03:03)
Random online algorithm (03:52)
Error bound - 1 (06:17)
Application to bandits (07:48)
PEGE algorithm with online GD (09:29)
Regret bound - 1 (11:12)
Adaptive regularization (12:52)
Error bound - 2 (16:03)
Confidence ball with online GD (16:25)
Regret bound - 2 (17:45)
Conclusions (18:59)