Convergence rates of nested accelerated inexact proximal methods
Published on Jan 16, 2013 · 2975 views
Proximal gradient methods are popular first-order algorithms currently used to solve several machine learning and inverse problems. We consider the case where the proximity operator is not available in closed form and can only be computed up to a certain precision.
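The abstract refers to accelerated forward-backward (proximal gradient) iterations in which the proximity operator is replaced by the output of an inner iterative solver. The sketch below illustrates that nested structure only, assuming a least-squares smooth term and a generic convex g whose prox is approximated by a fixed number of inner subgradient steps; the names (accelerated_inexact_fb, inexact_prox) and the inner solver are illustrative and are not the speakers' algorithm or error criterion.

```python
import numpy as np

# Minimal sketch of a nested accelerated inexact forward-backward scheme:
# minimize f(x) + g(x), with f(x) = 0.5*||A x - b||^2 (smooth) and a generic
# convex g whose proximity operator is only computed approximately.

def inexact_prox(g_subgrad, z, step, inner_iters=20):
    """Approximate prox_{step*g}(z) = argmin_y g(y) + ||y - z||^2 / (2*step)
    by a fixed number of subgradient steps on the prox subproblem.
    More inner iterations give a smaller inner error."""
    y = z.copy()
    for k in range(1, inner_iters + 1):
        grad = g_subgrad(y) + (y - z) / step
        y -= (step / k) * grad          # diminishing inner step size
    return y

def accelerated_inexact_fb(A, b, g_subgrad, n_iters=100, inner_iters=20):
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of grad f
    step = 1.0 / L
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iters):
        grad_f = A.T @ (A @ y - b)                        # forward (gradient) step
        x_new = inexact_prox(g_subgrad, y - step * grad_f, step, inner_iters)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))  # Nesterov momentum
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)     # extrapolation
        x, t = x_new, t_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((40, 60)), rng.standard_normal(40)
    lam = 0.1
    l1_subgrad = lambda y: lam * np.sign(y)               # subgradient of lam*||.||_1
    x_hat = accelerated_inexact_fb(A, b, l1_subgrad, n_iters=200)
    print("objective:", 0.5 * np.linalg.norm(A @ x_hat - b) ** 2 + lam * np.abs(x_hat).sum())
```

The point of the sketch is the two-level loop: the outer accelerated iterations inherit an error at every step from the inner solver, and the talk studies how fast that inner error must vanish for the accelerated rate to survive.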
Chapter list
Convergence rates of nested accelerated inexact proximal methods - 00:00
Problem setting (1) - 00:09
Problem setting (2) - 00:24
Problem setting (3) - 00:27
Accelerated forward-backward splitting algorithm (1) - 00:39
Accelerated forward-backward splitting algorithm (2) - 01:25
Proximity operator of g (1) - 01:47
Proximity operator of g (2) - 02:03
Proximity operator of g (3) - 02:15
Main Results (1) - 02:27
Main Results (2) - 03:22
Convergence rate for accelerated inexact FB - 03:23
Main Results (3) - 04:46
Inexact computation of the proximity operator (1) - 04:54
Inexact computation of the proximity operator (2) - 05:15
Admissible approximations of g = ... - 05:34
Inexact proximal points of g ... (1) - 06:07
Inexact proximal points of g ... (2) - 06:36
Algorithms for computing inexact proximal points - 07:23
Main Results (4) - 08:21
Global convergence rate (1) - 08:38
Global convergence rate (2) - 08:59
Global convergence rate (3) - 09:27
Global convergence rate (4) - 09:45
Impact of the errors on the global iteration complexity (1) - 10:02
Impact of the errors on the global iteration complexity (2) - 11:47
Contributions - 12:29
References - 13:32
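Several chapters above (Proximity operator of g, Inexact computation of the proximity operator, Inexact proximal points of g) revolve around the proximity operator and its approximate evaluation. For reference, the standard textbook definitions are written out below; the talk's precise admissibility criterion for the errors may differ.

```latex
% Proximity operator of a convex function g with parameter \lambda > 0:
\[
  \operatorname{prox}_{\lambda g}(x)
  \;=\; \operatorname*{arg\,min}_{y} \; g(y) + \tfrac{1}{2\lambda}\,\lVert y - x \rVert^{2}.
\]
% A point z is an \varepsilon-approximate proximal point of g at x when it
% solves the same subproblem up to precision \varepsilon:
\[
  g(z) + \tfrac{1}{2\lambda}\,\lVert z - x \rVert^{2}
  \;\le\; \varepsilon + \min_{y} \Bigl( g(y) + \tfrac{1}{2\lambda}\,\lVert y - x \rVert^{2} \Bigr).
\]
```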