Optimal Computational Trade-Off of Inexact Proximal Methods
Published on Jan 16, 2013 · 2686 Views
In this paper, we investigate the trade-off between convergence rate and computational cost when minimizing a composite functional with proximal-gradient methods, which are popular optimisation tools.
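To make the setting concrete, the sketch below shows a proximal-gradient (ISTA-style) loop in which the proximal step is computed only up to a per-iteration precision; this is the kind of inexactness whose computational cost is traded off against convergence rate in the talk. The least-squares data term, the l1 regularizer with its soft-thresholding prox, and the 1/k^2 precision schedule are illustrative assumptions, not the algorithm or schedule analysed by the authors.

# Minimal sketch (illustrative, not the authors' algorithm): proximal-gradient
# iterations for min_x f(x) + g(x), with f smooth and g non-smooth, where the
# proximal step is only computed approximately, to a per-iteration tolerance eps_k.
import numpy as np

def grad_f(A, b, x):
    # Gradient of the smooth part f(x) = 0.5 * ||A x - b||^2.
    return A.T @ (A @ x - b)

def inexact_prox_l1(v, lam, eps):
    # The exact prox of lam*||.||_1 is soft-thresholding; we perturb it by a
    # vector of norm at most eps to mimic an inexact proximal computation.
    exact = np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)
    noise = np.random.uniform(-1.0, 1.0, size=v.shape)
    return exact + eps * noise / max(np.linalg.norm(noise), 1e-12)

def inexact_ista(A, b, lam, n_iter=100):
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of grad_f
    x = np.zeros(A.shape[1])
    for k in range(1, n_iter + 1):
        eps_k = 1.0 / k ** 2                   # assumed decreasing precision schedule
        v = x - grad_f(A, b, x) / L            # forward (gradient) step
        x = inexact_prox_l1(v, lam / L, eps_k) # inexact backward (proximal) step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    b = rng.standard_normal(50)
    print(inexact_ista(A, b, lam=0.1)[:5])

Tightening the schedule eps_k speeds up convergence per iteration but makes each proximal step more expensive, which is precisely the trade-off the talk seeks to optimise.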
Chapter list
00:00  Optimal Computational Trade-Off of Inexact Proximal Methods
00:24  Outline of the talk
00:32  The Trade-Offs of Learning
00:34  Minimizing the Risk (1)
00:53  Minimizing the Risk (2)
01:04  Minimizing the Risk (3)
01:16  Minimizing the Risk (4)
01:25  Minimizing the Risk (5)
01:39  Excess Error Decomposition (1)
03:02  Excess Error Decomposition (2)
03:31  Excess Error Decomposition (3)
03:58  Excess Error Decomposition (4)
04:22  Excess Error Decomposition (5)
04:55  Excess Error Decomposition (6)
05:13  Consequences of this Trade-Off
06:17  Inexact Proximal Methods
06:24  Non-smooth convex optimization (1)
06:43  Non-smooth convex optimization (2)
07:30  Inexact Proximal Methods (1)
08:10  Inexact Proximal Methods (2)
08:47  Inexact Proximal Methods (3)
09:16  Overview of the Algorithm
09:46  Rates of convergence for inexact proximal methods (1)
10:51  Rates of convergence for inexact proximal methods (2)
11:15  Rates of convergence for inexact proximal methods (3)
11:41  Main Contribution
11:42  Defining and Optimizing the Cost (1)
12:23  Defining and Optimizing the Cost (2)
13:06  Precision and Number of Iterations (1)
13:44  Precision and Number of Iterations (2)
14:05  Optimal Strategy (1)
15:21  Optimal Strategy (2)
15:28  Optimal Strategy (3)
16:40  Numerical Simulations
16:46  Some simulations on a TV-reg deblurring problem
19:03  Conclusion
19:05  Conclusions and Future work (1)
20:11  Conclusions and Future work (2)
20:48  Conclusions and Future work (3)