## Efficiency of Quasi-Newton Methods on Strictly Positive Functions

author: Yurii Nesterov, Université catholique de Louvain
published: Jan. 13, 2011, recorded: December 2010

# Slides

Slide sections of the talk:

- Outline
- Standard approach: Absolute accuracy
- Relative accuracy (RA)
- Barrier subgradient method [N.10]
- Strictly positive functions
- Simple examples
- Particular examples
- General convex functions
- Shifted general optimization problem
- Optimization problem with squared objective
- Quasi-Newton Method
- Evolution of the Hessians
- Main Lemma
- Rate of convergence
- Mixed accuracy
- Relative accuracy
- Absolute accuracy
- Choice of parameters
- Discussion
- Revival of old questions


# Description

In this talk we consider a new class of convex optimization problems that admit faster black-box optimization schemes. To analyze their rate of convergence, we introduce a notion of mixed accuracy of an approximate solution, a convenient generalization of absolute and relative accuracy. We show that for this problem class a natural Quasi-Newton method is always faster than the standard gradient method. Moreover, after an appropriate normalization, the results extend to general convex unconstrained minimization problems.
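As a hypothetical illustration of the accuracy notions and the gradient-vs-quasi-Newton comparison discussed above (this is a textbook BFGS sketch, not the specific scheme of the talk), consider a strictly positive convex function f(x) = 1 + ½ xᵀAx with known minimal value f* = 1 > 0, and measure progress against a *relative* accuracy target (f(x) − f*)/f* ≤ tol. The function, matrix `A`, and iteration counters below are all assumptions chosen for the demonstration:

```python
import numpy as np

# Strictly positive convex test function f(x) = 1 + 0.5 x^T A x, f* = 1 at x = 0.
A = np.diag([1.0, 100.0])      # ill-conditioned Hessian (condition number 100)
F_STAR = 1.0                   # known optimal value

def f(x):
    return 1.0 + 0.5 * x @ A @ x

def grad(x):
    return A @ x

def solved(x, tol=1e-8):
    # Relative-accuracy stopping rule: (f(x) - f*) / f* <= tol.
    return (f(x) - F_STAR) / F_STAR <= tol

def gradient_descent(x, max_iters=10_000):
    lr = 1.0 / 100.0           # fixed step 1/L with L = lambda_max(A)
    for k in range(max_iters):
        if solved(x):
            return k           # iterations needed to reach the target
        x = x - lr * grad(x)
    return max_iters

def bfgs(x, max_iters=10_000):
    H = np.eye(len(x))         # inverse-Hessian approximation
    g = grad(x)
    for k in range(max_iters):
        if solved(x):
            return k
        d = -H @ g
        t = -(g @ d) / (d @ A @ d)   # exact line search (f is quadratic)
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        rho = 1.0 / (y @ s)
        V = np.eye(len(x)) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)   # standard BFGS inverse update
        x, g = x_new, g_new
    return max_iters

x0 = np.array([1.0, 1.0])
gd_iters = gradient_descent(x0.copy())
bfgs_iters = bfgs(x0.copy())
```

On this ill-conditioned instance the quasi-Newton iteration reaches the relative-accuracy target in a handful of steps, while the fixed-step gradient method needs hundreds, mirroring the qualitative claim of the abstract.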