A class of diagonal preconditioners for limited memory BFGS method
Authors
Abstract
A major weakness of the limited memory BFGS (LBFGS) method is that it may converge very slowly on ill-conditioned problems when the identity matrix is used for initialization. The LBFGS method can often apply a preconditioner to the identity matrix to speed up convergence. For this purpose, we propose a class of diagonal preconditioners to boost the performance of the LBFGS method. In this context, we find it appropriate to use a diagonal preconditioner in the form of a diagonal matrix plus a positive multiple of the identity matrix, so as to fit local Hessian information while inducing positive definiteness of the preconditioner as a whole. Hereditary positive definiteness is maintained by a careful choice of the positive scalar on the scaled identity matrix, while local curvature information is carried implicitly in the other diagonal matrix through variational techniques commonly employed in the derivation of quasi-Newton updates. Several preconditioning formulae are then derived and tested on a large set of standard test problems to assess the impact of different choices of such preconditioners on the minimization performance.
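As a rough illustration of how such a preconditioner enters the method, the sketch below seeds the standard L-BFGS two-loop recursion with an initial matrix of the form B_0 = E + mu*I, where E is diagonal and mu > 0, instead of the usual scaled identity. The names E, mu, s_list and y_list are illustrative placeholders; the specific preconditioning formulae proposed in the paper are not reproduced here.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list, E, mu):
    """Return the L-BFGS direction -H_k @ grad, where the initial inverse
    Hessian approximation is H_0 = (E + mu*I)^{-1} with E diagonal, mu > 0."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: most recent (s, y) pair first.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * (s @ q)
        alphas.append(alpha)
        q = q - alpha * y
    # Apply the diagonal preconditioner H_0 = (E + mu*I)^{-1} element-wise;
    # E + mu > 0 keeps the initial matrix positive definite.
    r = q / (E + mu)
    # Second loop: oldest pair first.
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * (y @ r)
        r = r + (alpha - beta) * s
    return -r
```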
Similar resources
Newton–Raphson preconditioner for Krylov type solvers on GPU devices
A new preconditioner based on the Newton-Raphson method is developed for Krylov-type linear equation solvers on GPGPUs, and its performance is investigated. Conventional preconditioners improve the convergence of Krylov-type solvers and perform well on CPUs. However, they do not perform well on GPGPUs because of the complexity of implementing powerful preconditioners. The developed preconditioner is...
Solving Limited-Memory BFGS Systems with Generalized Diagonal Updates
In this paper, we investigate a formula to solve systems of the form (B_k + D)x = y, where B_k comes from a limited-memory BFGS quasi-Newton method and D is a diagonal matrix with diagonal entries d_{i,i} ≥ σ for some σ > 0. These types of systems arise naturally in large-scale optimization. We show that provided a simple condition holds on B_0 and σ, the system (B_k + D)x = y can be solved via a recu...
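For illustration only, the following sketch shows one standard way systems of this diagonal-plus-low-rank type can be handled, assuming B_k is available in a compact form B_k = gamma*I + U C Uᵀ. It uses the Sherman-Morrison-Woodbury identity rather than the specific recursion investigated in the cited paper, and all names (gamma, d, U, C) are placeholders.

```python
import numpy as np

def solve_diag_plus_lowrank(gamma, d, U, C, y):
    """Solve (gamma*I + diag(d) + U @ C @ U.T) x = y via the
    Sherman-Morrison-Woodbury identity, assuming gamma + d > 0 element-wise."""
    a = gamma + d                      # diagonal of A = gamma*I + diag(d)
    Ainv_y = y / a
    Ainv_U = U / a[:, None]
    # (A + U C U^T)^{-1} = A^{-1} - A^{-1} U (C^{-1} + U^T A^{-1} U)^{-1} U^T A^{-1}
    small = np.linalg.inv(C) + U.T @ Ainv_U
    return Ainv_y - Ainv_U @ np.linalg.solve(small, U.T @ Ainv_y)
```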
Limited-Memory Reduced-Hessian Methods for Large-Scale Unconstrained Optimization
Limited-memory BFGS quasi-Newton methods approximate the Hessian matrix of second derivatives by the sum of a diagonal matrix and a fixed number of rank-one matrices. These methods are particularly effective for large problems in which the approximate Hessian cannot be stored explicitly. It can be shown that the conventional BFGS method accumulates approximate curvature in a sequence of expandi...
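As a small numerical illustration of the first sentence above (not taken from the cited paper), the snippet below applies a few direct BFGS updates to a diagonal initial matrix B_0 with synthetic placeholder data and checks that the resulting approximation differs from B_0 by a matrix of rank at most twice the number of updates.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 3
B0 = np.diag(rng.uniform(1.0, 2.0, n))   # diagonal initial Hessian approximation
B = B0.copy()
for _ in range(m):
    s = rng.standard_normal(n)
    y = B0 @ s + 0.1 * rng.standard_normal(n)   # synthetic curvature pair
    if s @ y <= 0:
        continue                                # skip pairs violating s^T y > 0
    Bs = B @ s
    # Direct BFGS update: subtract one rank-one term, add another.
    B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)
# B - B0 is a sum of at most 2*m rank-one matrices.
print(np.linalg.matrix_rank(B - B0))            # prints a value <= 2*m
```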
Transformations enabling to construct limited-memory Broyden class methods
The Broyden class of quasi-Newton updates for inverse Hessian approximation is transformed to the formal BFGS update, which makes it possible to generalize the well-known Nocedal method based on the Strang recurrences to the scaled limited-memory Broyden family, using the same number of stored vectors as for the limited-memory BFGS method. Two variants are given; the simpler of them does not requ...
BFGS-like updates of constraint preconditioners for sequences of KKT linear systems
We focus on efficient preconditioning techniques for sequences of KKT linear systems arising from the interior point solution of large convex quadratic programming problems. Constraint Preconditioners (CPs), though very effective in accelerating Krylov methods in the solution of KKT systems, have a very high computational cost in some instances, because their factorization may be the most time-...
Journal: Optimization Methods and Software
Volume: 28, Issue: -
Pages: -
Year of publication: 2013