Search results for: Limited-Memory BFGS

Number of results: 672103

Journal: Applied Mathematics Letters, 2002

Journal: J. Optimization Theory and Applications, 2014
Mehiddin Al-Baali, Lucio Grandinetti, Ornella Pisacane

This paper aims to extend a certain damped technique, suitable for the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method, to the limited-memory BFGS method for large-scale unconstrained optimization. It is shown that the proposed technique preserves the global convergence property of the limited-memory BFGS method on uniformly convex functions. Some numerical results are descri...

1996
James V. Burke, Andreas Wiegmann, Liang Xu

The limited memory BFGS method pioneered by Jorge Nocedal is usually implemented as a line search method where the search direction is computed from a BFGS approximation to the inverse of the Hessian. The advantage of inverse updating is that the search directions are obtained by a matrix–vector multiplication. In this paper it is observed that limited memory updates to the Hessian approximatio...
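The inverse-updating scheme this abstract describes is usually realized with the standard two-loop recursion, which forms the search direction from stored (s, y) pairs using only vector operations. A minimal pure-Python sketch of that recursion (an illustration, not code from the paper):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: returns -H*grad, where H is the L-BFGS
    approximation to the inverse Hessian built from the stored pairs
    s_k = x_{k+1} - x_k and y_k = grad_{k+1} - grad_k."""
    q = list(grad)
    alphas = []
    # backward pass over the stored pairs, newest first
    for s, y in reversed(list(zip(s_list, y_list))):
        a = dot(s, q) / dot(y, s)
        alphas.append(a)
        q = [qi - a * yi for qi, yi in zip(q, y)]
    # initial H0 = gamma * I, with the usual curvature-based scaling
    gamma = dot(s_list[-1], y_list[-1]) / dot(y_list[-1], y_list[-1]) if s_list else 1.0
    r = [gamma * qi for qi in q]
    # forward pass, oldest first
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        b = dot(y, r) / dot(y, s)
        r = [ri + (a - b) * si for ri, si in zip(r, s)]
    return [-ri for ri in r]
```

With an empty memory the direction reduces to steepest descent, -grad; with pairs drawn from a quadratic whose Hessian is diag(2, 8) and the coordinate directions as steps, the recursion reproduces the exact Newton direction.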

Journal: SIAM Journal on Optimization, 2003
Philip E. Gill Michael W. Leonard

Limited-memory BFGS quasi-Newton methods approximate the Hessian matrix of second derivatives by the sum of a diagonal matrix and a fixed number of rank-one matrices. These methods are particularly effective for large problems in which the approximate Hessian cannot be stored explicitly. It can be shown that the conventional BFGS method accumulates approximate curvature in a sequence of expandi...
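A diagonal-plus-low-rank approximate Hessian of the kind described above can be applied to a vector in O(nm) time without ever forming the n-by-n matrix. A small sketch of that storage scheme (the generic representation B = D + sum_i sigma_i * u_i u_i^T, not the authors' specific update):

```python
def apply_diag_plus_rank_ones(diag, terms, v):
    """Compute B*v where B = diag(d) + sum_i sigma_i * u_i * u_i^T,
    stored implicitly as the diagonal entries plus (sigma_i, u_i) pairs."""
    out = [d * vi for d, vi in zip(diag, v)]          # diagonal part: D*v
    for sigma, u in terms:
        coeff = sigma * sum(ui * vi for ui, vi in zip(u, v))  # sigma * (u . v)
        out = [o + coeff * ui for o, ui in zip(out, u)]       # rank-one part
    return out
```

Each rank-one term costs two dot-product-length passes over the vector, so m stored terms cost O(nm) total, which is the point of such limited-memory representations.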

Journal: SIAM Journal on Optimization, 1998
Tamara G. Kolda, Dianne P. O'Leary, Larry Nazareth

We give conditions under which limited-memory quasi-Newton methods with exact line searches will terminate in n steps when minimizing n-dimensional quadratic functions. We show that although all Broyden family methods terminate in n steps in their full-memory versions, only BFGS does so with limited-memory. Additionally, we show that full-memory Broyden family methods with exact line searches t...
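The n-step termination property can be checked numerically: on a strictly convex quadratic, the exact line-search step length has a closed form, and limited-memory BFGS then reaches the minimizer in n iterations. A self-contained sketch on an illustrative 3-by-3 SPD Hessian (the matrix and right-hand side below are made-up test data, not from the paper):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, v):
    return [dot(row, v) for row in A]

def two_loop(g, S, Y):
    # standard L-BFGS two-loop recursion: returns the direction -H*g
    q, alphas = list(g), []
    for s, y in reversed(list(zip(S, Y))):
        a = dot(s, q) / dot(y, s)
        alphas.append(a)
        q = [qi - a * yi for qi, yi in zip(q, y)]
    gamma = dot(S[-1], Y[-1]) / dot(Y[-1], Y[-1]) if S else 1.0
    r = [gamma * qi for qi in q]
    for (s, y), a in zip(zip(S, Y), reversed(alphas)):
        b = dot(y, r) / dot(y, s)
        r = [ri + (a - b) * si for ri, si in zip(r, s)]
    return [-ri for ri in r]

def minimize_quadratic(A, b, n_steps):
    # minimize f(x) = 0.5 x.Ax - b.x with exact line searches
    x = [0.0] * len(b)
    S, Y = [], []
    for _ in range(n_steps):
        g = [gi - bi for gi, bi in zip(matvec(A, x), b)]   # gradient A*x - b
        d = two_loop(g, S, Y)
        alpha = -dot(g, d) / dot(d, matvec(A, d))          # exact minimizer along d
        s = [alpha * di for di in d]
        x = [xi + si for xi, si in zip(x, s)]
        S.append(s)
        Y.append(matvec(A, s))                             # on a quadratic, y = A*s
    return x

A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 0.5],
     [0.0, 0.5, 2.0]]
b = [1.0, 2.0, 3.0]
x = minimize_quadratic(A, b, 3)   # n = 3 steps suffice
residual = [gi - bi for gi, bi in zip(matvec(A, x), b)]
```

After exactly n = 3 iterations the gradient residual A*x - b is zero to machine precision, consistent with the termination result for BFGS-type limited-memory updates.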

Journal: Journal of Machine Learning Research, 2015
Aryan Mokhtari, Alejandro Ribeiro

Global convergence is established for an online (stochastic) limited-memory version of the Broyden–Fletcher–Goldfarb–Shanno (BFGS) quasi-Newton method applied to optimization problems with stochastic objectives that arise in large-scale machine learning. Lower and upper bounds on the Hessian eigenvalues of the sample functions are shown to suffice to guarantee that the curvature approximation ma...

2001
Luis Morales

The application of quasi-Newton methods is widespread in numerical optimization. Independently of the application, the techniques used to update the BFGS matrices seem to play an important role in the performance of the overall method. In this paper we address precisely this issue. We compare two implementations of the limited-memory BFGS method for large-scale unconstrained problems. They differ...

2005
Leong Wah, Malik Abu Hassan, M. A. Hassan

In this paper we present two new numerical methods for unconstrained large-scale optimization. These methods apply update formulae, which are derived by considering different techniques of approximating the objective function. Theoretical analysis is given to show the advantages of using these update formulae. It is observed that these update formulae can be employed within the framework of lim...

[Chart: number of search results per publication year]