Search results for: BFGS method

Number of results: 1630302

1994
Aiping Liao

In this paper, we propose a modified BFGS method and study the global and superlinear convergence properties of this method. We show that under certain circumstances this modified BFGS method corrects the eigenvalues better than the BFGS method does. Our numerical results support this claim and also indicate that the modified BFGS method may be competitive with the BFGS method in general. This modified me...
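For reference, the baseline that such modified variants adjust is the standard BFGS update of the inverse Hessian approximation. Below is a minimal NumPy sketch of one such update; it is purely illustrative and is not the modified method proposed in the paper above (the name bfgs_update and the curvature tolerance are our own choices).

    import numpy as np

    def bfgs_update(H, s, y):
        """One standard BFGS update of the inverse Hessian approximation H.

        s = x_{k+1} - x_k (the step), y = g_{k+1} - g_k (the gradient change).
        """
        sy = float(s @ y)
        if sy <= 1e-12:           # the curvature condition s^T y > 0 must hold;
            return H              # skip the update otherwise
        rho = 1.0 / sy
        I = np.eye(len(s))
        V = I - rho * np.outer(s, y)
        return V @ H @ V.T + rho * np.outer(s, s)

The next search direction is then -H @ g, usually followed by a Wolfe line search; modified variants typically rescale s or y before this update.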

2016
Wenbo Gao, Donald Goldfarb

We introduce a quasi-Newton method with block updates called Block BFGS. We show that this method, performed with inexact Armijo-Wolfe line searches, converges globally and superlinearly under the same convexity assumptions as BFGS. We also show that Block BFGS is globally convergent to a stationary point when applied to non-convex functions with bounded Hessian, and discuss other modifications...

1989
Dong C. Liu

We study the numerical performance of a limited memory quasi-Newton method for large scale optimization, which we call the L-BFGS method. We compare its performance with that of the method developed by Buckley and LeNir (1985), which combines cycles of BFGS steps and conjugate direction steps. Our numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir, and...
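The limited-memory idea behind L-BFGS is commonly implemented with the two-loop recursion, which applies the implicit inverse Hessian built from the last m correction pairs without ever forming a matrix. The NumPy sketch below is illustrative only and is not the authors' original code; lbfgs_direction is our own name and the scaling of H_0 is one standard convention.

    import numpy as np

    def lbfgs_direction(grad, s_list, y_list):
        """L-BFGS two-loop recursion.

        s_list, y_list hold the last m pairs s_i = x_{i+1} - x_i and
        y_i = g_{i+1} - g_i, oldest first. Returns -H_k @ grad without
        forming H_k explicitly.
        """
        q = grad.copy()
        rhos = [1.0 / float(s @ y) for s, y in zip(s_list, y_list)]
        alphas = []
        # first loop: newest pair to oldest
        for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
            a = rho * float(s @ q)
            q -= a * y
            alphas.append(a)
        # initial scaling H_0 = gamma * I (a common choice)
        if s_list:
            gamma = float(s_list[-1] @ y_list[-1]) / float(y_list[-1] @ y_list[-1])
        else:
            gamma = 1.0
        r = gamma * q
        # second loop: oldest pair to newest
        for s, y, rho, a in zip(s_list, y_list, rhos, reversed(alphas)):
            b = rho * float(y @ r)
            r += (a - b) * s
        return -r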

2001
M. Al-Baali

This paper studies recent modifications of the limited memory BFGS (L-BFGS) method for solving large scale unconstrained optimization problems. Each modification technique attempts to improve the quality of the L-BFGS Hessian by employing (extra) updates in a certain sense. Because at some iterations these updates might be redundant or worsen the quality of this Hessian, this paper proposes an upda...

2014
Shinji Sugimoto, Nobuo Yamashita

The limited-memory BFGS (L-BFGS) algorithm is a popular method for solving large-scale unconstrained minimization problems. Since L-BFGS conducts a line search with the Wolfe condition, it may require many function evaluations for ill-posed problems. To overcome this difficulty, we propose a method that combines L-BFGS with the regularized Newton method. The computational cost for a single iterat...
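For a concrete sense of the line-search cost mentioned here, a standard (unregularized) limited-memory BFGS run can be reproduced with SciPy; the result object reports both the iteration count and the number of function evaluations consumed by the line search. This is ordinary L-BFGS-B, not the regularized method proposed in the paper.

    import numpy as np
    from scipy.optimize import minimize, rosen, rosen_der

    x0 = np.full(10, -1.0)
    res = minimize(rosen, x0, jac=rosen_der, method="L-BFGS-B",
                   options={"maxcor": 10})   # maxcor = number of stored (s, y) pairs
    print(res.nit, res.nfev, res.fun)        # iterations vs. function evaluations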

Journal: Math. Program., 1989
Dong C. Liu, Jorge Nocedal

We study the numerical performance of a limited memory quasi-Newton method for large scale optimization, which we call the L-BFGS method. We compare its performance with that of the method developed by Buckley and LeNir, which combines cycles of BFGS steps and conjugate direction steps. Our numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir, and is better a...

2013
A. S. Lewis, S. Zhang

This paper investigates the potential behavior, both good and bad, of the well-known BFGS algorithm for smooth minimization, when applied to nonsmooth functions. We consider three very particular examples. We first present a simple nonsmooth example, illustrating how BFGS (in this case with an exact line search) typically succeeds despite nonsmoothness. We then study, computationally, the behav...
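A quick way to reproduce the flavor of such an experiment is to hand a simple nonsmooth function to an off-the-shelf BFGS implementation. The snippet below uses SciPy's BFGS with its default inexact line search and finite-difference gradients, not the exact line search of the paper's example, and the test function is our own; the solver typically drives the iterates close to the kink before stopping, possibly with a line-search warning.

    import numpy as np
    from scipy.optimize import minimize

    # f is smooth everywhere except along x[0] = 0, where the minimizer lies.
    f = lambda x: abs(x[0]) + 10.0 * x[1] ** 2
    res = minimize(f, np.array([1.3, 0.7]), method="BFGS")
    print(res.x, res.fun, res.message)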

Journal: J. Complexity, 2002
Mehiddin Al-Baali

This paper studies recent modifications of the limited memory BFGS (L-BFGS) method for solving large scale unconstrained optimization problems. Each modification technique attempts to improve the quality of the L-BFGS Hessian by employing (extra) updates in a certain sense. Because at some iterations these updates might be redundant or worsen the quality of this Hessian, this paper proposes an ...

2012
R. Jaafar, M. Mamat

The BFGS method is a method for solving unconstrained optimization problems, and many modifications of it have been proposed. In this paper, we present a new scaled hybrid modified BFGS. The new scaled hybrid modified BFGS algorithms are proposed and analyzed. The scaled hybrid modified BFGS can reduce the number of iterations. Results obtained by the hybrid modified BFGS algorithms are com...

Journal: J. Optimization Theory and Applications, 2014
Mehiddin Al-Baali, Lucio Grandinetti, Ornella Pisacane

This paper aims to extend a certain damped technique, suitable for the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method, to the limited memory BFGS method in the case of large-scale unconstrained optimization. It is shown that the proposed technique maintains the global convergence property on uniformly convex functions for the limited memory BFGS method. Some numerical results are descri...
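The damped technique referred to here is in the spirit of Powell's modification, which replaces the gradient-difference vector y by a convex combination of y and B s whenever the raw curvature s^T y is too small, so that the (L-)BFGS update stays well defined. The sketch below is a generic version of that idea, not necessarily the exact rule used in the paper; the names powell_damped_pair and sigma are ours.

    import numpy as np

    def powell_damped_pair(s, y, B, sigma=0.2):
        """Damp the pair (s, y) so that s^T y_bar >= sigma * s^T B s."""
        sBs = float(s @ B @ s)
        sy = float(s @ y)
        if sy >= sigma * sBs:
            theta = 1.0
        else:
            theta = (1.0 - sigma) * sBs / (sBs - sy)
        return theta * y + (1.0 - theta) * (B @ s)   # y_bar, used in place of y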

[Chart: number of search results per year]