Search results for: limited memory bfgs

Number of results: 672103

2004
A. K. Alekseev, I. M. Navon

We compare the performance of several robust large-scale minimization algorithms applied to the minimization of the cost functional in the solution of ill-posed inverse problems of parameter estimation for the parabolized Navier-Stokes equations. The methods compared are the conjugate gradient method (CG), quasi-Newton (BFGS), and limited-memory quasi-Newton (L-BFGS) [1],...
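As a rough illustration of the kind of comparison this abstract describes, the sketch below runs SciPy's CG, BFGS, and L-BFGS-B implementations on a standard test problem; the Rosenbrock function is our stand-in, not the paper's Navier-Stokes cost functional.

    # Illustrative comparison of CG, BFGS, and L-BFGS on a toy problem
    # (the Rosenbrock function stands in for the paper's cost functional).
    import numpy as np
    from scipy.optimize import minimize, rosen, rosen_der

    x0 = np.full(50, -1.2)  # a moderately sized starting point
    for method in ("CG", "BFGS", "L-BFGS-B"):
        res = minimize(rosen, x0, jac=rosen_der, method=method)
        print(f"{method:8s}  f*={res.fun:.3e}  nfev={res.nfev}")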

2016
Noriyuki Kushida

A new Newton-Raphson method based preconditioner for Krylov type linear equation solvers for GPGPU is developed, and the performance is investigated. Conventional preconditioners improve the convergence of Krylov type solvers, and perform well on CPUs. However, they do not perform well on GPGPUs, because of the complexity of implementing powerful preconditioners. The developed preconditioner is...
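For background, the sketch below shows where a preconditioner enters a Krylov solver, using SciPy's conjugate gradient on the CPU; the simple Jacobi (diagonal) preconditioner here is a placeholder, not the Newton-Raphson-based preconditioner the abstract develops.

    # Preconditioned conjugate gradient: the operator M approximates A^{-1}.
    # A Jacobi (diagonal) preconditioner stands in for the paper's method.
    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import cg, LinearOperator

    n = 1000
    A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
    b = np.ones(n)
    d = A.diagonal()
    M = LinearOperator((n, n), matvec=lambda r: r / d)
    x, info = cg(A, b, M=M, atol=1e-8)
    print("converged" if info == 0 else f"cg returned info={info}")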

Journal: Evolutionary Computation, 2017
Ilya Loshchilov

Limited-memory BFGS (L-BFGS; Liu and Nocedal, 1989) is often considered the method of choice for continuous optimization when first- or second-order information is available. However, the use of L-BFGS can be complicated in a black-box scenario where gradient information is not available and must therefore be numerically estimated. The accuracy of this estimation, obtained by finite di...
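The black-box complication is easy to reproduce with SciPy: when no analytic gradient is passed, L-BFGS-B estimates one by finite differences, and the eps option below controls the step size of that estimate (a minimal sketch, not the paper's setup).

    # L-BFGS-B without an analytic gradient: SciPy estimates it by finite
    # differences, whose step size is controlled by the `eps` option.
    import numpy as np
    from scipy.optimize import minimize, rosen

    x0 = np.full(10, -1.2)
    res = minimize(rosen, x0, method="L-BFGS-B",
                   options={"eps": 1e-8, "maxiter": 500})
    print(res.fun, res.nit)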

2012
Roummel F. Marcia

A MATLAB implementation of the Moré-Sorensen sequential (MSS) method is presented. The MSS method computes the minimizer of a quadratic function defined by a limited-memory BFGS matrix subject to a two-norm trust-region constraint. This solver is an adaptation of the Moré-Sorensen direct method to an L-BFGS setting for large-scale optimization. The MSS method makes use of a recently proposed ...
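In common notation (a background sketch, not quoted from the paper), the subproblem the MSS method solves at iteration k is

    \min_{p \in \mathbb{R}^n} \; g_k^{\top} p + \tfrac{1}{2}\, p^{\top} B_k p \quad \text{subject to} \quad \|p\|_2 \le \Delta_k,

where g_k is the gradient, B_k the limited-memory BFGS Hessian approximation, and \Delta_k the trust-region radius.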

2013
Gonglin Yuan, Zengxin Wei, Yong Li

In this paper, a trust-region algorithm combined with the limited-memory BFGS (L-BFGS) update is proposed for solving nonlinear equations, where the super-relaxation technique (SRT) is used. We choose the next iteration point by SRT. Global convergence without the nondegeneracy assumption is obtained under suitable conditions. Numerical results show that this method is very effective for la...
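In the standard setup (a sketch of the usual formulation, which may differ from the paper's exact notation), a nonlinear system F(x) = 0 with F: \mathbb{R}^n \to \mathbb{R}^n is recast as the least-squares problem

    \min_{x \in \mathbb{R}^n} \; f(x) = \tfrac{1}{2} \|F(x)\|_2^2,

to which trust-region steps built on an L-BFGS model of the Hessian can then be applied.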

2008

We extend the well-known BFGS quasi-Newton method and its limited-memory variant (L-BFGS) to the optimization of nonsmooth convex objectives. This is done in a rigorous fashion by generalizing three components of BFGS to subdifferentials: the local quadratic model, the identification of a descent direction, and the Wolfe line search conditions. We apply the resulting sub(L)BFGS algorithm to L2-re...
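The descent-direction component mentioned above rests on a standard fact of convex analysis (stated as background, not quoted from the paper): for convex f, the directional derivative at x along p is

    f'(x; p) = \sup_{g \in \partial f(x)} g^{\top} p,

so p is a descent direction exactly when this supremum is negative; the single gradient of the smooth theory is replaced by the whole subdifferential \partial f(x).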

Journal: Applied Mathematics and Computation, 2012
Jan Vlcek, Ladislav Luksan

Simple modifications of the limited-memory BFGS method (L-BFGS) for large-scale unconstrained optimization are considered, which consist of corrections (derived from the idea of conjugate directions) to the difference vectors used, utilizing information from the preceding iteration. In the case of quadratic objective functions, the improvement in convergence is the best possible in a certain sense, and all st...
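For reference, the difference vectors in question are the standard L-BFGS pairs (common notation, assumed here)

    s_k = x_{k+1} - x_k, \qquad y_k = \nabla f(x_{k+1}) - \nabla f(x_k),

which drive the BFGS update of the Hessian approximation

    B_{k+1} = B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k} + \frac{y_k y_k^{\top}}{y_k^{\top} s_k}.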

1989
Dong C. Liu

We study the numerical performance of a limited-memory quasi-Newton method for large-scale optimization, which we call the L-BFGS method. We compare its performance with that of the method developed by Buckley and LeNir (1985), which combines cycles of BFGS steps and conjugate direction steps. Our numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir, and...
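The L-BFGS method studied here is usually implemented via the two-loop recursion; the sketch below follows the textbook formulation (our own minimal rendering, not code from the paper).

    # Two-loop recursion: applies the implicit inverse-Hessian approximation
    # built from the m most recent (s, y) pairs to a gradient g.
    import numpy as np

    def lbfgs_direction(g, s_list, y_list):
        """Return -H*g, where H is the implicit L-BFGS inverse Hessian."""
        q = g.copy()
        alphas = []
        # First loop: newest pair to oldest.
        for s, y in zip(reversed(s_list), reversed(y_list)):
            rho = 1.0 / y.dot(s)
            alpha = rho * s.dot(q)
            q -= alpha * y
            alphas.append((rho, alpha))
        # Initial scaling gamma = s^T y / y^T y from the newest pair.
        if s_list:
            s, y = s_list[-1], y_list[-1]
            q *= s.dot(y) / y.dot(y)
        # Second loop: oldest pair to newest.
        for (rho, alpha), (s, y) in zip(reversed(alphas),
                                        zip(s_list, y_list)):
            beta = rho * y.dot(q)
            q += (alpha - beta) * s
        return -q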
