Search results for: bfgs method

Number of results: 1,630,302

2017

This chapter presents the Nelder–Mead simplex method and the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method for finite-element-model updating. The methods presented have been tested on a simple beam and an unsymmetrical H-shaped structure. It was noted that, on average, the Nelder–Mead simplex method gives more accurate results than the BFGS method. This is mainly because the BFGS method r...

2001
Chengshan Xiao Jan C. Olivier Panajotis Agathoklis

A new method for designing IIR digital filters with linear phase in the passband is proposed. The method is based on frequency-weighted least-squares error optimization using the BFGS method [15]. The gradient of the cost function with respect to the design parameters, required for implementing the BFGS method, is derived. The proposed method starts by obtaining an initial IIR filt...

2012
Uwe Helmke M. Seibert

We discuss the BFGS method on Riemannian manifolds and put a special focus on the degrees of freedom which are offered by this generalization. Furthermore, we give an analysis of the BFGS method on Riemannian manifolds that are isometric to R^n.

2012
Roummel F. Marcia

A MATLAB implementation of the Moré-Sorensen sequential (MSS) method is presented. The MSS method computes the minimizer of a quadratic function defined by a limited-memory BFGS matrix subject to a two-norm trust-region constraint. This solver is an adaptation of the Moré-Sorensen direct method into an L-BFGS setting for large-scale optimization. The MSS method makes use of a recently proposed ...

2014
Asrul Hery Bin Ibrahim Mustafa Mamat

In this paper we present a new line search method, known as the HBFGS method, which uses the search direction of the conjugate gradient method with quasi-Newton updates. The Broyden–Fletcher–Goldfarb–Shanno (BFGS) update is used as an approximation of the Hessian. The new algorithm is compared with the BFGS method in terms of iteration counts and CPU time. Our numerical analysis...

2004
A. K. Alekseev I. M. Navon

We compare the performance of several robust large-scale minimization algorithms for minimizing the cost functional in ill-posed inverse problems of parameter estimation for the parabolized Navier–Stokes equations. The methods compared consist of the conjugate gradient method (CG), quasi-Newton (BFGS), the limited-memory quasi-Newton (L-BFGS) [1],...

Journal: Comp. Opt. and Appl., 2002
José Luis Morales Jorge Nocedal

This paper describes a class of optimization methods that interlace iterations of the limited-memory BFGS method (L-BFGS) and a Hessian-free Newton method (HFN) in such a way that the information collected by one type of iteration improves the performance of the other. Curvature information about the objective function is stored in the form of a limited-memory matrix and plays the dual role of preco...

Journal: SIAM Journal on Optimization, 2002
Yu-Hong Dai

The BFGS method is one of the most famous quasi-Newton algorithms for unconstrained optimization. In 1984, Powell presented an example of a function of two variables that shows that the Polak–Ribière–Polyak (PRP) conjugate gradient method and the BFGS quasi-Newton method may cycle around eight nonstationary points if each line search picks a local minimum that provides a reduction in the object...
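For readers unfamiliar with the quasi-Newton iteration these abstracts refer to, the standard BFGS update of the inverse-Hessian approximation can be sketched as follows (a minimal textbook sketch, not code from any of the papers above; the quadratic test problem and variable names are illustrative assumptions):

```python
import numpy as np

def bfgs_update(H, s, y):
    """One BFGS update of the inverse-Hessian approximation H, given the
    step s = x_{k+1} - x_k and gradient change y = g_{k+1} - g_k.
    Requires the curvature condition s @ y > 0."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Demo on a 2-D convex quadratic f(x) = 0.5 x.A.x - b.x, using an exact
# line search (available in closed form for quadratics).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b

x, H = np.zeros(2), np.eye(2)
for _ in range(10):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    p = -H @ g                          # quasi-Newton search direction
    alpha = -(g @ p) / (p @ A @ p)      # exact minimizer along p
    s = alpha * p
    x_new = x + s
    y = grad(x_new) - g
    if s @ y > 1e-12:                   # keep H positive definite
        H = bfgs_update(H, s, y)
    x = x_new

print(np.allclose(A @ x, b))            # True: x solves A x = b
```

With an exact line search on a convex quadratic, BFGS terminates in at most n iterations, which is why the short loop above suffices; Powell's example cited in the abstract shows what can go wrong when the line search only finds local reductions.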

Journal: Applied Mathematics and Computation, 2012
Jan Vlcek Ladislav Luksan

Simple modifications of the limited-memory BFGS method (L-BFGS) for large-scale unconstrained optimization are considered, consisting of corrections (derived from the idea of conjugate directions) to the difference vectors used, utilizing information from the preceding iteration. In the case of quadratic objective functions, the improvement in convergence is the best one in some sense and all st...
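For reference, the limited-memory scheme these L-BFGS papers build on stores only the m most recent (s, y) pairs and applies the inverse-Hessian approximation implicitly via the standard two-loop recursion. A generic sketch of the textbook recursion (not the authors' modified variant; names and the sanity-check values are illustrative):

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Two-loop recursion: returns the search direction -H_k @ g, where H_k
    is the limited-memory BFGS inverse-Hessian approximation built from the
    stored pairs s_i = x_{i+1} - x_i, y_i = g_{i+1} - g_i (most recent last)."""
    q = g.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_list:                          # scaled initial matrix H0 = gamma * I
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * (y @ q)
        q += (a - beta) * s
    return -q

# Sanity check: with a single stored pair, the recursion must agree with the
# explicit BFGS formula H = V (gamma I) V^T + rho s s^T, V = I - rho s y^T.
s = np.array([1.0, 0.5])
y = np.array([0.9, 0.7])
g = np.array([0.3, -1.2])
rho = 1.0 / (y @ s)
gamma = (s @ y) / (y @ y)
V = np.eye(2) - rho * np.outer(s, y)
H = V @ (gamma * np.eye(2)) @ V.T + rho * np.outer(s, s)
print(np.allclose(lbfgs_direction(g, [s], [y]), -H @ g))  # True
```

Storing only m recent pairs makes each direction computation O(mn) in time and memory, which is what makes the large-scale setting discussed in these abstracts tractable.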

Journal: SIAM Journal on Optimization, 2020
