Search results for: bfgs method

Number of results: 1,630,302

2015
Yong Li Gonglin Yuan Zengxin Wei

In this paper, a trust-region algorithm is proposed for large-scale nonlinear equations, where the limited-memory BFGS (L-M-BFGS) update matrix is used in the trust-region subproblem to improve the effectiveness of the algorithm for large-scale problems. The global convergence of the presented method is established under suitable conditions. The numerical results of the test problems show that ...
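As context for this abstract (the notation below is assumed, not taken from the paper), a trust-region method computes each trial step by approximately solving a quadratic subproblem, here with the L-M-BFGS matrix playing the role of the Hessian approximation:

```latex
\min_{d \in \mathbb{R}^n} \; q_k(d) = g_k^{T} d + \tfrac{1}{2}\, d^{T} B_k d
\quad \text{subject to} \quad \|d\| \le \Delta_k ,
```

where $g_k$ is the gradient of the merit function at the current iterate, $B_k$ the limited-memory BFGS approximation, and $\Delta_k$ the trust-region radius, which is enlarged or shrunk according to how well $q_k$ predicts the actual reduction.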

2010
D. F. Shanno

The relationship between variable-metric methods derived by norm minimization and those derived by symmetrization of rank-one updates for sparse systems is studied, and an analogue of Dennis's nonsparse symmetrization formula derived. A new method of using norm minimization to produce a sparse analogue of any nonsparse variable-metric method is proposed. The sparse BFGS generated by this method...
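For reference, the dense (nonsparse) BFGS update of the inverse-Hessian approximation $H_k$, which the sparse analogues discussed in this abstract start from, reads:

```latex
H_{k+1} = \left(I - \rho_k s_k y_k^{T}\right) H_k \left(I - \rho_k y_k s_k^{T}\right) + \rho_k s_k s_k^{T},
\qquad \rho_k = \frac{1}{y_k^{T} s_k},
```

with $s_k = x_{k+1} - x_k$ and $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$. Sparse variants constrain such an update so that the approximation preserves a prescribed sparsity pattern.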

Journal: Optimization Methods and Software, 2009
A. K. Alekseev Ionel Michael Navon J. L. Steward

We compare the performance of several robust large-scale minimization algorithms for the unconstrained minimization of an ill-posed inverse problem. The parabolized Navier-Stokes equations model was used for adjoint parameter estimation. The methods compared consist of two versions of the nonlinear conjugate gradient method (CG), Quasi-Newton (BFGS), the limited memory Quasi-Newton (L-BFGS) [15...

1995
Xiaojun Chen

This paper proposes a BFGS-SQP method for linearly constrained optimization where the objective function f is only required to have a Lipschitz gradient. The KKT system of the problem is equivalent to a system of nonsmooth equations F(v) = 0. At every step a quasi-Newton matrix is updated if ‖F(v_k)‖ satisfies a rule. This method converges globally and the rate of convergence is superlinear whe...

2010
Gonglin Yuan Zengxin Wei Yanlin Wu

In this paper, we propose two limited memory BFGS algorithms with a nonmonotone line search technique for unconstrained optimization problems. The global convergence of the given methods will be established under suitable conditions. Numerical results show that the presented algorithms are more competitive than the normal BFGS method.
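A nonmonotone line search of the kind referred to in this abstract accepts a step as long as it improves on the worst of the last few objective values, rather than the most recent one. A minimal sketch in the Grippo–Lampariello–Lucidi style (all names and defaults below are illustrative assumptions, not the paper's algorithm):

```python
import numpy as np
from collections import deque

def nonmonotone_armijo(f, x, d, g, f_hist, c=1e-4, tau=0.5, max_iter=50):
    """Nonmonotone Armijo backtracking: accept step length alpha when
    f(x + alpha*d) <= max(recent f-values) + c * alpha * g'd.
    f_hist is a deque of the last few objective values (most recent last)."""
    f_ref = max(f_hist)       # nonmonotone reference value
    slope = np.dot(g, d)      # directional derivative, must be negative
    alpha = 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) <= f_ref + c * alpha * slope:
            return alpha
        alpha *= tau          # backtrack
    return alpha
```

Replacing `max(f_hist)` with `f_hist[-1]` recovers the ordinary monotone Armijo rule; the nonmonotone reference lets the iteration escape narrow curved valleys without rejecting useful long steps.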

2008
J. Vlček L. Lukšan

The Broyden class of quasi-Newton updates for inverse Hessian approximation are transformed to the formal BFGS update, which makes it possible to generalize the well-known Nocedal method based on the Strang recurrences to the scaled limited-memory Broyden family, using the same number of stored vectors as for the limited-memory BFGS method. Two variants are given; the simpler of them does not requ...

Journal: Journal of Computational and Applied Mathematics, 2009

1998
Xiaojun Chen

This paper studies convergence analysis of a preconditioned inexact Uzawa method for nondifferentiable saddle-point problems. The SOR-Newton method and the SOR-BFGS method are special cases of this method. We relax the Bramble–Pasciak–Vassilev condition on preconditioners for convergence of the inexact Uzawa method for linear saddle-point problems. The relaxed condition is used to determine the...

1996
James V. Burke Andreas Wiegmann Liang Xu

The limited memory BFGS method pioneered by Jorge Nocedal is usually implemented as a line search method where the search direction is computed from a BFGS approximation to the inverse of the Hessian. The advantage of inverse updating is that the search directions are obtained by a matrix–vector multiplication. In this paper it is observed that limited memory updates to the Hessian approximatio...
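The "inverse updating" this abstract refers to is usually realized by Nocedal's two-loop recursion, which applies the L-BFGS inverse-Hessian approximation to the gradient using only dot products and vector updates over the stored pairs. A minimal sketch (function and variable names are illustrative):

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: return d = -H_k * grad, where H_k is the L-BFGS
    inverse-Hessian approximation built from stored curvature pairs
    (s_i, y_i) = (x_{i+1} - x_i, g_{i+1} - g_i), oldest first."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    # Initial scaling H_0 = gamma * I, gamma from the newest pair.
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair to newest.
    for s, y, rho, a in zip(s_list, y_list, rhos, reversed(alphas)):
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return -r
```

With an empty memory this reduces to steepest descent (d = -grad); each stored pair costs only O(n) arithmetic per application, which is the economy the abstract highlights.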

1996
Richard H. Byrd Jorge Nocedal Ciyou Zhu

A new method for solving large nonlinear optimization problems is outlined. It attempts to combine the best properties of the discrete truncated Newton method and the limited memory BFGS method to produce an algorithm that is both economical and capable of handling ill-conditioned problems. The key idea is to use the curvature information generated during the computation of the discrete Newton st...
