Search results for: limited memory bfgs

Number of results: 672103

2012
Jennifer Erway

In this paper, we investigate a formula to solve systems of the form (Bk + D)x = y, where Bk comes from a limited-memory BFGS quasi-Newton method and D is a diagonal matrix with diagonal entries di,i ≥ σ for some σ > 0. These types of systems arise naturally in large-scale optimization. We show that provided a simple condition holds on B0 and σ, the system (Bk + D)x = y can be solved via a recu...
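A minimal numerical sketch of the kind of shifted system described above, assuming the quasi-Newton matrix is available in a compact low-rank form B = γI + ΨMΨᵀ (the function name and shapes are illustrative, not taken from the paper); the diagonal shift is absorbed into the diagonal part and the solve reduces to a small capacitance system via the Sherman–Morrison–Woodbury identity:

```python
import numpy as np

def solve_shifted(gamma, Psi, M, d, y):
    """Solve (B + D) x = y where B = gamma*I + Psi @ M @ Psi.T is a
    compact low-rank quasi-Newton approximation and D = diag(d),
    using the Sherman-Morrison-Woodbury identity.  Only a small
    r-by-r system is factored, never an n-by-n matrix."""
    a = gamma + d                        # diagonal of A = gamma*I + D
    Ainv_y = y / a                       # A^{-1} y (elementwise)
    Ainv_Psi = Psi / a[:, None]          # A^{-1} Psi
    # capacitance matrix: M^{-1} + Psi^T A^{-1} Psi  (size r x r)
    C = np.linalg.inv(M) + Psi.T @ Ainv_Psi
    return Ainv_y - Ainv_Psi @ np.linalg.solve(C, Psi.T @ Ainv_y)
```

The point of the low-rank route is that the cost per solve is O(nr + r³) rather than O(n³), which is what makes such shifted systems tractable at large scale.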

2015
Yong Li Gonglin Yuan Zengxin Wei

In this paper, a trust-region algorithm is proposed for large-scale nonlinear equations, where the limited-memory BFGS (L-M-BFGS) update matrix is used in the trust-region subproblem to improve the effectiveness of the algorithm for large-scale problems. The global convergence of the presented method is established under suitable conditions. The numerical results of the test problems show that ...

Journal: :Geophysical Journal International 2021

SUMMARY Full-waveform inversion has become an essential technique for mapping geophysical subsurface structures. However, proper uncertainty quantification is often lacking in current applications. In theory, uncertainty quantification is related to the inverse Hessian (or posterior covariance matrix). Even for common problems, its calculation is beyond the computational and storage capacities of the largest high-performance computing syst...

1996
JAMES V. BURKE ANDREAS WIEGMANN

The limited memory BFGS method pioneered by Jorge Nocedal is usually implemented as a line search method where the search direction is computed from a BFGS approximation to the inverse of the Hessian. The advantage of inverse updating is that the search directions are obtained by a matrix–vector multiplication. Furthermore, experience shows that when the BFGS approximation is appropriately re...
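The inverse-updating scheme mentioned above is usually realized with the standard two-loop recursion, which applies the inverse-Hessian approximation to a vector using only the stored curvature pairs; a self-contained sketch (not the authors' code):

```python
import numpy as np

def two_loop(grad, s_list, y_list):
    """L-BFGS two-loop recursion: returns H_k @ grad, the product of the
    inverse-Hessian approximation with a vector, from the stored pairs
    (s_i, y_i).  Cost is O(m*n); no n-by-n matrix is ever formed."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # first loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # standard scaling of the initial matrix H0 = gamma * I
    s, y = s_list[-1], y_list[-1]
    gamma = (s @ y) / (y @ y)
    r = gamma * q
    # second loop: oldest pair to newest
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r += (a - b) * s
    return r
```

Calling `two_loop(g, s_list, y_list)` on the current gradient `g` yields the search direction (up to sign) as a sequence of inner products and axpy updates, which is exactly the matrix–vector economy the abstract refers to.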

2015
Mitchell Stern Aryan Mokhtari

We study the problem of click-through rate (CTR) prediction, where the goal is to predict the probability that a user will click on a search advertisement given information about his issued query and account. In this paper, we formulate a model for CTR prediction using logistic regression, then assess the performance of stochastic gradient descent (SGD) and online limited-memory BFGS (oLBFGS) f...
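As a rough illustration of the two baselines compared above, here is a sketch of logistic-regression CTR fitting on synthetic data. SciPy's batch L-BFGS stands in for the online oLBFGS variant studied in the paper (which is not reproduced here), and the features are random stand-ins for real query/account features:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# synthetic click data: X are stand-in features, y are 0/1 click labels
n, d = 2000, 10
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)

def nll(w):
    # negative log-likelihood; logaddexp(0, z) = log(1 + e^z) is stable
    z = X @ w
    return np.sum(np.logaddexp(0.0, z) - y * z)

def grad(w):
    return X.T @ (1.0 / (1.0 + np.exp(-(X @ w))) - y)

# quasi-Newton baseline: scipy's batch L-BFGS implementation
res = minimize(nll, np.zeros(d), jac=grad, method="L-BFGS-B")

# plain SGD baseline: one pass per epoch over shuffled samples
w = np.zeros(d)
lr = 0.01
for epoch in range(20):
    for i in rng.permutation(n):
        xi = X[i]
        w -= lr * ((1.0 / (1.0 + np.exp(-xi @ w)) - y[i]) * xi)
```

Both optimizers drive the same convex objective down from the zero initialization; the interesting question in the paper is their behavior in the streaming setting, which this batch sketch does not capture.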

Journal: :Computational Optimization and Applications 2022

The limited memory BFGS (L-BFGS) method is one of the popular methods for solving large-scale unconstrained optimization. Since standard L-BFGS uses a line search to guarantee its global convergence, it sometimes requires a large number of function evaluations. To overcome this difficulty, we propose a new method with a certain regularization technique. We show its convergence under the usual assumptions. In order to make it mor...

Journal: :Optimization Methods and Software 2013
Wah June Leong Chuei Yee Chen

A major weakness of the limited memory BFGS (LBFGS) method is that it may converge very slowly on ill-conditioned problems when the identity matrix is used for initialization. Very often, the LBFGS method can employ a preconditioner in place of the identity matrix to speed up the convergence. For this purpose, we propose a class of diagonal preconditioners to boost the performance of the LBFGS method. In...

2009
H. AUVINEN J. M. BARDSLEY H. HAARIO

The standard formulations of the Kalman filter (KF) and extended Kalman filter (EKF) require the storage and multiplication of matrices of size n × n, where n is the size of the state space, and the inversion of matrices of size m × m, where m is the size of the observation space. Thus when both m and n are large, implementation issues arise. In this paper, we advocate the use of the limited me...

Journal: :Math. Program. 1989
Dong C. Liu Jorge Nocedal

We study the numerical performance of a limited-memory quasi-Newton method for large-scale optimization, which we call the L-BFGS method. We compare its performance with that of the method developed by Buckley and LeNir, which combines cycles of BFGS steps and conjugate direction steps. Our numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir and is better a...

2004
Richard H Byrd Peihuang Lu Jorge Nocedal Ciyou Zhu

An algorithm for solving large nonlinear optimization problems with simple bounds is described. It is based on the gradient projection method and uses a limited memory BFGS matrix to approximate the Hessian of the objective function. It is shown how to take advantage of the form of the limited memory approximation to implement the algorithm efficiently. The results of numerical tests on a set of l...
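The gradient-projection-plus-limited-memory approach described above is the basis of the widely used L-BFGS-B routine. A short usage sketch via SciPy's wrapper, on a toy quadratic chosen so the unconstrained minimum violates the bounds and the solver must land on the boundary (the objective is illustrative, not from the paper):

```python
import numpy as np
from scipy.optimize import minimize

# unconstrained minimum is at (2, -3), outside the box [0, 1] x [0, 1],
# so the bound-constrained solution sits on the corner (1, 0)
def f(x):
    return (x[0] - 2.0) ** 2 + (x[1] + 3.0) ** 2

def g(x):
    return np.array([2.0 * (x[0] - 2.0), 2.0 * (x[1] + 3.0)])

res = minimize(f, x0=np.zeros(2), jac=g, method="L-BFGS-B",
               bounds=[(0.0, 1.0), (0.0, 1.0)],
               options={"maxcor": 10})  # maxcor: number of stored pairs
```

The `maxcor` option corresponds to the limited-memory parameter m: only that many curvature pairs are kept, so memory stays O(mn) regardless of problem size.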

[Chart: number of search results per year]