Search results for: limited memory bfgs

Number of results: 672103

Journal: :CoRR 2017
Renbo Zhao William B. Haskell Vincent Y. F. Tan

We revisit the stochastic limited-memory BFGS (L-BFGS) algorithm. By proposing a new framework for analyzing convergence, we theoretically improve the (linear) convergence rates and computational complexities of the stochastic L-BFGS algorithms in previous works. In addition, we propose several practical acceleration strategies to speed up the empirical performance of such algorithms. We also pr...
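Stochastic variants like the one above replace full gradients with mini-batch gradients but keep the classical L-BFGS machinery for turning stored curvature pairs into a search direction. The sketch below shows only that standard two-loop recursion, not the authors' accelerated stochastic algorithm; the variable names are illustrative.

```python
import numpy as np

def lbfgs_direction(grad, s_pairs, y_pairs):
    """Standard L-BFGS two-loop recursion: returns -H_k @ grad, where H_k is
    the inverse-Hessian approximation implied by the stored curvature pairs
    s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i (newest pairs last)."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_pairs, y_pairs)]
    alphas = []
    # first loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_pairs), reversed(y_pairs), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q = q - a * y
    # initial scaling gamma = s'y / y'y from the most recent pair
    if s_pairs:
        gamma = np.dot(s_pairs[-1], y_pairs[-1]) / np.dot(y_pairs[-1], y_pairs[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # second loop: oldest pair to newest
    for s, y, rho, a in zip(s_pairs, y_pairs, rhos, reversed(alphas)):
        b = rho * np.dot(y, r)
        r = r + (a - b) * s
    return -r
```

A stochastic variant would evaluate grad on a mini-batch and form the (s, y) pairs from subsampled information, which is where the convergence analysis mentioned in the abstract comes in.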

Journal: :SIAM Journal on Optimization 2016
Chungen Shen Lei-Hong Zhang Wei Hong Yang

In this paper, we propose a filter active-set algorithm for the minimization problem over a product of multiple ball/sphere constraints. By making effective use of the special structure of the ball/sphere constraints, a new limited memory BFGS (L-BFGS) scheme is presented. The new L-BFGS implementation takes advantage of the sparse structure of the Jacobian of the constraints, and generates cur...
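For intuition about the feasible set in this abstract, the sketch below projects a point onto a product of Euclidean balls. It only illustrates the constraint structure; the filter active-set method and the specialized L-BFGS scheme are not reproduced, and the block sizes and radii are hypothetical inputs.

```python
import numpy as np

def project_product_of_balls(x, block_sizes, radii):
    """Project x onto a product of Euclidean balls: x is split into consecutive
    blocks, and each block is rescaled onto its ball if it lies outside."""
    x = np.asarray(x, dtype=float)
    out = np.empty_like(x)
    start = 0
    for size, r in zip(block_sizes, radii):
        block = x[start:start + size]
        norm = np.linalg.norm(block)
        out[start:start + size] = block if norm <= r else (r / norm) * block
        start += size
    return out
```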

2013
Tsung-Yi Lin Chen-Yu Lee

Logistic regression is a technique that maps input features to the posterior probability of a binary class. The optimal parameters of the regression function are obtained by maximizing the log-likelihood of the training data. In this report, we implement two optimization techniques, 1) stochastic gradient descent (SGD) and 2) limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS), to optimize the log-likelihood fu...
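As a concrete instance of this setup, the sketch below fits a logistic regression by minimizing the negative log-likelihood with SciPy's L-BFGS-B routine. The synthetic data and variable names are illustrative and not taken from the report.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(w, X, y):
    """Negative log-likelihood of logistic regression for labels y in {0, 1}."""
    z = X @ w
    # log(1 + exp(z)) - y*z, computed in a numerically safer form
    return np.sum(np.logaddexp(0.0, z) - y * z)

def grad(w, X, y):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))   # predicted probabilities
    return X.T @ (p - y)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_w = rng.normal(size=5)
y = (X @ true_w + 0.1 * rng.normal(size=200) > 0).astype(float)

res = minimize(neg_log_likelihood, np.zeros(5), args=(X, y),
               jac=grad, method="L-BFGS-B")
print(res.x)
```

A plain SGD implementation would instead loop over shuffled examples and apply per-sample gradient steps; the maximum-likelihood objective is the same in both cases.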

2008
Jin Yu

We extend the well-known BFGS quasi-Newton method and its limited-memory variant (L-BFGS) to the optimization of nonsmooth convex objectives. This is done in a rigorous fashion by generalizing three components of BFGS to subdifferentials: the local quadratic model, the identification of a descent direction, and the Wolfe line search conditions. We apply the resulting subLBFGS algorithm to L2-reg...
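One of the three components being generalized is the Wolfe line search. The sketch below states only the classical smooth Wolfe conditions, not the subdifferential version developed in the paper; the constants c1 and c2 are conventional defaults.

```python
import numpy as np

def wolfe_conditions(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    """Check the (smooth) Wolfe conditions for step length alpha along
    direction p: sufficient decrease plus the curvature condition."""
    fx, gx = f(x), grad(x)
    x_new = x + alpha * p
    decrease = f(x_new) <= fx + c1 * alpha * (gx @ p)
    curvature = grad(x_new) @ p >= c2 * (gx @ p)
    return decrease and curvature
```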

2012
Justin Domke

“Energy” models for continuous domains can be applied to many problems, but often suffer from high computational expense in training, due to the need to repeatedly minimize the energy function to high accuracy. This paper considers a modified setting, where the model is trained in terms of results after optimization is truncated to a fixed number of iterations. We derive “backpropagating” versi...
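A minimal sketch of the modified setting described here: the model's prediction is defined as the result of a fixed, small number of inner gradient steps on the energy, rather than the exact minimizer. The function names and the plain gradient-descent inner solver are assumptions; the paper's contribution, backpropagating through this truncated solve, is not shown.

```python
import numpy as np

def truncated_argmin(energy_grad, y0, theta, steps=5, step_size=0.1):
    """Approximate argmin_y E(y; theta) by a fixed number of gradient steps.
    Under truncated-optimization training, this output (not the exact
    minimizer) is what the training loss and its gradients are defined on."""
    y = np.asarray(y0, dtype=float)
    for _ in range(steps):
        y = y - step_size * energy_grad(y, theta)
    return y
```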

1997
Tamara Gibson Kolda

The focus of this dissertation is on matrix decompositions that use a limited amount of computer memory, thereby allowing problems with a very large number of variables to be solved. Specifically, we will focus on two application areas: optimization and information retrieval. We introduce a general algebraic form for the matrix update in limited-memory quasi-Newton methods. Many well-known metho...
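One widely used algebraic form for a limited-memory quasi-Newton matrix is the compact (outer-product) representation of the L-BFGS Hessian approximation due to Byrd, Nocedal, and Schnabel; the sketch below builds it densely for illustration. This is a standard special case, not the general update form introduced in the dissertation.

```python
import numpy as np

def lbfgs_compact_hessian(S, Y, gamma):
    """Dense B_k from the compact representation of the L-BFGS Hessian
    approximation with B_0 = (1/gamma) I.  Columns of S and Y hold the stored
    step and gradient-difference vectors (oldest first)."""
    n = S.shape[0]
    B0 = np.eye(n) / gamma
    StY = S.T @ Y
    D = np.diag(np.diag(StY))              # diag(S^T Y)
    L = np.tril(StY, k=-1)                 # strictly lower-triangular part of S^T Y
    W = np.hstack([B0 @ S, Y])             # n x 2m block of update vectors
    M = np.block([[S.T @ B0 @ S, L],
                  [L.T, -D]])
    return B0 - W @ np.linalg.solve(M, W.T)
```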

Journal: :Optimization Methods and Software 2009
A. K. Alekseev Ionel Michael Navon J. L. Steward

We compare the performance of several robust large-scale minimization algorithms for the unconstrained minimization of an ill-posed inverse problem. The parabolized Navier-Stokes equations model was used for adjoint parameter estimation. The methods compared consist of two versions of the nonlinear conjugate gradient method (CG), quasi-Newton (BFGS), and the limited-memory quasi-Newton (L-BFGS) [15...
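A toy version of such a comparison can be run directly with SciPy's unified minimize interface, which exposes nonlinear CG, BFGS, and L-BFGS-B. The extended Rosenbrock function stands in for the actual adjoint parameter-estimation problem, which is of course far harder and ill-posed.

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.full(50, 1.5)                       # 50-dimensional starting point
for method in ("CG", "BFGS", "L-BFGS-B"):
    res = minimize(rosen, x0, jac=rosen_der, method=method)
    print(f"{method:10s} f = {res.fun:.3e}  iterations = {res.nit}")
```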

1999
Fabrice VEERSÉ

Two approximations of the Hessian matrix as limited-memory operators are built from the limited-memory BFGS inverse Hessian approximation provided by the minimization code, in view of the specification of the inverse analysis/forecast error covariance matrix in variational data assimilation. Some numerical experiments and theoretical considerations lead to rejecting the limited-memory DFP Hessian app...
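For orientation, the sketch below builds a dense Hessian approximation by applying the direct BFGS update recursively to a small set of stored pairs, starting from a scaled identity. It illustrates the kind of limited-memory Hessian operator discussed here, not the specific constructions evaluated in the paper.

```python
import numpy as np

def limited_memory_bfgs_hessian(s_pairs, y_pairs, gamma):
    """Dense Hessian approximation from recursive direct BFGS updates,
    B_{k+1} = B_k - (B_k s s^T B_k)/(s^T B_k s) + (y y^T)/(y^T s),
    starting from B_0 = (1/gamma) I and using only the stored pairs."""
    n = len(s_pairs[0])
    B = np.eye(n) / gamma
    for s, y in zip(s_pairs, y_pairs):
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
    return B
```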

2011
Lennart Frimannslund Trond Steihaug

We present a class of methods which is a combination of the limited memory BFGS method and the truncated Newton method. Each member of the class is defined by the (possibly dynamic) number of vector pairs of the L-BFGS method and the forcing sequence of the truncated Newton method. We exemplify with a scheme which makes the hybrid method perform like the L-BFGS method far from the solution, and...
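The "forcing sequence" in this abstract is the tolerance at which the inner conjugate-gradient solve of the Newton system is truncated. The sketch below shows that inner step in isolation, with a hypothetical hess_vec Hessian-vector-product callback; it is not the hybrid L-BFGS/truncated-Newton scheme itself.

```python
import numpy as np

def truncated_newton_step(hess_vec, grad, eta, max_iter=None):
    """Approximately solve H p = -g by conjugate gradients, stopping once the
    residual drops below eta * ||g|| (the forcing-sequence tolerance).
    hess_vec(v) must return H @ v."""
    p = np.zeros_like(grad)
    r = -grad.copy()                        # residual of H p = -g at p = 0
    d = r.copy()
    tol = eta * np.linalg.norm(grad)
    for _ in range(max_iter or len(grad)):
        if np.linalg.norm(r) <= tol:
            break
        Hd = hess_vec(d)
        alpha = (r @ r) / (d @ Hd)
        p = p + alpha * d
        r_new = r - alpha * Hd
        beta = (r_new @ r_new) / (r @ r)
        d = r_new + beta * d
        r = r_new
    return p
```

Tightening eta toward zero makes the step behave like an exact Newton step, while a loose eta keeps the cost per iteration close to that of a first-order method, which is the trade-off the hybrid class exploits.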
