Search results for: limited memory bfgs

Number of results: 672,103

Journal: :SIAM J. Scientific Computing 2010
Berkant Savas Lek-Heng Lim

In this paper we propose quasi-Newton and limited-memory quasi-Newton methods for objective functions defined on Grassmann manifolds or a product of Grassmann manifolds. Specifically, we define BFGS and L-BFGS updates in local and global coordinates on Grassmann manifolds or a product of these. We prove that, when local coordinates are used, our BFGS updates on Grassmann manifolds share the s...
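
For orientation, the Euclidean BFGS update that such manifold variants generalize is the standard inverse-Hessian recursion below (generic textbook notation, not the paper's); on a Grassmann manifold the vectors s_k and y_k must additionally be transported between tangent spaces before the update is applied:

    H_{k+1} = (I - \rho_k s_k y_k^\top)\, H_k\, (I - \rho_k y_k s_k^\top) + \rho_k s_k s_k^\top,
    \qquad \rho_k = \frac{1}{y_k^\top s_k},

where s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k).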

2011
Quoc V. Le Jiquan Ngiam Adam Coates Abhik Lahiri Bobby Prochnow Andrew Y. Ng

The predominant methodology for training deep learning models advocates the use of stochastic gradient descent methods (SGDs). Despite their ease of implementation, SGDs are difficult to tune and parallelize. These problems make it challenging to develop, debug, and scale up deep learning algorithms with SGDs. In this paper, we show that more sophisticated off-the-shelf optimization methods such as Limite...
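
As a concrete illustration of this kind of off-the-shelf batch optimization, here is a minimal sketch that fits a toy logistic-regression objective with SciPy's L-BFGS-B solver; the data, shapes, and variable names are invented for the example and do not come from the paper:

    # Toy logistic regression trained with an off-the-shelf L-BFGS solver.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))            # toy design matrix
    y = np.sign(X @ rng.normal(size=10))      # labels in {-1, +1}

    def loss_and_grad(w):
        m = y * (X @ w)
        loss = np.mean(np.logaddexp(0.0, -m))             # mean log(1 + e^{-m})
        grad = X.T @ (-y / (1.0 + np.exp(m))) / len(y)    # its gradient
        return loss, grad

    res = minimize(loss_and_grad, np.zeros(10), jac=True, method="L-BFGS-B")
    print(res.fun, res.nit)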

Journal: :SIAM J. Scientific Computing 1995
Richard H. Byrd Peihuang Lu Jorge Nocedal Ciyou Zhu

An algorithm for solving large nonlinear optimization problems with simple bounds is described. It is based on the gradient projection method and uses a limited memory BFGS matrix to approximate the Hessian of the objective function. It is shown how to take advantage of the form of the limited memory approximation to implement the algorithm efficiently. The results of numerical tests on a set of l...
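
SciPy's "L-BFGS-B" method wraps the Fortran code descended from this algorithm, so a minimal usage sketch of the bound-constrained setting looks as follows (the objective and the box bounds are my own toy choices):

    # Bound-constrained minimization with L-BFGS-B (gradient projection
    # plus a limited-memory BFGS model of the Hessian).
    import numpy as np
    from scipy.optimize import minimize

    def rosenbrock(x):
        return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

    res = minimize(rosenbrock, x0=np.array([0.3, 0.2]),
                   method="L-BFGS-B", bounds=[(0.0, 0.5), (0.0, 0.5)])
    print(res.x)  # the unconstrained minimizer (1, 1) lies outside the box,
                  # so the solution sits on the boundary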

1994
Richard H. Byrd Peihuang Lu

An algorithm for solving large nonlinear optimization problems with simple bounds is described. It is based on the gradient projection method and uses a limited-memory BFGS matrix to approximate the Hessian of the objective function. We show how to take advantage of the form of the limited-memory approximation to implement the algorithm efficiently. The results of numerical tests on a set of larg...

Journal: :Frontiers in Applied Mathematics and Statistics 2021

The limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) optimization method performs very efficiently for large-scale problems. A trust region search generally performs more efficiently and robustly than a line search method, especially when the gradient of the objective function cannot be accurately evaluated. The computational cost of an L-BFGS trust region subproblem (TRS) solver depends mainly on the number of unknown variables (n) and the number of variable s...
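
For reference, the trust region subproblem (TRS) mentioned here is, in standard notation rather than the paper's own, the quadratic model minimization

    \min_{p \in \mathbb{R}^n} \; m_k(p) = f(x_k) + g_k^\top p + \tfrac{1}{2} p^\top B_k p
    \quad \text{subject to} \quad \|p\| \le \Delta_k,

where B_k would be the L-BFGS approximation to the Hessian and \Delta_k the trust region radius.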

2012
Tony Jebara Anna Choromanska

The partition function plays a key role in probabilistic modeling including conditional random fields, graphical models, and maximum likelihood estimation. To optimize partition functions, this article introduces a quadratic variational upper bound. This inequality facilitates majorization methods: optimization of complicated functions through the iterative solution of simpler s...
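
To make the object concrete: for a discrete model with parameters \theta and features f(y), the partition function and the generic shape of a quadratic upper bound on its logarithm are as follows (my illustrative notation; the article constructs its own specific curvature matrix \Sigma, which is not derived here):

    Z(\theta) = \sum_y \exp\big(\theta^\top f(y)\big), \qquad
    \log Z(\theta) \le \log Z(\tilde\theta)
        + (\theta - \tilde\theta)^\top \mu
        + \tfrac{1}{2} (\theta - \tilde\theta)^\top \Sigma\, (\theta - \tilde\theta),

valid around a current iterate \tilde\theta; majorization then repeatedly minimizes the right-hand side in place of \log Z itself.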

2001
Jean Charles Gilbert J. Nocedal

It is shown that the two-loop recursion for computing the search direction of a limited memory method for optimization can be derived by means of the reverse mode of automatic differentiation applied to an auxiliary function.
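
For readers who want the object being differentiated, here is the standard two-loop recursion in generic textbook form (an implementation sketch, not code from the paper): given the m most recent curvature pairs and the current gradient, it returns the L-BFGS search direction.

    # Standard L-BFGS two-loop recursion (textbook form).
    import numpy as np

    def two_loop_direction(g, s_list, y_list, gamma):
        """g: current gradient; s_list/y_list: past steps s_i = x_{i+1} - x_i
        and gradient differences y_i = g_{i+1} - g_i, oldest first;
        gamma: scaling of the initial matrix H_0 = gamma * I (a common
        choice is gamma = (s^T y) / (y^T y) for the most recent pair)."""
        rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
        q = g.copy()
        alphas = []
        for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
            a = rho * (s @ q)
            alphas.append(a)
            q -= a * y
        r = gamma * q                    # apply the initial matrix H_0
        for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
            b = rho * (y @ r)
            r += (a - b) * s
        return -r                        # search direction -H_k g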

2007
Nicol N. Schraudolph Jin Yu Simon Günter

We develop stochastic variants of the well-known BFGS quasi-Newton optimization method, in both full and memory-limited (LBFGS) forms, for online optimization of convex functions. The resulting algorithm performs comparably to a well-tuned natural gradient descent but is scalable to very high-dimensional problems. On standard benchmarks in natural language processing, it asymptotically outperfor...
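
A minimal sketch of the stochastic-BFGS idea as I understand it from this line of work: the curvature pair (s, y) is measured on the same minibatch at both iterates, and a damping term lambda * s keeps y^T s positive. The toy quadratic objective, constants, and names below are mine, not the paper's.

    # Online BFGS sketch: same-batch gradient differences plus damping.
    import numpy as np

    rng = np.random.default_rng(1)
    d, lam, eta = 5, 1.0, 0.1
    H = np.eye(d)                        # running inverse-Hessian estimate
    x = rng.normal(size=d)

    for t in range(100):
        A = rng.normal(size=(d, d))
        A = A @ A.T / d                  # fresh noisy PSD curvature "batch"
        g = A @ x                        # stochastic gradient at x
        s = eta * (-H @ g)               # scaled quasi-Newton step
        x_new = x + s
        y = A @ x_new - g + lam * s      # same batch at both points + damping
        rho = 1.0 / (y @ s)
        if rho > 0:                      # standard BFGS inverse update
            I = np.eye(d)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x = x_new
    print(np.linalg.norm(x))             # should shrink toward 0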

Chart: number of search results per year (click the chart to filter results by publication year)