Search results for: bfgs method

Number of results: 1630302

Journal: Journal of Computational Chemistry, 2003
B. Das, Hagai Meirovitch, Ionel Michael Navon

Energy minimization plays an important role in structure determination and analysis of proteins, peptides, and other organic molecules; therefore, development of efficient minimization algorithms is important. Recently, Morales and Nocedal developed hybrid methods for large-scale unconstrained optimization that interlace iterations of the limited-memory BFGS method (L-BFGS) and the Hessian-free...

2006
Tamara Gibson Kolda

The focus of this dissertation is on matrix decompositions that use a limited amount of computer memory, thereby allowing problems with a very large number of variables to be solved. Specifically, we will focus on two application areas: optimization and information retrieval. We introduce a general algebraic form for the matrix update in limited-memory quasi-Newton methods. Many well-known meth...
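
The limited-memory idea this abstract refers to can be illustrated with the standard L-BFGS two-loop recursion, which applies the implicit inverse-Hessian approximation using only the m most recent correction pairs (a generic textbook sketch, not Kolda's algebraic update form; `lbfgs_direction`, `s_hist`, and `y_hist` are illustrative names):

```python
import numpy as np

def lbfgs_direction(grad, s_hist, y_hist):
    """L-BFGS two-loop recursion: returns -H_k @ grad, where H_k is the
    implicit inverse-Hessian approximation built from the stored (s, y)
    correction pairs, using O(m n) memory instead of O(n^2).
    s_hist and y_hist are ordered oldest first, newest last."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_hist, y_hist)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in reversed(list(zip(s_hist, y_hist, rhos))):
        a = rho * (s @ q)
        q -= a * y
        alphas.append(a)
    # Initial scaling H_0 = gamma * I from the most recent pair.
    if s_hist:
        s, y = s_hist[-1], y_hist[-1]
        q *= (s @ y) / (y @ y)
    # Second loop: oldest pair to newest (alphas reversed to match).
    for (s, y, rho), a in zip(zip(s_hist, y_hist, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return -q
```

For a single stored pair, the result agrees exactly with the dense inverse BFGS update applied to a scaled identity, which is a convenient correctness check.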

Journal: CoRR, 2018
Raghu Bollapragada, Dheevatsa Mudigere, Jorge Nocedal, Hao-Jun Michael Shi, Ping Tak Peter Tang

The standard L-BFGS method relies on gradient approximations that are not dominated by noise, so that search directions are descent directions, the line search is reliable, and quasi-Newton updating yields useful quadratic models of the objective function. All of this appears to call for a full batch approach, but since small batch sizes give rise to faster algorithms with better generalization...

1996
James V. Burke, Andreas Wiegmann

The limited-memory BFGS method pioneered by Jorge Nocedal is usually implemented as a line search method where the search direction is computed from a BFGS approximation to the inverse of the Hessian. The advantage of inverse updating is that the search directions are obtained by a matrix-vector multiplication. Furthermore, experience shows that when the BFGS approximation is appropriately re-...
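
The inverse-updating scheme mentioned here is the textbook BFGS recursion on the inverse Hessian approximation; once H is updated, the search direction is just a matrix-vector product. A hedged sketch (`bfgs_inverse_update` is an illustrative name, not code from the paper):

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """One BFGS update of the inverse-Hessian approximation H, given the
    step s = x_{k+1} - x_k and gradient change y = g_{k+1} - g_k:
        H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T,
    with rho = 1 / (y^T s).  The next search direction is then the
    matrix-vector product p = -H+ @ g, as the abstract describes."""
    rho = 1.0 / (y @ s)
    I = np.eye(H.shape[0])
    V = I - rho * np.outer(y, s)
    return V.T @ H @ V + rho * np.outer(s, s)
```

The updated matrix satisfies the secant condition H+ y = s and stays symmetric, which is what makes the cheap matrix-vector search direction meaningful.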

Journal: CoRR, 2015
Nicolas Ray, Dmitry Sokolov

L-BFGS is a hill-climbing method that is guaranteed to converge only for convex problems. In computer graphics, it is often used as a black-box solver for a more general class of nonlinear problems, including problems having many local minima. Some works obtain very nice results by solving such difficult problems with L-BFGS. Surprisingly, the method is able to escape local minima: our interpr...

Journal: SIAM Journal on Optimization, 1998
Robert Mifflin, Defeng Sun, Liqun Qi

In this paper we provide implementable methods for solving nondifferentiable convex optimization problems. A typical method minimizes an approximate Moreau–Yosida regularization using a quasi-Newton technique with inexact function and gradient values which are generated by a finite inner bundle algorithm. For a BFGS bundle-type method global and superlinear convergence results for the outer ite...
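
For context, the Moreau–Yosida regularization the abstract refers to has a standard definition (this is the usual textbook form, not a formula quoted from the paper):

```latex
F_\lambda(x) \;=\; \min_{y}\left\{\, f(y) + \frac{1}{2\lambda}\,\lVert y - x\rVert^{2} \right\},
\qquad
\nabla F_\lambda(x) \;=\; \frac{1}{\lambda}\bigl(x - p_\lambda(x)\bigr),
```

where $p_\lambda(x)$ is the unique minimizer (the proximal point). Even when $f$ is nondifferentiable, $F_\lambda$ has the same minimizers as $f$ and a Lipschitz-continuous gradient, which is why a quasi-Newton outer iteration with inexact inner (bundle) evaluations is applicable.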

2009
Gonglin Yuan, Xiwen Lu, Zengxin Wei

In this paper, we propose a BFGS trust-region method for solving symmetric nonlinear equations. The global c...
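
A generic trust-region iteration with a BFGS model Hessian can be sketched as follows. This is a minimal illustration of the technique's shape (Cauchy-point subproblem, ratio test, curvature-guarded BFGS update), not the authors' algorithm; `tr_bfgs_min` and its parameters are hypothetical, and the actual-reduction estimate uses a midpoint gradient quadrature as a sketch-level shortcut (exact only for quadratics):

```python
import numpy as np

def tr_bfgs_min(grad, x0, delta=1.0, max_iter=200, tol=1e-8):
    """Trust-region iteration with a BFGS matrix B as the model Hessian,
    using the Cauchy point as a cheap approximate subproblem solution."""
    x = x0.astype(float).copy()
    B = np.eye(x.size)
    g = grad(x)
    for _ in range(max_iter):
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # Cauchy step: minimize the quadratic model along -g in the region.
        gBg = g @ B @ g
        tau = 1.0 if gBg <= 0 else min(1.0, gnorm**3 / (delta * gBg))
        p = -tau * delta / gnorm * g
        pred = -(g @ p + 0.5 * p @ B @ p)          # predicted decrease
        g_new = grad(x + p)
        ared = -0.5 * (g + g_new) @ p              # midpoint-rule estimate
        rho = ared / pred if pred > 0 else -1.0
        if rho > 0.1:                              # accept the step
            s, y = p, g_new - g
            if s @ y > 1e-10:                      # curvature guard
                Bs = B @ s
                B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)
            x, g = x + p, g_new
        delta = delta * 2.0 if rho > 0.75 else (delta * 0.5 if rho < 0.25 else delta)
    return x
```

The ratio `rho` of actual to predicted reduction drives both step acceptance and the trust-region radius, which is the mechanism that gives such methods their global convergence behavior.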

2016
Albert S. Berahas, Jorge Nocedal, Martin Takác

The question of how to parallelize the stochastic gradient descent (SGD) method has received much attention in the literature. In this paper, we focus instead on batch methods that use a sizeable fraction of the training set at each iteration to facilitate parallelism, and that employ second-order information. In order to improve the learning process, we follow a multi-batch approach in which t...
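
The multi-batch idea can be sketched as follows: when consecutive iterations use different sample batches, the curvature pair (s, y) for quasi-Newton updating is formed on the overlap of the two batches, so that y measures a change in the same sampled gradient rather than a change of batch. This is an assumption-laden sketch, not the authors' implementation; `grad_on` and `multibatch_pairs` are hypothetical names:

```python
import numpy as np

def multibatch_pairs(grad_on, x_old, x_new, batch_old, batch_new):
    """Form the quasi-Newton correction pair (s, y) on the overlap of two
    consecutive sample batches.  `grad_on(x, indices)` is an assumed
    user-supplied subsampled-gradient callback."""
    overlap = np.intersect1d(batch_old, batch_new)   # shared sample indices
    s = x_new - x_old
    # Both gradients are evaluated on the SAME overlap set, so y reflects
    # curvature of the sampled objective, not batch-to-batch noise.
    y = grad_on(x_new, overlap) - grad_on(x_old, overlap)
    return s, y
```

A consistent, nonempty overlap is what keeps the L-BFGS update stable in this setting; with disjoint batches, y would be dominated by sampling noise.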
