Search results for: limited memory bfgs

Number of results: 672103

2008
Adrian S. Lewis, Michael L. Overton

We investigate the BFGS algorithm with an inexact line search when applied to nonsmooth functions, not necessarily convex. We define a suitable line search and show that it generates a sequence of nested intervals containing points satisfying the Armijo and weak Wolfe conditions, assuming only absolute continuity. We also prove that the line search terminates for all semi-algebraic functions. T...
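A weak Wolfe line search of this kind is typically realized as a bisection producing exactly the nested intervals described above; the following is a minimal Python sketch under that assumption (generic constants and names, not the paper's exact routine):

```python
import numpy as np

def weak_wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection search for a step t satisfying the Armijo and weak Wolfe
    conditions along a descent direction d; each failed test shrinks or
    grows the current nested interval [lo, hi]."""
    lo, hi = 0.0, np.inf
    t = 1.0
    f0 = f(x)
    g0 = grad(x).dot(d)                            # directional derivative, must be < 0
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0:        # Armijo fails: step too long
            hi = t
        elif grad(x + t * d).dot(d) < c2 * g0:     # weak Wolfe fails: step too short
            lo = t
        else:
            return t                               # both conditions hold
        t = (lo + hi) / 2.0 if np.isfinite(hi) else 2.0 * lo
    return t
```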

Journal: Optimization Letters, 2011
M. S. Apostolopoulou, D. G. Sotiropoulos, C. A. Botsaris, Panayiotis E. Pintelas

We present a nearly-exact method for the large scale trust region subproblem (TRS) based on the properties of the minimal-memory BFGS method. Our study is concentrated on the case where the initial BFGS matrix can be any scaled identity matrix. The proposed method is a variant of the Moré-Sorensen method that exploits the eigenstructure of the approximate Hessian B, and incorporates both the st...
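For context, the Moré-Sorensen family solves the TRS through the secular equation in the eigenbasis of B. A generic Python sketch follows; note that the paper exploits the closed-form eigenstructure of a minimal-memory BFGS matrix, avoiding the full eigendecomposition done here, and the "hard case" is omitted:

```python
import numpy as np

def trs_eig(B, g, delta, iters=100):
    """Solve min g.T p + 0.5 p.T B p subject to ||p|| <= delta using the
    eigendecomposition of B (illustrative dense version; hard case omitted)."""
    w, V = np.linalg.eigh(B)
    a = V.T @ g
    if w[0] > 0:                       # try the interior (Newton) solution first
        p = V @ (-a / w)
        if np.linalg.norm(p) <= delta:
            return p
    # Bisection for the multiplier sigma >= max(0, -lambda_min) with
    # ||p(sigma)|| = delta, where p(sigma) = -(B + sigma I)^{-1} g.
    lo = max(0.0, -w[0]) + 1e-12
    hi = lo + np.linalg.norm(g) / delta + 1.0      # guarantees ||p(hi)|| <= delta
    for _ in range(iters):
        sigma = 0.5 * (lo + hi)
        if np.linalg.norm(a / (w + sigma)) > delta:
            lo = sigma
        else:
            hi = sigma
    return V @ (-a / (w + 0.5 * (lo + hi)))
```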

2000
Carmine Di Fiore, Stefano Fanelli, Filomena Lepore, Paolo Zellini

In this paper a new class of quasi-Newton methods, named LQN, is introduced in order to solve unconstrained minimization problems. The novel approach, which generalizes classical BFGS methods, is based on a Hessian updating formula involving an algebra L of matrices simultaneously diagonalized by a fast unitary transform. The complexity per step of LQN methods is O(n log n), thereby improving c...
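To see why working in such an algebra L gives O(n log n) steps, consider the toy sketch below, which keeps the Hessian approximation in the algebra of circulant matrices (diagonalized by the FFT) and updates only its n eigenvalues. The componentwise secant fit used here is our own illustration, not the paper's LQN update formula:

```python
import numpy as np

def circulant_secant_update(d, s, y, eps=1e-12):
    """Toy update of a circulant Hessian approximation with eigenvalues d
    (in the Fourier domain): componentwise least-squares fit of the secant
    condition B s = y, keeping eigenvalues real and positive."""
    s_hat = np.fft.fft(s)
    y_hat = np.fft.fft(y)
    d_new = (np.conj(s_hat) * y_hat).real / np.maximum(np.abs(s_hat) ** 2, eps)
    return np.maximum(d_new, eps)

def apply_inverse(d, g):
    """Apply the inverse of the circulant approximation in O(n log n)."""
    return np.fft.ifft(np.fft.fft(g) / d).real
```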

2018
Yong Li, Gonglin Yuan, Zhou Sheng

It is well known that the active set algorithm is very effective for smooth box constrained optimization. Many achievements have been obtained in this field. We extend the active set method to nonsmooth box constrained optimization problems, using the Moreau-Yosida regularization technique to make the objective function smooth. A limited memory BFGS method is introduced to decrease the workload...
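The limited-memory machinery referred to here is the standard L-BFGS two-loop recursion, which computes a quasi-Newton direction from only m stored pairs in O(m·n) time and memory; a textbook Python sketch (how it is wired into the smoothed objective is specific to the paper):

```python
import numpy as np

def lbfgs_direction(grad, pairs, gamma):
    """Two-loop recursion: returns -H_k @ grad using the m stored pairs
    (s_i, y_i) and the initial scaling H_0 = gamma * I."""
    q = grad.copy()
    alphas = []
    for s, y in reversed(pairs):               # newest pair first
        rho = 1.0 / y.dot(s)
        alpha = rho * s.dot(q)
        q -= alpha * y
        alphas.append((rho, alpha))
    r = gamma * q                              # apply the initial Hessian guess
    for (s, y), (rho, alpha) in zip(pairs, reversed(alphas)):
        beta = rho * y.dot(r)
        r += (alpha - beta) * s
    return -r                                  # quasi-Newton search direction
```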

Journal: Numerical Lin. Alg. with Applic., 2016
Serge Gratton, Sylvain Mercier, Nicolas Tardieu, Xavier Vasseur

This talk will focus on preconditioning techniques for solving sequences of symmetric indefinite systems A_i x_i = b_i, where the matrices A_i are assumed to change only slightly. This case often arises in Computational Science and Engineering, for example when a Newton-type method is used to solve a nonlinear problem. With this aim in mind, we study the case of a sequence with a given matrix A ∈ R^{n×n} a...
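A common way to reuse information across such a sequence is a limited-memory preconditioner (LMP) built from a few directions gathered on earlier systems. The dense Python sketch below shows the standard SPD-based construction; it is illustrative only (in practice everything is applied matrix-free, and the indefinite systems treated in the paper need additional care):

```python
import numpy as np

def lmp(A, M, S):
    """Limited-memory preconditioner from k stored directions S (n x k):
    H = P @ M @ P.T + S @ C @ S.T, with C = (S.T A S)^{-1} and
    P = I - S @ C @ (A S).T. Dense version for illustration."""
    W = A @ S
    C = np.linalg.inv(S.T @ W)              # small k x k system
    P = np.eye(A.shape[0]) - S @ C @ W.T
    return P @ M @ P.T + S @ C @ S.T
```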

Journal: SIAM Journal on Optimization, 1999
Linda Kaufman

In this paper we consider several algorithms for reducing the storage when using a quasi-Newton method in a dogleg–trust region setting for minimizing functions of many variables. Secant methods require O(n^2) locations to store an approximate Hessian and O(n^2) operations per iteration when minimizing a function of n variables. This storage requirement becomes impractical when n becomes large. O...
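The dogleg setting in question is the classical one sketched below; the paper's contribution is carrying it out with limited quasi-Newton storage rather than the explicit O(n^2) matrix B used in this textbook version:

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Classical dogleg step for min g.T p + 0.5 p.T B p, ||p|| <= delta,
    assuming B is positive definite."""
    pB = -np.linalg.solve(B, g)                 # full quasi-Newton step
    if np.linalg.norm(pB) <= delta:
        return pB
    pU = -(g.dot(g) / g.dot(B @ g)) * g         # Cauchy (steepest-descent) point
    if np.linalg.norm(pU) >= delta:
        return -(delta / np.linalg.norm(g)) * g # truncated steepest descent
    # Walk along the dogleg path from pU toward pB until it hits the boundary.
    d = pB - pU
    a, b, c = d.dot(d), 2.0 * pU.dot(d), pU.dot(pU) - delta ** 2
    tau = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return pU + tau * d
```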

Journal: IEICE Electronics Express, 2023

Quasi-Newton methods are among the most effective approaches for solving unconstrained optimization problems. This letter provides a hardware-efficient massive multiple-input multiple-output (MIMO) detector using an improved quasi-Newton (IQN) method. Owing to the similarity in stepsize calculation between the Barzilai-Borwein method and limited-memory BFGS, the two are deeply fused in the proposed IQN algorithm to achieve higher convergence speed. The corresponding ...
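The stepsize similarity referred to is easy to state: the first Barzilai-Borwein stepsize and the usual L-BFGS seed scaling are both Rayleigh-quotient estimates built from the same (s, y) pair. A small sketch (how the fusion is mapped to hardware is specific to the letter):

```python
import numpy as np

def bb1_stepsize(s, y):
    """First Barzilai-Borwein stepsize alpha_k = (s.T s) / (s.T y), with
    s = x_k - x_{k-1} and y = g_k - g_{k-1}. Compare the common L-BFGS
    initial scaling gamma_k = (s.T y) / (y.T y) built from the same pair."""
    return s.dot(s) / s.dot(y)
```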

Journal: Optimization Letters, 2015
Björn Sachsenberg, Klaus Schittkowski

We consider a combined IPM-SQP method to solve smooth nonlinear optimization problems, which may possess a large number of variables and a sparse Jacobian matrix of the constraints. Basically, the algorithm is a sequential quadratic programming (SQP) method, where the quadratic programming subproblem is solved by a primal-dual interior point method (IPM). A special feature of the algorithm is t...
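The inner workhorse of such an IPM-SQP scheme is a primal-dual Newton step on the QP subproblem's perturbed KKT conditions. A generic dense Python sketch follows (illustrative only; the paper's sparse, large-scale implementation is far more refined):

```python
import numpy as np

def ipm_qp_step(H, g, A, b, x, lam, s, mu):
    """One primal-dual interior-point step for the inequality QP
        min 0.5 x.T H x + g.T x   s.t.  A x + s = b,  s >= 0,  lam >= 0."""
    n, m = len(x), len(s)
    K = np.block([
        [H,                A.T,              np.zeros((n, m))],
        [A,                np.zeros((m, m)), np.eye(m)],
        [np.zeros((m, n)), np.diag(s),       np.diag(lam)],
    ])
    rhs = -np.concatenate([H @ x + g + A.T @ lam,   # dual feasibility
                           A @ x + s - b,           # primal feasibility
                           s * lam - mu])           # perturbed complementarity
    d = np.linalg.solve(K, rhs)
    dx, dlam, ds = d[:n], d[n:n + m], d[n + m:]
    # Fraction-to-the-boundary rule keeps s and lam strictly positive.
    z, dz = np.concatenate([s, lam]), np.concatenate([ds, dlam])
    neg = dz < 0
    alpha = min(1.0, 0.995 * np.min(-z[neg] / dz[neg])) if neg.any() else 1.0
    return x + alpha * dx, lam + alpha * dlam, s + alpha * ds
```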

2016
Hongzhou Lin, Julien Mairal, Zaid Harchaoui

We propose an approach to accelerate gradient-based optimization algorithms by giving them the ability to exploit curvature information using quasi-Newton update rules. The proposed scheme, called QuickeNing, is generic and can be applied to a large class of first-order methods such as incremental and block-coordinate algorithms; it is also compatible with composite objectives, meaning that it ...

2016
Hongzhou Lin, Julien Mairal, Zaid Harchaoui

We propose a generic approach to accelerate gradient-based optimization algorithms with quasi-Newton principles. The proposed scheme, called QuickeNing, can be applied to incremental first-order methods such as stochastic variance-reduced gradient (SVRG) or incremental surrogate optimization (MISO). It is also compatible with composite objectives, meaning that it has the ability to provide exact...
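At a high level, QuickeNing runs a quasi-Newton method on the Moreau envelope of the objective, using the wrapped first-order method to compute inexact proximal points. The sketch below captures that outer loop under stated assumptions (the `inner_solver` signature is hypothetical, and the plain gradient step used for brevity reduces to the proximal point itself; the actual scheme substitutes an L-BFGS step on the envelope):

```python
import numpy as np

def quickening_outer_loop(inner_solver, x0, kappa=1.0, iters=30):
    """Outer loop on the Moreau envelope F(x) = min_z f(z) + (kappa/2)*||z - x||^2.
    inner_solver(center, kappa) is assumed to return an approximate minimizer
    of f(z) + (kappa/2)*||z - center||^2 (e.g., a few passes of SVRG or MISO)."""
    x = x0.copy()
    for _ in range(iters):
        z = inner_solver(x, kappa)    # inexact proximal point via the inner method
        g = kappa * (x - z)           # gradient of the Moreau envelope at x
        x = x - g / kappa             # == z; swap in an L-BFGS step on F here
    return x
```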

[Chart: number of search results per year; click the chart to filter results by publication year]