Search results for: projected structured hessian update

Number of results: 231982  

Journal: Math. Program. 2006
Yu-Hong Dai, Roger Fletcher

There are many applications related to singly linearly constrained quadratic programs subject to upper and lower bounds. In this paper, a new algorithm based on secant approximation is provided for the case in which the Hessian matrix is diagonal and positive definite. To deal with the general case where the Hessian is not diagonal, a new efficient projected gradient algorithm is proposed. Th...
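As a rough illustration of the projected gradient idea for the box-constrained case, the sketch below applies a fixed-step projected gradient iteration to a bound-constrained quadratic. It drops the single linear constraint that the paper's algorithm also handles, and all names are hypothetical; this is not the Dai-Fletcher method itself.

```python
import numpy as np

def projected_gradient_qp(H, c, lo, hi, x0, iters=200):
    """Minimize 0.5*x'Hx + c'x subject to lo <= x <= hi by projected
    gradient (a generic sketch, not the paper's algorithm)."""
    # A step length of 1/lambda_max(H) guarantees descent for PD H.
    step = 1.0 / np.linalg.eigvalsh(H)[-1]
    x = np.clip(x0, lo, hi)
    for _ in range(iters):
        g = H @ x + c                      # gradient of the quadratic
        x = np.clip(x - step * g, lo, hi)  # step, then project onto the box
    return x
```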

Journal: SIAM Journal on Optimization 2007
Chuanhai Liu, Scott A. Vander Wiel

A new method for quasi-Newton minimization outperforms BFGS by combining least-change updates of the Hessian with step sizes estimated from a Wishart model of uncertainty. The Hessian update is in the Broyden family but uses a negative parameter, outside the convex range that is usually regarded as the safe zone for Broyden updates. Although full Newton steps based on this update tend to be to...
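For reference, the one-parameter Broyden family the abstract refers to can be written down directly: phi = 0 gives BFGS and phi = 1 gives DFP, and the paper's point is that a suitable negative phi, outside the convex class [0, 1], can work well. A minimal numpy sketch:

```python
import numpy as np

def broyden_update(B, s, y, phi):
    """One-parameter Broyden family update of a Hessian approximation:
    phi = 0 is BFGS, phi = 1 is DFP; phi in [0, 1] is the 'convex class'."""
    Bs = B @ s
    sBs = s @ Bs
    ys = y @ s
    v = y / ys - Bs / sBs
    return (B - np.outer(Bs, Bs) / sBs
              + np.outer(y, y) / ys
              + phi * sBs * np.outer(v, v))
```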

2011
YoungSuk Bang, Hany S. Abdel-Khalik

When nonlinear behavior must be considered in sensitivity analysis studies, one needs to approximate higher-order derivatives of the response of interest with respect to all input data. This paper presents an application of a general reduced-order method to constructing such higher-order derivatives. In particular, we apply the method to constru...
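One generic way to probe a second-order response derivative from function evaluations alone, loosely in the spirit of building higher-order sensitivities without forming them explicitly, is a mixed central difference. The helper below is an illustrative assumption, not the paper's reduced-order method.

```python
import numpy as np

def bilinear_hessian_fd(f, x, u, v, h=1e-4):
    """Approximate u' (grad^2 f(x)) v with a mixed central difference of
    plain function values, O(h^2) accurate."""
    return (f(x + h*u + h*v) - f(x + h*u - h*v)
            - f(x - h*u + h*v) + f(x - h*u - h*v)) / (4 * h**2)
```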

Journal: SIAM Journal on Optimization 1999
Linda Kaufman

In this paper we consider several algorithms for reducing the storage when using a quasi-Newton method in a dogleg–trust region setting for minimizing functions of many variables. Secant methods require O(n^2) locations to store an approximate Hessian and O(n^2) operations per iteration when minimizing a function of n variables. This storage requirement becomes impractical when n becomes large. O...
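The classic way to cut the O(n^2) cost described here is a limited-memory representation. The standard L-BFGS two-loop recursion below computes a quasi-Newton direction from the m most recent (s, y) pairs in O(mn) storage; it is a textbook illustration of the storage-reduction idea, not Kaufman's specific dogleg-trust-region scheme.

```python
import numpy as np

def lbfgs_direction(grad, s_hist, y_hist):
    """Two-loop recursion: implicitly apply the inverse L-BFGS matrix
    built from the stored (s, y) pairs to `grad`, returning a search
    direction in O(m*n) time and storage."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_hist), reversed(y_hist)):   # newest first
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if s_hist:                                             # initial scaling
        s, y = s_hist[-1], y_hist[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(zip(s_hist, y_hist), reversed(alphas)):  # oldest first
        b = (y @ q) / (y @ s)
        q += (a - b) * s
    return -q
```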

Journal: Comp. Opt. and Appl. 2015
Anders Forsgren, Tove Odland

It is well known that the conjugate gradient method and a quasi-Newton method, using any well-defined update matrix from the one-parameter Broyden family of updates, produce the same iterates on a quadratic problem with positive-definite Hessian. This equivalence does not hold for any quasi-Newton method. We discuss more precisely the conditions on the update matrix that give rise to this behav...
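The equivalence is easy to check numerically: on a positive-definite quadratic, conjugate gradient and BFGS with identity initial matrix and exact line searches produce the same iterates. The self-contained check below assumes this standard setting, which is one instance of the conditions the paper analyzes.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
H = A @ A.T + 5 * np.eye(5)          # positive-definite Hessian
b = rng.standard_normal(5)

def cg_iterates(H, b, x, iters):
    """Conjugate gradient on f(x) = 0.5 x'Hx - b'x, recording iterates."""
    r = H @ x - b                    # gradient
    p = -r
    xs = [x.copy()]
    for _ in range(iters):
        a = (r @ r) / (p @ H @ p)
        x = x + a * p
        r_new = r + a * (H @ p)
        p = -r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
        xs.append(x.copy())
    return xs

def bfgs_iterates(H, b, x, iters):
    """BFGS with B0 = I and exact line search on the same quadratic."""
    B = np.eye(len(x))
    xs = [x.copy()]
    for _ in range(iters):
        g = H @ x - b
        p = np.linalg.solve(B, -g)
        a = -(g @ p) / (p @ H @ p)   # exact minimizer along p
        s, y = a * p, a * (H @ p)    # y = H s holds exactly on a quadratic
        x = x + s
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
        xs.append(x.copy())
    return xs

x0 = np.zeros(5)
for xc, xb in zip(cg_iterates(H, b, x0, 4), bfgs_iterates(H, b, x0, 4)):
    print(np.linalg.norm(xc - xb))   # ~1e-15: the iterates coincide
```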

Journal: CoRR 2013
Ryan Kiros

Hessian-free (HF) optimization has been successfully used for training deep autoencoders and recurrent networks. HF uses the conjugate gradient algorithm to construct update directions through curvature-vector products that can be computed on the same order of time as gradients. In this paper we exploit this property and study stochastic HF with gradient and curvature mini-batches independent o...
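The curvature-vector products at the heart of HF never require the Hessian itself. One common gradient-cost realization is a central difference of gradients; the sketch below is generic (not Kiros's stochastic mini-batch scheme), and `grad_f` and `eps` are illustrative assumptions.

```python
import numpy as np

def hvp_fd(grad_f, x, v, eps=1e-6):
    """Approximate the Hessian-vector product H(x) @ v from two extra
    gradient evaluations: (grad(x + eps*v) - grad(x - eps*v)) / (2*eps)."""
    return (grad_f(x + eps * v) - grad_f(x - eps * v)) / (2 * eps)

# Example: f(x) = 0.5 * ||x||^2 has gradient x and Hessian I, so
# hvp_fd(lambda x: x, np.ones(3), np.array([1.0, 2.0, 3.0])) returns v itself.
```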

2005

Quasi-Newton algorithms for unconstrained nonlinear minimization generate a sequence of matrices that can be considered as approximations of the objective function second derivatives. This paper gives conditions under which these approximations can be proved to converge globally to the true Hessian matrix, in the case where the Symmetric Rank One update formula is used. The rate of convergence ...
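For concreteness, this is the Symmetric Rank One formula the convergence result concerns, with the usual skipping safeguard (the constant c is an illustrative choice):

```python
import numpy as np

def sr1_update(B, s, y, c=1e-8):
    """Symmetric Rank One update, skipped when the denominator is tiny
    (the standard safeguard against instability)."""
    r = y - B @ s
    denom = r @ s
    if abs(denom) < c * np.linalg.norm(r) * np.linalg.norm(s):
        return B
    return B + np.outer(r, r) / denom
```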

Journal: Math. Comput. 2017
Hailong Guo, Zhimin Zhang, Ren Zhao

In this article, we propose and analyze an effective Hessian recovery strategy for the Lagrangian finite element method of arbitrary order. We prove that the proposed Hessian recovery method preserves polynomials of degree k+1 on general unstructured meshes and superconverges at a rate of O(h^k) on mildly structured meshes. In addition, the method is proved to be ultraconvergent (two orders hi...
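A much simpler cousin of such recovery strategies, shown only to fix ideas: fit a quadratic to nodal values on a local patch by least squares and read off its (constant) Hessian. The function below is a hypothetical 2-D illustration, not the polynomial-preserving operator analyzed in the paper.

```python
import numpy as np

def recover_hessian_2d(points, values, center):
    """Fit p(x,y) = c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2 to the
    nodal values near `center` (in local coordinates) and return the
    Hessian of the fit. Needs at least 6 well-spread points."""
    d = points - center
    A = np.column_stack([np.ones(len(d)), d[:, 0], d[:, 1],
                         d[:, 0]**2, d[:, 0] * d[:, 1], d[:, 1]**2])
    c, *_ = np.linalg.lstsq(A, values, rcond=None)
    return np.array([[2 * c[3], c[4]],
                     [c[4], 2 * c[5]]])
```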

2015
Zengru Cui, Gonglin Yuan, Zhou Sheng, Wenjie Liu, Xiaoliang Wang, Xiabin Duan, Lixiang Li

This paper proposes a modified BFGS formula in a trust-region model for solving nonsmooth convex minimization problems, using the Moreau-Yosida regularization (smoothing) approach and a new secant equation with a BFGS update formula. Our algorithm uses both function-value and gradient information to approximate the Hessian. The Hessian matrix is updated by the BFGS formula rather than...
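The Moreau-Yosida device that makes a BFGS-type method applicable is easiest to see in one dimension: regularizing f(t) = |t| yields the smooth Huber function, whose gradient is Lipschitz. An illustrative scalar sketch, not the paper's algorithm:

```python
import numpy as np

def moreau_envelope_abs(x, lam):
    """Moreau-Yosida regularization of f(t) = |t|:
    F(x) = min_z ( |z| + (z - x)^2 / (2*lam) ), which is the Huber
    function: quadratic near 0, linear (slope 1) beyond |x| = lam."""
    ax = np.abs(x)
    return np.where(ax <= lam, x**2 / (2 * lam), ax - lam / 2)

def moreau_grad_abs(x, lam):
    """Gradient of the envelope, Lipschitz with constant 1/lam."""
    return np.clip(x / lam, -1.0, 1.0)
```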

Journal: Journal of Physics: Conference Series 2021

In this work, we propose a modification of the PSB update with a new extended quasi-Newton condition for unconstrained optimization problems, called the (α-PSB) method. It is a kind of rank-two update, which satisfies the condition but cannot guarantee the positive-definite property of the Hessian matrix. Positive definiteness of the matrix can be confirmed by updating the vector sk, which represents the difference between the next and the current gradient of the objective function and is assumed to c...
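For reference, this is the classical PSB (Powell-Symmetric-Broyden) rank-two update that the proposed α-PSB method modifies; it satisfies the secant condition B_new @ s = y but, as the abstract notes, need not preserve positive definiteness:

```python
import numpy as np

def psb_update(B, s, y):
    """Powell-Symmetric-Broyden update: the symmetric rank-two
    correction satisfying the secant condition B_new @ s = y."""
    r = y - B @ s
    ss = s @ s
    return (B + (np.outer(r, s) + np.outer(s, r)) / ss
              - (r @ s) * np.outer(s, s) / ss**2)
```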
