Search results for: projected structured hessian update

Number of results: 231982

2012
Robert Seidl Thomas Huckle Michael Bader

Recently, Martens adapted the Hessian-free optimization method for the training of deep neural networks. One key aspect of this approach is that the Hessian is never computed explicitly; instead, the conjugate gradient (CG) algorithm is used to compute the new search direction by applying only matrix-vector products of the Hessian with arbitrary vectors. This can be done efficiently using a varian...
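The key primitive here, a Hessian-vector product computed without ever forming the Hessian, can be sketched with a finite-difference approximation of the gradient (a generic illustration of the idea, not Martens' exact scheme):

```python
import numpy as np

def hessian_vector_product(grad, x, v, eps=1e-6):
    """Approximate H(x) @ v by finite differences of the gradient,
    so the Hessian itself is never formed explicitly:
    H v ~= (grad(x + eps*v) - grad(x)) / eps."""
    return (grad(x + eps * v) - grad(x)) / eps

# Illustration on a quadratic f(x) = 0.5 * x.T @ A @ x, whose Hessian is A.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
grad = lambda x: A @ x
x = np.array([1.0, -2.0])
v = np.array([0.5, 1.0])
hv = hessian_vector_product(grad, x, v)  # close to A @ v
```

Inside a Hessian-free method, `hv` is exactly the quantity CG needs at each inner iteration, so only gradient evaluations are required.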

Journal: :SIAM Journal on Optimization 1995
Henry Wolkowicz Qing Zhao

Least change secant methods, for function minimization, depend on finding a "good" symmetric positive definite update to approximate the Hessian. This update contains new curvature information while simultaneously preserving, as much as possible, the built-up information from the previous update. Updates are generally derived using measures of least change based on some function of the eigenvalue...

Journal: :journal of sciences, islamic republic of iran 2015
n. aghazadeh y. gholizade atani

In this paper, we present an edge detection method based on the wavelet transform and the Hessian matrix of the image at each pixel. Many methods based on the wavelet transform use it to approximate the gradient of the image and detect edges by searching for the modulus maxima of the gradient vectors. In our scheme, we use the wavelet transform to approximate the Hessian matrix of the image at each pixel as well. ...
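The per-pixel Hessian of an image can be illustrated with plain finite differences (a generic sketch using `np.gradient`; the paper's method instead approximates these second derivatives with a wavelet transform):

```python
import numpy as np

def image_hessian(img):
    """Per-pixel 2x2 Hessian entries (Ixx, Ixy, Iyy) of a grayscale
    image via repeated finite differences."""
    Iy, Ix = np.gradient(img.astype(float))   # first derivatives
    Ixy, Ixx = np.gradient(Ix)                # second derivatives of Ix
    Iyx, Iyy = np.gradient(Iy)                # second derivatives of Iy
    return Ixx, Ixy, Iyy

# On an image varying as x**2 along each row, Ixx is 2 away from borders.
X = np.arange(6.0)
img = np.tile(X**2, (6, 1))
Ixx, Ixy, Iyy = image_hessian(img)
```

Edge or ridge responses are then typically derived from the eigenvalues of each pixel's 2x2 matrix [[Ixx, Ixy], [Ixy, Iyy]].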

Journal: :SIAM J. Scientific Computing 2014
Alexander G. Kalmikov Patrick Heimbach

Derivative-based methods are developed for uncertainty quantification (UQ) in large-scale ocean state estimation. The estimation system is based on the adjoint method for solving a least-squares optimization problem, whereby the state-of-the-art MIT general circulation model (MITgcm) is fit to observations. The UQ framework is applied to quantify Drake Passage transport uncertainties in a global...

Journal: :SIAM Journal on Optimization 2008
Radek Kucera

A new active set algorithm for minimizing quadratic functions with separable convex constraints is proposed by combining the conjugate gradient method with the projected gradient. It generalizes recently developed algorithms for quadratic programming with simple bound constraints. A linear convergence rate in terms of the spectral condition number of the Hessian is proven. Numerical experiments, includi...
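The projection ingredient is easy to sketch in isolation: a plain projected-gradient iteration on a box-constrained quadratic (assuming a step size below 2/λmax(A); the paper's algorithm combines this projection with conjugate gradient steps and an active-set strategy, which this sketch omits):

```python
import numpy as np

def projected_gradient_qp(A, b, lo, hi, x0, step, iters=500):
    """Minimize 0.5 x'Ax - b'x subject to lo <= x <= hi by iterating
    x <- P(x - step * (Ax - b)), where P clips onto the box."""
    x = x0.copy()
    for _ in range(iters):
        x = np.clip(x - step * (A @ x - b), lo, hi)
    return x

# Unconstrained minimizer of 0.5 x'Ax - b'x is [2, -2]; the box [0, 1]^2
# pushes the constrained solution to [1, 0].
A = np.array([[2.0, 0.0], [0.0, 2.0]])
b = np.array([4.0, -4.0])
x = projected_gradient_qp(A, b, 0.0, 1.0, np.zeros(2), step=0.4)
```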

Journal: :SIAM J. Matrix Analysis Applications 2015
Sylvain Arguillère

The symmetric rank-one update method is well-known in optimization for its applications in the quasi-Newton algorithm. In particular, Conn, Gould, and Toint proved in 1991 that the matrix sequence resulting from this method approximates the Hessian of the minimized function. Extending their idea, we prove that the symmetric rank-one update algorithm can be used to approximate any sequence of sy...
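The update itself is compact; a minimal sketch of the standard SR1 formula, with the customary skip rule when the denominator is near zero:

```python
import numpy as np

def sr1_update(B, s, y, tol=1e-8):
    """One symmetric rank-one update of a Hessian approximation B,
    given a step s and gradient difference y:
    B+ = B + (y - Bs)(y - Bs)' / ((y - Bs)' s)."""
    r = y - B @ s
    d = r @ s
    if abs(d) < tol * np.linalg.norm(r) * np.linalg.norm(s):
        return B  # skip: denominator too small for a stable update
    return B + np.outer(r, r) / d
```

On a quadratic with Hessian A, feeding n linearly independent steps s with y = A s recovers A exactly after n updates (when no update is skipped), which is the approximation property the abstract refers to.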

Journal: :Baghdad Science Journal 2022

Several attempts have been made to modify the quasi-Newton condition in order to obtain rapid convergence while preserving the full properties (symmetry and positive definiteness) of the inverse Hessian matrix (the second derivative of the objective function). There are many unconstrained optimization methods that do not generate a positive definite matrix. One of those is the symmetric rank-1 (H-version) update (SR1 update), where this sat...

2014
Jascha Sohl-Dickstein Ben Poole Surya Ganguli

We present an algorithm for minimizing a sum of functions that combines the computational efficiency of stochastic gradient descent (SGD) with the second order curvature information leveraged by quasi-Newton methods. We unify these disparate approaches by maintaining an independent Hessian approximation for each contributing function in the sum. We maintain computational tractability and limit ...

2006
Matthias Seeger

We propose a highly efficient framework for kernel multi-class models with a large and structured set of classes, and more generally for penalized likelihood kernel methods. As opposed to many previous approaches, which try to decompose the fitting problem into many smaller ones, we focus on a Newton optimization of the complete model, making use of model structure and linear conjugate...

2008
J. Vlček L. Lukšan

The Broyden class of quasi-Newton updates for inverse Hessian approximation is transformed to the formal BFGS update, which makes it possible to generalize the well-known Nocedal method based on the Strang recurrences to the scaled limited-memory Broyden family, using the same number of stored vectors as the limited-memory BFGS method. Two variants are given; the simpler of them does not requ...
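The underlying machinery can be sketched with Nocedal's two-loop recursion, which applies the limited-memory BFGS inverse-Hessian approximation to a vector using only the stored (s, y) pairs (plain BFGS recurrences; the scaled Broyden-family generalization the paper develops is more involved):

```python
import numpy as np

def two_loop_recursion(g, s_list, y_list):
    """Apply the L-BFGS inverse-Hessian approximation to g using the
    stored step/gradient-difference pairs (oldest first), with the
    usual initial scaling H0 = gamma * I from the newest pair."""
    q = g.astype(float).copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    s, y = s_list[-1], y_list[-1]
    q *= (s @ y) / (y @ y)            # H0 = gamma * I scaling
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return q
```

A quick sanity check is the secant condition: applied to the newest gradient difference y, the recursion must return the newest step s.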
