Search results for: projected structured hessian update

Number of results: 231982  

2010
Angelo Lucia

A new quasi-Newton updating formula for sparse optimization calculations is presented. It makes combined use of a simple strategy for fixing symmetry and a Schubert correction to the upper triangle of a permuted Hessian approximation. Interesting properties of this new update are that it is closed form and that it does not satisfy the secant condition at every iteration of the calculations. Som...
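For context, the Schubert (sparse Broyden) correction mentioned above updates each row of the Hessian approximation only on that row's sparsity pattern. The following is a minimal sketch of that classical correction, not of the paper's own closed-form symmetric update; the function name and the `pattern` mask are illustrative.

```python
import numpy as np

def schubert_update(B, s, y, pattern):
    """Classical Schubert (sparse Broyden) row-wise update of a sparse Hessian approximation.

    B       : current approximation (n x n)
    s, y    : step and gradient-difference vectors
    pattern : pattern[i] is a boolean mask of the columns allowed to be nonzero in row i
    Illustrative sketch of the textbook correction, not the paper's closed-form update.
    """
    r = y - B @ s                              # secant residual
    B_new = B.copy()
    for i in range(B.shape[0]):
        si = np.where(pattern[i], s, 0.0)      # s restricted to the sparsity pattern of row i
        denom = si @ si
        if denom > 0.0:
            B_new[i, :] += (r[i] / denom) * si # row update stays within the pattern
    return B_new
```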

Journal: Iranian Journal of Science and Technology (Sciences) 2006
M. Bektas

In this paper, we obtain two intrinsic integral inequalities of Hessian manifolds.

2006

A typical iteration starts at the current point x where nz (say) variables are free from both their bounds. The projected gradient vector gz, whose elements are the derivatives of F(x) with respect to the free variables, is known. A unit lower triangular matrix L and a diagonal matrix D (both of dimension nz), such that LDLᵀ is a positive-definite approximation of the matrix of second derivativ...
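A minimal sketch of how such a factorization is typically used: given the projected gradient gz on the free variables and the factors L and D, the quasi-Newton step solves (L D Lᵀ) p = -gz by forward substitution, diagonal scaling, and back substitution. Names are illustrative, and plain dense solves stand in for the sparse triangular solves a real implementation would use.

```python
import numpy as np

def free_variable_step(L, D, g_z):
    """Solve (L D L^T) p = -g_z for the step on the free variables.

    L   : unit lower-triangular matrix (nz x nz)
    D   : positive diagonal, given as a 1-D array of length nz
    g_z : projected gradient with respect to the free variables
    Illustrative names; a real implementation would exploit triangularity and sparsity.
    """
    u = np.linalg.solve(L, -g_z)   # forward substitution: L u = -g_z
    v = u / D                      # diagonal scaling:     D v = u
    p = np.linalg.solve(L.T, v)    # back substitution:    L^T p = v
    return p
```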

1997
Moody T. Chu, Nickolay T. Trendafilov

Two data analysis problems, the orthonormal Procrustes problem and the Penrose regression problem, are reconsidered in this paper. These problems are known in the literature for their importance as well as difficulty. This work presents a way to calculate the projected gradient and the projected Hessian explicitly. One immediate result of this calculation is the complete characterization of the first or...
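To make the phrase "projected gradient" concrete for the orthonormal Procrustes problem, min over Q of ||AQ - B||_F with QᵀQ = I, one standard construction projects the Euclidean gradient onto the tangent space of the orthogonal group at Q. This is a generic textbook sketch under that assumption, not necessarily the exact formulation of the cited paper.

```python
import numpy as np

def procrustes_projected_gradient(A, B, Q):
    """Projected (Riemannian) gradient of f(Q) = ||A Q - B||_F^2 on the orthogonal group.

    Tangent vectors at Q have the form Q @ Omega with Omega skew-symmetric,
    so the Euclidean gradient is projected via the skew part of Q^T G.
    A, B, Q are illustrative names for square matrices with Q orthogonal.
    """
    G = 2.0 * A.T @ (A @ Q - B)    # Euclidean gradient of the squared Frobenius objective
    S = Q.T @ G
    Omega = 0.5 * (S - S.T)        # skew-symmetric part
    return Q @ Omega               # projection onto the tangent space at Q
```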

Journal: :CoRR 2017
Maxim Naumov

In this paper we focus on the linear algebra theory behind feedforward (FNN) and recurrent (RNN) neural networks. We review backward propagation, including backward propagation through time (BPTT). Also, we obtain a new exact expression for the Hessian, which represents second order effects. We show that for t time steps the weight gradient can be expressed as a rank-t matrix, while the weight Hess...
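The rank-t observation can be illustrated directly: in BPTT the gradient with respect to a recurrent weight matrix accumulates one outer product per time step, so over t steps it is a sum of t rank-one terms. A minimal sketch under generic notation; the variable names are illustrative, not the paper's.

```python
import numpy as np

def recurrent_weight_gradient(deltas, hiddens):
    """Accumulate a recurrent weight gradient over t time steps.

    deltas[s]  : backpropagated error vector at step s
    hiddens[s] : state multiplied by the recurrent weights at step s
    The result is a sum of t outer products, hence a matrix of rank at most t.
    """
    t = len(deltas)
    n_out, n_in = deltas[0].shape[0], hiddens[0].shape[0]
    grad = np.zeros((n_out, n_in))
    for s in range(t):
        grad += np.outer(deltas[s], hiddens[s])   # one rank-one term per time step
    return grad
```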

2001
Ödön Farkas, Bernhard Schlegel

The geometry optimization using direct inversion in the iterative subspace (GDIIS) has been implemented in a number of computer programs and is found to be quite efficient in the quadratic vicinity of a minimum. However, far from a minimum, the original method may fail in three typical ways: (a) convergence to a nearby critical point of higher order (e.g. transition structure), (b) oscillation ...
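For orientation, the core of any DIIS-type scheme, GDIIS included, is a small constrained least-squares problem: choose coefficients c_i that minimize the norm of the interpolated error vector subject to the coefficients summing to one. Below is a minimal sketch of that generic subproblem solved via the usual bordered linear system; it is not the safeguarded variant developed in the cited paper.

```python
import numpy as np

def diis_coefficients(errors):
    """Minimize ||sum_i c_i e_i||^2 subject to sum_i c_i = 1.

    errors: list of error vectors e_i (for GDIIS, typically quasi-Newton steps).
    Solved with a Lagrange multiplier via the standard bordered system.
    """
    m = len(errors)
    B = np.zeros((m + 1, m + 1))
    for i, e_i in enumerate(errors):
        for j, e_j in enumerate(errors):
            B[i, j] = e_i @ e_j            # Gram matrix of the error vectors
    B[m, :m] = 1.0                         # constraint row
    B[:m, m] = 1.0                         # constraint column
    rhs = np.zeros(m + 1)
    rhs[m] = 1.0
    sol = np.linalg.solve(B, rhs)
    return sol[:m]                         # last entry is the Lagrange multiplier
```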

Journal: Journal of Computational and Applied Mathematics 2010

2008
J. Vlček, L. Lukšan

A new family of limited-memory variable metric or quasi-Newton methods for unconstrained minimization is given. The methods are based on a positive definite inverse Hessian approximation in the form of the sum of identity matrix and two low rank matrices, obtained by the standard scaled Broyden class update. To reduce the rank of matrices, various projections are used. Numerical experience is e...
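The structural idea of an inverse Hessian approximation of the form "identity plus low-rank terms", applied implicitly rather than stored, is familiar from the standard L-BFGS two-loop recursion, sketched below for comparison only; the Vlček-Lukšan family itself uses projected scaled Broyden-class updates, which this sketch does not reproduce.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion.

    Applies an inverse-Hessian approximation of the form (scaled identity + low-rank
    corrections) to the gradient without forming it. grad is a float array; s_list
    and y_list hold the stored step and gradient-difference pairs, oldest first.
    """
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Initial scaling gamma * I from the most recent pair
    gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
    r = gamma * q
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / (y @ s)
        beta = rho * (y @ r)
        r += s * (a - beta)
    return -r                              # quasi-Newton search direction
```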

Chart of the number of search results per year

Click on the chart to filter the results by publication year