Search results for: hessian matrix

Number of results: 366902

Journal: :SIAM Journal on Optimization 1999
Linda Kaufman

In this paper we consider several algorithms for reducing the storage when using a quasi-Newton method in a dogleg–trust region setting for minimizing functions of many variables. Secant methods require O(n2) locations to store an approximate Hessian and O(n2) operations per iteration when minimizing a function of n variables. This storage requirement becomes impractical when n becomes large. O...
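The storage problem described above is what limited-memory quasi-Newton methods address. As an illustration of the idea (not the paper's specific algorithm), the standard L-BFGS two-loop recursion applies an approximate inverse Hessian to the gradient using only a few stored (s, y) vector pairs, i.e. O(mn) memory instead of the O(n²) needed for a dense secant approximation:

```python
import numpy as np

def lbfgs_direction(grad, s_pairs, y_pairs):
    """Two-loop recursion: compute -(approx. inverse Hessian) @ grad using
    only the m stored (s, y) pairs -- O(mn) memory rather than the O(n^2)
    required to store a dense Hessian approximation."""
    q = np.array(grad, dtype=float)                 # work on a copy
    alphas = []
    for s, y in zip(reversed(s_pairs), reversed(y_pairs)):
        rho = 1.0 / y.dot(s)
        a = rho * s.dot(q)
        alphas.append((rho, a))
        q -= a * y
    if s_pairs:                                     # initial scaling H0 = gamma * I
        s, y = s_pairs[-1], y_pairs[-1]
        q *= s.dot(y) / y.dot(y)
    for (s, y), (rho, a) in zip(zip(s_pairs, y_pairs), reversed(alphas)):
        beta = rho * y.dot(q)
        q += (a - beta) * s
    return -q
```

With an empty history the recursion reduces to steepest descent, and with a single stored pair it reproduces the secant condition (applying the approximate inverse Hessian to y returns s).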

2012
Tan Bui-Thanh Omar Ghattas

We derive expressions for the shape Hessian operator of the data misfit functional corresponding to the inverse problem of inferring the shape of a scatterer from reflected acoustic waves, using a Banach space setting and the Lagrangian approach. The shape Hessian is then analyzed in both Hölder and Sobolev spaces. Using an integral equation approach and compact embeddings in Hölder and Sobolev...

2010
Subodh Iyengar

Optimization techniques used in machine learning play an important role in training neural networks for regression and classification tasks. Predominantly, first-order optimization methods such as gradient descent have been used to train neural networks, since second-order methods, such as Newton's method, are computationally infeasible. However, second order methods show muc...
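The first-order/second-order trade-off this abstract mentions is easy to see on a small quadratic: gradient descent needs many iterations, while a single Newton step (which requires solving a linear system with the Hessian) lands exactly on the minimizer. A minimal sketch:

```python
import numpy as np

# Minimize f(x) = 0.5 x^T A x - b^T x, whose Hessian is the constant matrix A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):
    return A @ x - b

x_star = np.linalg.solve(A, b)                  # exact minimizer

# First order: gradient descent needs many iterations to converge.
x_gd = np.zeros(2)
for _ in range(200):
    x_gd -= 0.2 * grad(x_gd)

# Second order: one Newton step (solve H p = -grad) is exact on a quadratic,
# but forming and solving with the Hessian is what becomes infeasible at scale.
x0 = np.zeros(2)
x_newton = x0 - np.linalg.solve(A, grad(x0))
```

The Newton step costs one n-by-n linear solve; for a neural network, n is the number of parameters, which is why the abstract calls the plain method infeasible.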

2001
Xun Zhu

This paper proposes a modification to the simultaneous perturbation stochastic approximation (SPSA) methods based on the comparisons made between the first order and the second order SPSA (1SPSA and 2SPSA) algorithms from the perspective of loss function Hessian. At finite iterations, the convergence rate depends on the matrix conditioning of the loss function Hessian. It is shown that ...
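The core idea of 1SPSA is that a gradient estimate costs only two loss evaluations regardless of dimension, because every coordinate is perturbed simultaneously by a random sign vector. A minimal sketch (illustrative of plain 1SPSA, not the paper's modified algorithm):

```python
import numpy as np

def spsa_gradient(loss, x, c=1e-2, rng=None):
    """One 1SPSA gradient estimate: exactly two loss evaluations,
    independent of the dimension of x."""
    if rng is None:
        rng = np.random.default_rng(0)
    delta = rng.choice([-1.0, 1.0], size=x.shape)   # Rademacher perturbation
    diff = loss(x + c * delta) - loss(x - c * delta)
    return diff / (2.0 * c) / delta                 # elementwise division by delta
```

In one dimension the estimate is exact for a quadratic loss; in higher dimensions it matches the true gradient only in expectation, which is why the Hessian conditioning discussed in the abstract governs the finite-iteration convergence rate.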

Journal: :International Journal of Signal Processing, Image Processing and Pattern Recognition 2013

Journal: :Neural computation 2015
Chien-Chih Wang Chun-Heng Huang Chih-Jen Lin

Newton methods can be applied in many supervised learning approaches. However, for large-scale data, the use of the whole Hessian matrix can be time-consuming. Recently, subsampled Newton methods have been proposed to reduce the computational time by using only a subset of data for calculating an approximation of the Hessian matrix. Unfortunately, we find that in some situations, the running sp...
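A subsampled Newton method replaces the full Hessian with one computed from a random subset of the data. A hypothetical sketch for L2-regularized logistic regression (the function name and subsampling scheme are illustrative assumptions, not the paper's code):

```python
import numpy as np

def hessian(X, w, idx, lam=1.0):
    """Hessian of L2-regularized logistic loss, estimated from rows X[idx].

    Passing all row indices gives the full Hessian; a random subset gives
    the subsampled approximation at a fraction of the cost.
    """
    Xs = X[idx]
    p = 1.0 / (1.0 + np.exp(-(Xs @ w)))   # predicted probabilities
    D = p * (1.0 - p)                     # per-sample curvature weights
    return (Xs * D[:, None]).T @ Xs / len(idx) + lam * np.eye(X.shape[1])
```

A Newton iteration would then solve H d = -g using this subsampled H together with the full-data gradient g; the regularization term keeps H positive definite even for a small subsample.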

2008
Ivan Izmestiev

The Colin de Verdière number μ(G) of a graph G is the maximum corank of a Colin de Verdière matrix for G (that is, of a Schrödinger operator on G with a single negative eigenvalue). In 2001, Lovász gave a construction that associated to every convex 3-polytope a Colin de Verdière matrix of corank 3 for its 1-skeleton. We generalize the Lovász construction to higher dimensions by interpreting it...

2008

An automatic time step size determination for non-linear problems, solved by implicit schemes, is presented. The time step calculation is based on the estimation of the integration error. This estimation is calculated from the acceleration difference. It is normalised according to the size of the problem and the integration parameters. This time step control algorithm modifies the time step siz...
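The idea of driving the step size from a local integration-error estimate can be sketched with step doubling on explicit Euler. This simplified stand-in only illustrates the accept/shrink/grow logic; the paper's scheme estimates the error from the acceleration difference inside an implicit solver and normalises it by problem size and integration parameters:

```python
import math

def integrate_adaptive(f, y, t_end, dt=0.1, tol=1e-6):
    """Explicit Euler with step-doubling error control (illustrative only).

    The local error is estimated by comparing one full step against two half
    steps; a step is rejected and halved when the estimate exceeds tol, and
    dt is grown when the estimate is comfortably below tol.
    """
    t = 0.0
    while t < t_end:
        dt = min(dt, t_end - t)
        y_full = y + dt * f(y)                 # one step of size dt
        y_half = y + 0.5 * dt * f(y)
        y_two = y_half + 0.5 * dt * f(y_half)  # two steps of size dt/2
        err = abs(y_two - y_full)              # local error estimate
        if err < tol or dt < 1e-12:
            t += dt                            # accept the step
            y = y_two
            if err < 0.5 * tol:
                dt *= 1.5                      # error small: grow the step
        else:
            dt *= 0.5                          # error large: retry with smaller dt
    return y
```

Integrating y' = -y from y(0) = 1 to t = 1 with this controller gives a result close to exp(-1), with the step size settling wherever the error estimate sits just under the tolerance.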

[Chart: number of search results per year]