Search results for: steepest descent method

Number of results: 1,645,898

2006
J. Karátson, J. W. Neuberger

This paper gives a common theoretical treatment of gradient- and Newton-type methods for general classes of problems. First, for Euler-Lagrange equations, Newton's method is characterized as an (asymptotically) optimal variable steepest descent method. Second, Sobolev gradient type minimization is developed for general problems using a continuous Newton method which takes into account a ‘boundar...
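A minimal finite-dimensional sketch of the "Newton as variable steepest descent" idea, assuming a toy quadratic objective chosen purely for illustration (the paper itself works with Euler-Lagrange equations in Sobolev spaces):

```python
import numpy as np

# Toy quadratic f(x) = 0.5 x^T A x - b^T x (illustrative only);
# its gradient is A x - b and its Hessian is A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])

def grad(x):
    return A @ x - b

def hess(x):
    return A

x = np.zeros(2)
for _ in range(20):
    g = grad(x)
    # Plain steepest descent would move along -g (the Euclidean gradient).
    # Newton's method can be read as steepest descent in the variable
    # inner product <u, v>_x = u^T H(x) v: the direction becomes
    # -H(x)^{-1} g, which is the asymptotically optimal choice near x*.
    x = x - np.linalg.solve(hess(x), g)

print("approximate minimizer:", x)                 # solves A x = b
print("gradient norm:", np.linalg.norm(grad(x)))   # close to zero
```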

2012
Ram Murty

We obtain a new proof of an asymptotic formula for the coefficients of the j-invariant of elliptic curves. Our proof does not use the circle method. We use Laplace's method of steepest descent and the Hardy–Ramanujan asymptotic formula for the partition function. (The latter asymptotic formula can be derived without the circle method.)
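For orientation, the two asymptotic formulas the abstract refers to, in their standard classical form (quoted from the literature, not extracted from the paper itself):

```latex
% Hardy--Ramanujan asymptotic for the partition function
\[
  p(n) \sim \frac{1}{4n\sqrt{3}}\, e^{\pi\sqrt{2n/3}}, \qquad n \to \infty.
\]
% Coefficients of the j-invariant,
% j(\tau) = q^{-1} + 744 + \sum_{n \ge 1} c(n) q^{n}, with q = e^{2\pi i \tau}:
\[
  c(n) \sim \frac{e^{4\pi\sqrt{n}}}{\sqrt{2}\, n^{3/4}}, \qquad n \to \infty.
\]
```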

Journal: CoRR 2015
Nikica Hlupic Ivo Beros

An algorithm and an associated strategy for solving polynomial systems within the optimization framework are presented. The algorithm and strategy are named, respectively, the penetrating gradient algorithm and the deepest descent strategy. The most prominent feature of the penetrating gradient algorithm, after which it was named, is its ability to “see and penetrate through” the obstacles in error spa...

1996
B. Lemaire

The asymptotic limit of the trajectory defined by the continuous steepest descent method for a proper closed convex function f on a Hilbert space is characterized in the set of minimizers of f via an asymptotic variational principle of Brezis-Ekeland type. The implicit discrete analogue (prox method) is also considered.
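A minimal sketch of the two iterations named in the abstract, assuming the toy convex function f(x) = 0.5 x² on the real line (the paper treats a general proper closed convex f on a Hilbert space):

```python
# Continuous steepest descent x'(t) = -f'(x(t)), approximated by explicit Euler,
# versus its implicit discrete analogue, the prox (proximal point) method.

def grad_f(x):
    return x                      # f(x) = 0.5 * x**2, so f'(x) = x

def prox_f(x, lam):
    # prox_{lam f}(x) = argmin_y [ f(y) + (y - x)**2 / (2 * lam) ];
    # for f(y) = 0.5 * y**2 this has the closed form x / (1 + lam).
    return x / (1.0 + lam)

x, dt = 5.0, 0.1                  # explicit Euler step along the gradient flow
for _ in range(100):
    x = x - dt * grad_f(x)

y, lam = 5.0, 0.1                 # prox method = implicit Euler step
for _ in range(100):
    y = prox_f(y, lam)

print(x, y)                       # both trajectories approach the minimizer 0
```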

2000
Ilya Molchanov Sergei Zuyev

The paper applies abstract optimisation principles in the space of measures within the context of optimal design problems. It is shown that within this framework it is possible to treat various design criteria and constraints in a unified manner, providing a “universal” variant of the Kiefer-Wolfowitz theorem and giving a full spectrum of optimality criteria for particular cases. The described s...

2004
Zhigang Zeng De-Shuang Huang Zengfu Wang

This paper analyzes the effect of momentum on steepest descent training for quadratic performance functions. Some global convergence conditions of the steepest descent algorithm are obtained by directly analyzing the exact momentum equations for quadratic cost functions. These conditions can be derived directly from the parameters (unlike the eigenvalues used in existing conditions) ...
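A small sketch of the iteration being analyzed, assuming a toy 2x2 quadratic cost and hand-picked learning rate and momentum coefficient (the paper's convergence conditions are not reproduced here):

```python
import numpy as np

# Quadratic performance function f(w) = 0.5 w^T Q w - b^T w (illustrative).
Q = np.array([[4.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])

alpha, mu = 0.1, 0.5              # learning rate and momentum coefficient
w = np.zeros(2)
dw = np.zeros(2)

for _ in range(200):
    g = Q @ w - b                 # exact gradient of the quadratic cost
    dw = -alpha * g + mu * dw     # momentum: reuse a fraction of the last step
    w = w + dw

print("approximate minimizer:", w)   # approaches Q^{-1} b = [0.25, 1.0]
```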

Chart: number of search results per year
