Search results for: steepest descent

Number of results: 23254

Journal: Bulletin of the Iranian Mathematical Society 2013
S. Saeidi, H. Haydari

Let $X$ be a reflexive Banach space, $T:X\to X$ be a nonexpansive mapping with $C=\mathrm{Fix}(T)\neq\emptyset$, and $F:X\to X$ be $\delta$-strongly accretive and $\lambda$-strictly pseudocontractive with $\delta+\lambda>1$. In this paper, we present modified hybrid steepest-descent methods, involving sequential errors and functional errors with functions admitting a center, which generate convergent sequences t...
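
The modified methods with sequential and functional errors are not spelled out in this excerpt. As context, a minimal sketch of the underlying hybrid steepest-descent iteration (a Yamada-type scheme, in R^n for concreteness) might look like the following; `T`, `F`, `mu`, and the toy operators are illustrative assumptions, not the paper's construction:

```python
import numpy as np

# A minimal sketch of a hybrid steepest-descent iteration in R^n
# (Yamada-type scheme; the paper's modified methods with sequential
# and functional errors are not reproduced here).
# T: nonexpansive map, F: strongly accretive operator -- both assumed.

def hybrid_steepest_descent(T, F, x0, mu=0.5, n_iter=1000):
    """Iterate x_{n+1} = T(x_n) - lam_n * mu * F(T(x_n)) with lam_n -> 0."""
    x = np.asarray(x0, dtype=float)
    for n in range(1, n_iter + 1):
        lam = 1.0 / n          # diminishing steps: sum of lam_n diverges
        tx = T(x)
        x = tx - lam * mu * F(tx)
    return x

# Toy example: T = projection onto the unit ball (nonexpansive),
# F(x) = x - b (strongly monotone); the iterates approach the fixed
# point of T that solves the variational inequality governed by F.
b = np.array([2.0, 0.0])
T = lambda x: x / max(1.0, np.linalg.norm(x))
F = lambda x: x - b
print(hybrid_steepest_descent(T, F, np.zeros(2)))  # approx (1, 0)
```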

1998
David Helmbold

AdaBoost is a popular and effective leveraging procedure for improving the hypotheses generated by weak learning algorithms. AdaBoost and many other leveraging algorithms can be viewed as performing a constrained gradient descent over a potential function. At each iteration the distribution over the sample given to the weak learner is the direction of steepest descent. We introduce a new leverag...
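
A minimal sketch of this view, assuming the standard exponential potential Phi(f) = sum_i exp(-y_i f(x_i)): the distribution passed to the weak learner is the normalized negative gradient of Phi with respect to the margins, i.e. the direction of steepest descent over the sample. The helper names below are illustrative:

```python
import numpy as np

# Sketch of AdaBoost's reweighting step, illustrating the
# "constrained gradient descent" view: with potential
# Phi(f) = sum_i exp(-y_i f(x_i)), the distribution handed to the
# weak learner is proportional to -dPhi/d(margin_i).

def adaboost(weak_learners, X, y, rounds=10):
    n = len(y)
    f = np.zeros(n)                      # running combined margin f(x_i)
    ensemble = []
    for _ in range(rounds):
        d = np.exp(-y * f)
        d /= d.sum()                     # distribution = normalized negative gradient
        # pick the weak hypothesis with smallest weighted error
        errs = [(d * (h(X) != y)).sum() for h in weak_learners]
        h = weak_learners[int(np.argmin(errs))]
        eps = min(errs)
        if eps >= 0.5:                   # no edge left over random guessing
            break
        alpha = 0.5 * np.log((1 - eps) / max(eps, 1e-12))
        f += alpha * h(X)                # each h maps X to +/-1 predictions
        ensemble.append((alpha, h))
    return ensemble
```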

Journal: Journal of Physics: Conference Series 2019

Journal: Bulletin of the London Mathematical Society 2006

Journal: Journal of Computational Chemistry 2011
Daniel Sheppard, Graeme Henkelman

A recent letter to the editor (Quapp and Bofill, J Comput Chem 2010, 31, 2526) claims that the nudged elastic band (NEB) method can converge toward gradient extremal paths and not to steepest descent paths, as has been assumed. Here, we show that the NEB does in fact converge to steepest descent paths and that the observed tendency for the NEB to approach gradient extremal paths was a consequen...
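
For context on what "converging to steepest descent paths" means in practice, here is a minimal sketch of the plain NEB force projection (simple central-difference tangent, no climbing image). The gradient function `grad`, the spring constant `k`, and the relaxation loop are illustrative assumptions rather than the implementation examined in the letter:

```python
import numpy as np

# Sketch of the nudged elastic band force projection that lets
# interior images relax onto the steepest-descent path between two
# fixed endpoints. grad stands in for the gradient of an actual
# potential energy surface.

def neb_forces(images, grad, k=1.0):
    """Return NEB forces for the interior images of a band."""
    forces = np.zeros_like(images)
    for i in range(1, len(images) - 1):
        tau = images[i + 1] - images[i - 1]
        tau /= np.linalg.norm(tau)                   # band tangent
        g = grad(images[i])
        g_perp = g - np.dot(g, tau) * tau            # true force, perpendicular part
        d_next = np.linalg.norm(images[i + 1] - images[i])
        d_prev = np.linalg.norm(images[i] - images[i - 1])
        f_spring = k * (d_next - d_prev) * tau       # spring force, parallel part
        forces[i] = -g_perp + f_spring
    return forces

def relax(images, grad, step=0.01, n_iter=500):
    """Relax the band with steepest-descent steps on the projected forces."""
    for _ in range(n_iter):
        images = images + step * neb_forces(images, grad)
    return images
```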

2007
Satoko Moriguchi, Nobuyuki Tsuchimura

We consider the problem of minimizing a nonlinear discrete function with L-/M-convexity proposed in the theory of discrete convex analysis. For this problem, steepest descent algorithms and steepest descent scaling algorithms are known. In this paper, we use a continuous relaxation approach, which minimizes the continuous-variable version first in order to find a good initial solution for a steepes...
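
As a rough illustration of the kind of steepest descent algorithm this line of work builds on, here is a minimal sketch for an M-convex function, where exhausting the exchange neighborhood x - e_i + e_j and stopping at a local optimum is known to yield a global minimizer. The objective `f`, the starting point, and the omission of the paper's scaling and relaxation machinery are all simplifications:

```python
import numpy as np
from itertools import product

# Sketch of steepest descent for an M-convex function f: repeatedly
# move to the best neighbor of the form x - e_i + e_j; by M-convexity,
# a point with no improving exchange is a global minimizer.

def m_convex_steepest_descent(f, x):
    x = np.array(x, dtype=int)
    n = len(x)
    while True:
        best_val, best_move = f(x), None
        for i, j in product(range(n), repeat=2):
            if i == j:
                continue
            y = x.copy()
            y[i] -= 1
            y[j] += 1
            if f(y) < best_val:
                best_val, best_move = f(y), y
        if best_move is None:       # local optimum = global optimum
            return x
        x = best_move

# Usage: a separable convex function on a constant-sum lattice,
# a standard example of an M-convex function.
f = lambda x: ((x - np.array([3, -1, 2])) ** 2).sum()
print(m_convex_steepest_descent(f, [4, 0, 0]))  # reaches [3, -1, 2]
```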

Journal: SIAM Journal on Optimization 2010
Coralia Cartis, Nicholas I. M. Gould, Philippe L. Toint

It is shown that the steepest descent and Newton's methods for unconstrained nonconvex optimization under standard assumptions may both require a number of iterations and function evaluations arbitrarily close to $O(\epsilon^{-2})$ to drive the norm of the gradient below $\epsilon$. This shows that the upper bound of $O(\epsilon^{-2})$ evaluations known for steepest descent is tight, and that Newton's method may be as slow a...
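
To make the complexity measure concrete, a minimal sketch that counts gradient evaluations until the gradient norm drops below epsilon might look like the following. The quadratic test function and the step size 1/L are illustrative assumptions, not the worst-case examples constructed in the paper:

```python
import numpy as np

# Sketch of the complexity measure discussed above: run steepest
# descent with a fixed step and count gradient evaluations needed to
# drive ||grad f|| below epsilon. For L-smooth f and step 1/L, the
# classical bound is O(epsilon^{-2}) evaluations; the paper shows
# this bound is tight.

def steepest_descent_count(grad, x0, step, eps):
    x = np.asarray(x0, dtype=float)
    evals = 0
    while True:
        g = grad(x)
        evals += 1
        if np.linalg.norm(g) <= eps:
            return x, evals
        x = x - step * g

# Example: f(x) = 0.5 * x^T A x, an L-smooth quadratic with L = 100.
A = np.diag([1.0, 100.0])
grad = lambda x: A @ x
x, evals = steepest_descent_count(grad, np.array([1.0, 1.0]),
                                  step=1.0 / 100.0, eps=1e-6)
print(evals)
```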

2011
Qing-Hua Zhao, Liang Li, Hua Li, Kunpeng Zhang, Huakui Wang

This article studies the classical MDS and dwMDS localization algorithms. On this basis, the steepest descent algorithm is introduced to replace the SMACOF algorithm for optimizing the objective function. The results show that the steepest descent method, as the optimizer of the objective function, is simple and easy to implement. Compared with the dwMDS method based on the SMACOF algorithm, the distributed MDS positioni...
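
As a rough illustration of replacing SMACOF's majorization step with plain steepest descent, here is a minimal sketch that minimizes a raw-stress MDS objective by gradient steps. The stress form, step size, and random test data are assumptions; this is not the paper's distributed dwMDS variant:

```python
import numpy as np

# Sketch of steepest descent on the raw MDS stress
#   sigma(X) = sum_{i<j} (||x_i - x_j|| - delta_ij)^2,
# where delta is the matrix of measured pairwise distances.

def mds_steepest_descent(delta, X, step=0.01, n_iter=2000):
    n = X.shape[0]
    for _ in range(n_iter):
        G = np.zeros_like(X)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                diff = X[i] - X[j]
                d = np.linalg.norm(diff) + 1e-12     # avoid division by zero
                G[i] += 2.0 * (d - delta[i, j]) * diff / d
        X = X - step * G                             # steepest-descent step
    return X

# Usage: recover 2-D positions (up to rotation/translation) from the
# pairwise distances of random points.
rng = np.random.default_rng(0)
P = rng.uniform(size=(6, 2))
delta = np.linalg.norm(P[:, None] - P[None, :], axis=-1)
X_hat = mds_steepest_descent(delta, rng.uniform(size=(6, 2)))
```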

[Chart: number of search results per year]