Search results for: hybrid steepest descent method

Number of results: 1803458

1994
A. Tenhagen

In classic backpropagation nets, as introduced by Rumelhart et al. [1], the weights are modified according to the method of steepest descent. The goal of this weight modification is to minimise the error in net outputs for a given training set. Building on Jacobs' work [2], we point out drawbacks of steepest descent and suggest improvements to it. These yield a backpropagation net, which adjusts ...
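The steepest-descent weight update the abstract refers to can be sketched as follows. This is a minimal single-layer illustration with toy data and a fixed learning rate, not the paper's actual network or its proposed improvements:

```python
import numpy as np

# Minimal sketch of steepest-descent training (toy data, single linear
# layer, squared error) -- illustrative only, not the paper's method.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))        # toy training inputs
t = rng.normal(size=(4, 1))        # toy targets
W = np.zeros((3, 1))               # weights, initialised to zero
eta = 0.1                          # learning rate (assumed value)

for _ in range(200):
    y = X @ W                      # net outputs (linear activation)
    grad = X.T @ (y - t) / len(X)  # gradient of mean squared error
    W -= eta * grad                # step in the steepest-descent direction

final_err = float(np.mean((X @ W - t) ** 2))
```

One drawback the abstract alludes to is visible here: a single fixed `eta` must be tuned by hand, which is exactly what adaptive schemes such as Jacobs' aim to avoid.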

Journal: :Applied Mathematics and Computation 2007
M. Anjidani Sohrab Effati

In this paper we use the steepest descent method for solving the zero-one nonlinear programming problem. Using a penalty function, we transform this problem into an unconstrained optimization problem, and then by the steepest descent method we obtain the optimal solution of the original problem.
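The penalty idea can be illustrated on a toy problem. The objective, penalty weight, and step size below are illustrative choices, not the authors': the constraint x_i ∈ {0, 1} is replaced by a penalty term x_i(1 − x_i) that vanishes exactly at 0 and 1, and plain steepest descent is run on the penalized objective:

```python
# Hedged sketch of the penalty-function transformation (toy objective,
# not the paper's): enforce x_i in {0, 1} by penalizing x_i * (1 - x_i).
def obj(x):
    # smooth objective whose best 0-1 point is (0, 1)
    return (x[0] - 0.2) ** 2 + (x[1] - 0.9) ** 2

def penalized_grad(x, mu):
    g = [2 * (x[0] - 0.2), 2 * (x[1] - 0.9)]   # gradient of obj
    for i in range(2):
        gi = x[i] * (1 - x[i])                  # zero exactly at 0 and 1
        g[i] += mu * 2 * gi * (1 - 2 * x[i])    # gradient of mu * gi**2
    return g

x, mu, eta = [0.3, 0.7], 10.0, 0.01             # assumed starting values
for _ in range(2000):
    g = penalized_grad(x, mu)
    x = [xi - eta * gi for xi, gi in zip(x, g)]  # steepest-descent step
rounded = [round(v) for v in x]
```

Rounding the unconstrained minimizer recovers the zero-one solution (0, 1) for this toy instance; in general the penalty weight `mu` must be large enough to pull the iterates toward the binary points.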

2008
George E. Forsythe Gene H. Golub

We consider the special case of the restarted Arnoldi method for approximating the product of a function of a Hermitian matrix with a vector which results when the restart length is set to one. When applied to the solution of a linear system of equations, this approach coincides with the method of steepest descent. We show that the method is equivalent to an interpolation process in which the...
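For a symmetric positive definite system Ax = b, the method of steepest descent mentioned here steps along the residual with an exact line search. A minimal sketch on an assumed 2×2 system (not data from the paper):

```python
import numpy as np

# Steepest descent for Ax = b with A symmetric positive definite.
# The residual r = b - Ax is the negative gradient of 0.5 x^T A x - b^T x.
A = np.array([[4.0, 1.0], [1.0, 3.0]])   # toy SPD matrix (assumed)
b = np.array([1.0, 2.0])
x = np.zeros(2)

for _ in range(100):
    r = b - A @ x
    denom = r @ (A @ r)
    if denom == 0.0:                      # converged (or underflow): stop
        break
    alpha = (r @ r) / denom               # exact line search along r
    x = x + alpha * r

residual = float(np.linalg.norm(b - A @ x))
```

The convergence rate depends on the condition number of A, which is why restarted-Krylov and interpolation viewpoints such as the one in this paper are of interest.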

Journal: :IEEE Trans. Evolutionary Computation 1998
Ralf Salomon

Classical gradient methods and evolutionary algorithms represent two very different classes of optimization techniques that seem to have very different properties. This paper discusses some of the “obvious” differences and explores to what extent a hybrid method, the evolutionary-gradient-search procedure, can be used beneficially in the field of continuous parameter optimization. Simu...

1998
Werner Erhard Torsten Fink Michael M. Gutzmann Christoph Rahn Axel Doering Miroslaw Galicki

A recently published idea is to use the A*-Algorithm to optimize the topology of Neural Networks. In this paper, optimization techniques are investigated that combine the A*-Algorithm with different parallel training algorithms, namely the backpropagation algorithm and several hybrid algorithms. The hybrid algorithms combine the backpropagation's steepest descent method with different sets of gen...

1998
Ralf Salomon

The application of a reasonable selection scheme plays an essential role in virtually all evolutionary algorithms. By not considering less-fit individuals, however, the algorithm discards valuable information about the fitness function. This paper explores to what extent a hybrid method, the evolutionary-gradient-search procedure, that applies a global operator to all offspring can be benefici...
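A rough sketch of the evolutionary-gradient-search idea on a sphere function: every offspring, fit or not, contributes to a gradient estimate weighted by its fitness change, and the step length self-adapts by testing a longer and a shorter candidate step. Sample counts, factors, and the acceptance rule below are illustrative assumptions, not Salomon's exact settings:

```python
import random, math

def f(x):
    # sphere function: a standard toy test objective (assumed here)
    return sum(v * v for v in x)

random.seed(1)
x, sigma = [3.0, -2.0], 0.5
fx = f(x)
for _ in range(200):
    # combine ALL offspring into a gradient estimate, weighted by their
    # fitness change, instead of selecting only the best individuals
    g = [0.0] * len(x)
    for _ in range(10):
        z = [random.gauss(0.0, sigma) for _ in x]
        df = f([xi + zi for xi, zi in zip(x, z)]) - fx
        g = [gi + df * zi for gi, zi in zip(g, z)]
    norm = math.sqrt(sum(gi * gi for gi in g)) or 1.0
    d = [-gi / norm for gi in g]          # estimated descent direction
    # self-adaptation: try a longer and a shorter step along d and
    # move only if the better candidate improves on the current point
    best = (fx, sigma, x)
    for s in (sigma * 2.0, sigma / 2.0):
        xc = [xi + s * di for xi, di in zip(x, d)]
        fc = f(xc)
        if fc < best[0]:
            best = (fc, s, xc)
    if best[2] is x:
        sigma /= 2.0                      # no improvement: shrink the scale
    else:
        fx, sigma, x = best
final = f(x)
```

The contrast with a plain evolution strategy is the first inner loop: no offspring is discarded, so information from below-average individuals still shapes the search direction.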

Journal: :Math. Program. 2012
Roger Fletcher

The possibilities inherent in steepest descent methods have been considerably amplified by the introduction of the Barzilai-Borwein choice of step-size, and other related ideas. These methods have proved to be competitive with conjugate gradient methods for large-dimensional unconstrained minimization problems. This paper suggests a method which is able to take advantage of th...
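The Barzilai-Borwein choice replaces the exact line search of classical steepest descent with a step length built from the last two iterates and gradients. A minimal sketch of the BB1 variant on an assumed convex quadratic (the matrix, vector, and first step are illustrative choices):

```python
import numpy as np

# Barzilai-Borwein gradient method on f(x) = 0.5 x^T A x - b^T x.
# BB1 step: alpha = s^T s / s^T y with s = x_k - x_{k-1}, y = g_k - g_{k-1}.
A = np.array([[3.0, 0.0], [0.0, 1.0]])   # toy SPD matrix (assumed)
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b

x_prev = np.zeros(2)
g_prev = grad(x_prev)
x = x_prev - 0.1 * g_prev                 # plain first step to build history
for _ in range(30):
    g = grad(x)
    s, y = x - x_prev, g - g_prev
    denom = s @ y                         # equals s^T A s > 0 for SPD A
    if denom <= 1e-16:                    # converged: stop before dividing
        break
    alpha = (s @ s) / denom               # BB1 step length
    x_prev, g_prev = x, g
    x = x - alpha * g

sol = np.linalg.solve(A, b)
err = float(np.linalg.norm(x - sol))
```

Note that the BB step uses no line search at all, and the resulting iteration is typically nonmonotone in f yet converges much faster than classical steepest descent on ill-conditioned quadratics.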

2007
M. AFANASJEW M. EIERMANN O. G. ERNST S. GÜTTEL

We consider the special case of the restarted Arnoldi method for approximating the product of a function of a Hermitian matrix with a vector which results when the restart length is set to one. When applied to the solution of a linear system of equations, this approach coincides with the method of steepest descent. We show that the method is equivalent to an interpolation process in which the...

[Chart: number of search results per year]