Search results for: descent method

Number of results: 1,645,212

Journal: Proceedings of the Japan Academy, Series A, Mathematical Sciences 2003

Journal: Optimization Letters 2023

Abstract: We describe a special class of quasi-equilibrium problems in metric spaces and propose a novel, simple threshold descent method for solving these problems. Due to the framework, convergence cannot be established under the usual convexity or generalized convexity assumptions. Under mild conditions, the iterative procedure yields solutions of the problem. We apply this approach to scalar and vector problems and to some classes of relative optimization...

Journal: Applied Mathematics and Computation 2007
Abdellah Bnouhachem Muhammad Aslam Noor Mohamed Khalfaoui

In this paper, we propose a modified descent-projection method for solving variational inequalities. The method makes use of a descent direction to produce the new iterate and can be viewed as an improvement of the descent-projection method by using a new step size. Under certain conditions, the global convergence of the proposed method is proved. In order to demonstrate the efficiency of the p...
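For orientation, the basic (unmodified) projection iteration for a variational inequality can be sketched as below. This is not the authors' modified descent-projection method; the operator `F`, the feasible-set projection `project`, and the fixed step size `lam` are illustrative assumptions.

```python
import numpy as np

def projection_method(F, project, x0, lam=0.1, iters=500):
    """Basic projected iteration x_{k+1} = P_C(x_k - lam * F(x_k))
    for the variational inequality: find x* in C such that
    <F(x*), x - x*> >= 0 for all x in C."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = project(x - lam * F(x))
    return x

# Example (illustrative): affine F(x) = x + q over the nonnegative
# orthant C = R_+^2, whose projection is a componentwise max with 0.
q = np.array([-1.0, 1.0])
sol = projection_method(lambda x: x + q,
                        lambda x: np.maximum(x, 0.0),
                        np.zeros(2))
```

For this strongly monotone example the map contracts with factor 0.9 per step, and the iterates converge to the solution x* = (1, 0), which satisfies the complementarity conditions x* >= 0, F(x*) >= 0, x* · F(x*) = 0.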

1995
Kim W. C. Ku M. W. Mak W. C. Siu

Recurrent neural networks (RNNs), with the capability of dealing with spatio-temporal relationships, are more complex than feed-forward neural networks. Training of RNNs by gradient descent methods becomes more difficult. Therefore, another training method, which uses cellular genetic algorithms, is proposed. In this paper, the performance of training by a gradient descent method is compared with...

2002
Tatsuya KOIKE Yoshitsugu TAKEI

The exact steepest descent method was born in [AKT4] by combining the ordinary steepest descent method with the exact WKB analysis. (See, e.g., [AKT2] for the notion and notations of the exact WKB analysis used in this report.) It is a straightforward generalization of the ordinary steepest descent method and provides us with a new powerful tool for the description of Stokes curves as well as f...

Journal: CoRR 2008
Hui Huang Uri M. Ascher

Much recent attention has been devoted to gradient descent algorithms where the steepest descent step size is replaced by a similar one from a previous iteration or gets updated only once every second step, thus forming a faster gradient descent method. For unconstrained convex quadratic optimization these methods can converge much faster than steepest descent. But the context of interest here ...
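The lagged-step idea described here can be sketched for a symmetric positive definite quadratic as follows. This is a minimal illustration of the mechanism only; the test matrix, iteration count, and stopping tolerance are assumptions, not details from the paper.

```python
import numpy as np

def lagged_steepest_descent(A, b, x0, iters=500):
    """Gradient descent on f(x) = 0.5 x^T A x - b^T x where the exact
    steepest-descent step size is computed at each iterate but applied
    one iteration late (a 'lagged' step size)."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b
    alpha = (g @ g) / (g @ (A @ g))   # first step: plain steepest descent
    for _ in range(iters):
        gg = g @ g
        if gg < 1e-24:                # gradient ~ 0: converged
            break
        sd_step = gg / (g @ (A @ g))  # SD step for the current gradient...
        x = x - alpha * g             # ...but move with the lagged step
        g = A @ x - b
        alpha = sd_step               # reuse it on the next iteration
    return x

A = np.diag([1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 3.0])
x = lagged_steepest_descent(A, b, np.zeros(3))
```

The iterates are not monotone in f, but for a well-conditioned SPD quadratic like this the method reaches the minimizer (here x* = (1, 1, 1)) quickly.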

2012
Sun Min

In this paper, a new conjugate gradient method with the sufficient descent property is proposed for the unconstrained optimization problem. An attractive property of the new method is that the descent direction it generates always possesses the sufficient descent property, and this property is independent of the line search used and of the choice of ki. Under mild conditions, the global c...

Journal: Math. Program. 2012
Roger Fletcher

The possibilities inherent in steepest descent methods have been considerably amplified by the introduction of the Barzilai-Borwein choice of step size, and other related ideas. These methods have proved competitive with conjugate gradient methods for large-dimensional unconstrained minimization problems. This paper suggests a method which is able to take advantage of th...
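The Barzilai-Borwein step-size choice mentioned here can be sketched on a convex quadratic as below. The test problem, initial step, and iteration budget are illustrative assumptions, not part of the paper.

```python
import numpy as np

def barzilai_borwein(A, b, x0, iters=200, alpha0=1e-4):
    """Gradient descent for f(x) = 0.5 x^T A x - b^T x with the
    Barzilai-Borwein (BB1) step size alpha_k = (s^T s) / (s^T y),
    where s = x_k - x_{k-1} and y = g_k - g_{k-1}."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b
    alpha = alpha0                     # small initial step before BB kicks in
    for _ in range(iters):
        if g @ g < 1e-24:              # gradient ~ 0: converged
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)      # BB1 step size (s^T y > 0 for SPD A)
        x, g = x_new, g_new
    return x

A = np.diag([1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 3.0])
x = barzilai_borwein(A, b, np.zeros(3))
```

As with the lagged variant, convergence is typically nonmonotone, but on SPD quadratics the BB iterates converge and in practice far outpace classical steepest descent.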

2007
M. AFANASJEW M. EIERMANN O. G. ERNST S. GÜTTEL

We consider the special case of the restarted Arnoldi method for approximating the product of a function of a Hermitian matrix with a vector which results when the restart length is set to one. When applied to the solution of a linear system of equations, this approach coincides with the method of steepest descent. We show that the method is equivalent to an interpolation process in which the...

Chart: number of search results per year
