Search results for: conjugate gradient descent
Number of results: 174,860. Filter results by year:
In this paper, the artificial neural network (ANN) approach is applied to forecast groundwater level fluctuations in the Aghili plain, southwest Iran. An optimal design is completed for the two hidden layers with four different algorithms: gradient descent with momentum (GDM), Levenberg-Marquardt (LM), resilient back propagation (RP), and scaled conjugate gradient (SCG). Rain, evaporation, relative...
Conjugate gradient (CG) methods are excellent neural network training algorithms owing to their simplicity, flexibility, numerical efficiency, and low memory requirements. In this paper, we introduce a new three-term conjugate gradient method for solving optimization problems; it has been tested on artificial neural network (ANN) feed-forward training. The method satisfies the descent condition and the sufficient descent condition. Glo...
Conjugate gradient methods are a class of important methods for unconstrained optimization, especially when the dimension is large. This paper proposes a new conjugacy condition, which considers an inexact line search scheme but reduces to the old one if the line search is exact. Based on the new conjugacy condition, two nonlinear conjugate gradient methods are constructed. Convergence analysis...
Based on an eigenvalue analysis, a new proof for the sufficient descent property of the modified Polak-Ribière-Polyak conjugate gradient method proposed by Yu et al. is presented.
The conjugate gradient (CG) method is one of the most popular methods for solving smooth unconstrained optimization problems due to its simplicity and low memory requirement. However, the use of CG methods has so far been mainly restricted to smooth optimization problems. The purpose of this paper is to present efficient conjugate gradient-type methods to solve nonsmooth optimization probl...
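As background for the snippets above, the classical linear CG iteration for a symmetric positive-definite system can be sketched as follows (a minimal illustration, not taken from any of the listed papers; the function name, tolerance, and test matrix are assumptions):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A via linear CG."""
    n = b.size
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x                # initial residual
    p = r.copy()                 # first search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)        # exact step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p    # conjugate direction update
        rs_old = rs_new
    return x

# Tiny 2x2 SPD example
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG converges in at most n steps for an n x n system; the nonlinear variants discussed in the abstracts replace the exact step length with a line search and the residual with the negative gradient.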
Many large-scale problems in dynamic and stochastic optimization can be modeled with extended linear-quadratic programming, which admits penalty terms and treats them through duality. In general the objective functions in such problems are only piecewise smooth and must be minimized or maximized relative to polyhedral sets of high dimensionality. This paper proposes a new class of numerical met...
In this paper, we propose a new nonlinear conjugate gradient method (FRA) that satisfies the sufficient descent condition and achieves global convergence under the inexact strong Wolfe-Powell line search. Our numerical experiments show its efficiency in solving a set of problems from the CUTEst package; the proposed formula gives excellent results in CPU time, number of iterations, and ratings when compared to the WYL, DY, PRP, and FR me...
Chart of the number of search results per year
Click on the chart to filter the results by publication year