Search results for: global gradient algorithm
Number of results: 1,260,152. Filter results by year:
Based on an eigenvalue analysis, a new proof of the sufficient descent property of the modified Polak-Ribière-Polyak conjugate gradient method proposed by Yu et al. is presented.
A new method for proving the global convergence of nonlinear conjugate gradient methods, the spectral method, is presented in this paper and applied to a new conjugate gradient algorithm with the sufficient descent property. By analyzing the descent property, several concrete forms of this algorithm are suggested. Under standard Wolfe line searches, the global convergence of the new ...
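The style of iteration these conjugate gradient abstracts analyze can be sketched as follows. This is a minimal illustration of the Polak-Ribière-Polyak (PRP+) update on a toy quadratic, with simple Armijo backtracking standing in for the Wolfe line search assumed by the convergence theory; it is not the specific algorithm of any paper listed here.

```python
# Hedged sketch of a PRP+ nonlinear conjugate gradient iteration on a
# toy quadratic f(x, y) = x^2 + 10*y^2. Armijo backtracking stands in
# for the Wolfe line search used in the convergence analysis.

def f(x):
    return x[0] ** 2 + 10.0 * x[1] ** 2

def grad(x):
    return [2.0 * x[0], 20.0 * x[1]]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def prp_cg(x, iters=100, tol=1e-12):
    g = grad(x)
    d = [-gi for gi in g]                      # start with steepest descent
    for _ in range(iters):
        if dot(g, g) < tol:
            break
        alpha, fx = 1.0, f(x)
        # Armijo backtracking: shrink alpha until sufficient decrease holds
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + 1e-4 * alpha * dot(g, d):
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        y = [gn - gi for gn, gi in zip(g_new, g)]
        beta = max(0.0, dot(g_new, y) / dot(g, g))   # PRP+ beta, truncated at 0
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        if dot(g_new, d) >= 0:                 # safeguard: restart if not descent
            d = [-gn for gn in g_new]
        g = g_new
    return x

x_star = prp_cg([3.0, 1.0])
```

The truncation of beta at zero and the steepest-descent restart are the usual safeguards that keep every search direction a descent direction, which is what the sufficient descent analyses above are concerned with.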
The stability of the learning rate in neural network identifiers and controllers is one of the challenging issues that attracts great interest from neural network researchers. This paper suggests an adaptive gradient descent algorithm with stable learning laws for the modified dynamic neural network (MDNN) and studies the stability of this algorithm. Also, a stable learning algorithm for the parameters of ...
This paper proposes global learning of neural networks by a hybrid optimization algorithm. The hybrid algorithm combines stochastic approximation with gradient descent. The stochastic approximation is first applied to estimate an approximate point inclined toward the global optimum, escaping from a local minimum, and then the backpropagation (BP) algorithm is applied for high-speed convergence as ...
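The two-stage idea in that snippet can be sketched as follows: a stochastic perturbation proposes a point away from the current basin, and plain gradient descent (standing in here for backpropagation) refines it. The 1-D test function, noise scale, and restart count are illustrative choices, not taken from the paper.

```python
import math
import random

# Hedged sketch of a hybrid global scheme: stochastic jumps to escape
# local minima, followed by local gradient descent refinement.

def gradient_descent(grad, x, step=0.01, iters=500):
    for _ in range(iters):
        x = x - step * grad(x)
    return x

def hybrid_minimize(f, grad, x0, restarts=30, noise=2.0, seed=0):
    rng = random.Random(seed)
    best = gradient_descent(grad, x0)          # local refinement stage
    for _ in range(restarts):
        # stochastic stage: jump away from the current point, then refine
        cand = gradient_descent(grad, best + rng.gauss(0.0, noise))
        if f(cand) < f(best):                  # keep only improvements
            best = cand
    return best

# multimodal 1-D test: x**2 + 2*sin(3*x) has several local minima
f = lambda x: x * x + 2.0 * math.sin(3.0 * x)
df = lambda x: 2.0 * x + 6.0 * math.cos(3.0 * x)
best = hybrid_minimize(f, df, x0=2.0)
```

Because candidates are accepted only when they improve the objective, the hybrid result is never worse than what plain gradient descent from the same start achieves.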
The particle swarm optimization (PSO) method is an instance of a successful application of the philosophy of bounded rationality and decentralized decision making for solving global optimization problems. A number of advantages with respect to other evolutionary algorithms are attributed to PSO making it a prospective candidate for optimum structural design. The PSO-based algorithm is robust an...
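The decentralized decision making that the PSO abstract refers to is visible in the standard velocity update, where each particle blends its own memory with the swarm's. Below is a minimal PSO sketch on the sphere function; the swarm size, inertia, and the two acceleration coefficients are conventional textbook choices, not parameters from the paper.

```python
import random

# Minimal particle swarm optimization (PSO) sketch on the sphere
# function f(x) = sum(x_i ** 2).

def pso(f, dim=2, swarm=20, iters=200, seed=0):
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5              # inertia, cognitive, social weights
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]            # each particle's best-seen position
    pbest_val = [f(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                # velocity: inertia + pull toward own best + pull toward swarm best
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best_pos, best_val = pso(lambda x: sum(xi * xi for xi in x))
```

Note that no gradient is evaluated anywhere, which is why PSO is attractive for the structural design problems mentioned above, where objective gradients may be unavailable or expensive.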
A new meta-heuristic method based on neuronal communication (NC) is introduced in this article. Neuronal communication describes how data is exchanged between neurons in the nervous system; this pattern works efficiently in nature. The present paper shows that the same mechanism can be used to find the global minimum. In addition, since only a few neurons participate in each step of the method, ...
Scalability of Krylov subspace methods suffers from costly global synchronization steps that arise in dot-products and norm calculations on parallel machines. In this work, a modified Conjugate Gradient (CG) method is presented that removes the costly global synchronization steps from the standard CG algorithm by only performing a single non-blocking reduction per iteration. This global communi...
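The single-reduction idea that snippet describes rests on a known algebraic rearrangement of CG, often attributed to Chronopoulos and Gear, in which the two inner products of each iteration share the same vector and can therefore be fused into one (potentially non-blocking) reduction. The serial sketch below illustrates that rearrangement on a small dense system; the matrix and sizes are illustrative, and no actual MPI communication is shown.

```python
# Sketch of a CG variant whose two inner products per iteration,
# (r, r) and (r, w) with w = A*r, are computed together, so a parallel
# implementation needs only a single fused reduction per iteration.

def matvec(A, v):
    return [sum(aij * vj for aij, vj in zip(row, v)) for row in A]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def axpy(alpha, x, y):                   # returns alpha*x + y
    return [alpha * xi + yi for xi, yi in zip(x, y)]

def cg_single_reduction(A, b, iters=10, tol=1e-12):
    n = len(b)
    x = [0.0] * n
    r = b[:]                             # r0 = b - A*x0 with x0 = 0
    w = matvec(A, r)
    p = [0.0] * n
    s = [0.0] * n                        # s tracks A*p by recurrence
    gamma_old = alpha = None
    for i in range(iters):
        gamma, delta = dot(r, r), dot(r, w)   # the one fused reduction
        if gamma < tol:
            break
        if i == 0:
            beta, alpha = 0.0, gamma / delta
        else:
            beta = gamma / gamma_old
            alpha = gamma / (delta - beta * gamma / alpha)
        p = axpy(beta, p, r)             # p = r + beta*p
        s = axpy(beta, s, w)             # s = w + beta*s  (= A*p)
        x = axpy(alpha, p, x)            # x += alpha*p
        r = axpy(-alpha, s, r)           # r -= alpha*s
        w = matvec(A, r)
        gamma_old = gamma
    return x, r

A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]
x, r = cg_single_reduction(A, b)
```

In the parallel setting the paper targets, the fused `dot` pair becomes one non-blocking all-reduce overlapped with the matrix-vector product, which is where the latency hiding comes from.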
We present a new approach to solving nonlinear complementarity problems based on the normal map and adaptations of the projected gradient algorithm. We characterize a Gauss-Newton point for nonlinear complementarity problems and show that it is sufficient to check at most two cells of the related normal manifold to determine such points. Our algorithm uses the projected gradient method on one cel...
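The projected gradient step that abstract builds on is simple to state: take a gradient step, then project back onto the feasible set. Here it is sketched for the nonnegative orthant, the cone underlying complementarity problems; the objective, step size, and iteration count are illustrative.

```python
# Minimal projected gradient sketch on the nonnegative orthant.

def project_nonneg(x):
    # Euclidean projection onto {x : x >= 0}, componentwise clamp
    return [max(0.0, xi) for xi in x]

def projected_gradient(grad, x, step=0.1, iters=200):
    for _ in range(iters):
        # gradient step followed by projection back onto the feasible set
        x = project_nonneg([xi - step * gi for xi, gi in zip(x, grad(x))])
    return x

# minimize f(x, y) = (x - 1)^2 + (y + 2)^2 subject to x, y >= 0;
# the constrained minimizer is (1, 0), with y pinned at the boundary
grad = lambda x: [2.0 * (x[0] - 1.0), 2.0 * (x[1] + 2.0)]
sol = projected_gradient(grad, [5.0, 5.0])
```

The second coordinate ends up active at the boundary (y = 0) with a nonzero gradient component pointing out of the feasible set, which is exactly the complementarity structure the normal-map machinery in the paper generalizes.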
A modification of the Dai-Yuan conjugate gradient algorithm is proposed. With the exact line search, the algorithm reduces to the original version of the Dai and Yuan computational scheme. With an inexact line search, the algorithm satisfies both the sufficient descent condition and the conjugacy condition. A global convergence result is proved when the Wolfe line search conditions are used. Computational result...
In [1] (Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization, J. Optim. Theory Appl. 141 (2009) 249-264), an efficient hybrid conjugate gradient algorithm, the CCOMB algorithm, is proposed for solving unconstrained optimization problems. However, the proof of Theorem 2.1 in [1] is incorrect due to an erroneous inequality used to indicate the descent property for the s...