Search results for: conjugate gradient descent
Number of results: 174,860
In this paper, by using the smoothing Fischer-Burmeister function, we present a new smoothing conjugate gradient method for solving nonlinear nonsmooth complementarity problems. The line search we use guarantees the descent property of the method. Under suitable conditions, the new smoothing conjugate gradient method is proved to be globally convergent. Finally, preliminary numerical experiments sh...
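A common smoothing of the Fischer-Burmeister function used in complementarity methods is phi_mu(a, b) = a + b - sqrt(a^2 + b^2 + 2*mu^2); the abstract does not show the paper's exact variant, so the sketch below assumes this standard form.

```python
import numpy as np

def smoothed_fb(a, b, mu):
    """Smoothed Fischer-Burmeister function phi_mu(a, b).

    Assumed standard form (the paper's exact smoothing is not shown here):
    phi_mu(a, b) = a + b - sqrt(a^2 + b^2 + 2*mu^2).
    As mu -> 0 it recovers the nonsmooth FB function, whose zeros
    characterize complementarity: a >= 0, b >= 0, a*b = 0.
    """
    return a + b - np.sqrt(a**2 + b**2 + 2.0 * mu**2)
```

For mu > 0 the function is differentiable everywhere, which is what lets a smooth conjugate gradient method be applied to the reformulated problem.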
In this paper, based on the efficient Conjugate Descent ({\tt CD}) method, two generalized {\tt CD} algorithms are proposed to solve unconstrained optimization problems. These are three-term conjugate gradient methods whose directions, generated using parameters independent of the line search, satisfy the sufficient descent condition. Furthermore, under the strong Wolfe line search, the global convergence is proved...
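The baseline CD method referred to here uses Fletcher's parameter beta_CD = ||g_k||^2 / (-d_{k-1}^T g_{k-1}); a minimal direction update is sketched below (the paper's three-term generalizations add further terms that are not reproduced here).

```python
import numpy as np

def cd_direction(g_new, g_old, d_old):
    """One direction update of the Conjugate Descent (CD) method.

    beta_CD = ||g_k||^2 / (-d_{k-1}^T g_{k-1}).
    The first direction is taken as d_0 = -g_0; subsequent directions are
    d_k = -g_k + beta_CD * d_{k-1}.
    """
    beta = (g_new @ g_new) / (-(d_old @ g_old))
    return -g_new + beta * d_old
```

Under a strong Wolfe line search the denominator stays positive for descent directions, which is what keeps beta_CD well defined.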
A multi-layer neural network with multiple hidden layers was trained as an autoencoder using the steepest descent, scaled conjugate gradient, and Alopex algorithms. These algorithms were used in different combinations, with steepest descent and Alopex used as pretraining algorithms followed by training using scaled conjugate gradient. All the algorithms were also used to train the autoencoders withou...
In this paper we explore different strategies to guide the backpropagation algorithm used for training artificial neural networks. Two different variants of the steepest-descent-based backpropagation algorithm and four different variants of the conjugate gradient algorithm are tested. The variants differ in whether or not the time component is used, and whether or not additional gradient information is utili...
A new method for proving the global convergence of nonlinear conjugate gradient methods, the spectral method, is presented in this paper, and it is applied to a new conjugate gradient algorithm with the sufficient descent property. By analyzing the descent property, several concrete forms of this algorithm are suggested. Under standard Wolfe line searches, the global convergence of the new ...
Many scientific applications require one to successively solve linear systems Ax = b with different right-hand sides b and a fixed symmetric positive definite matrix A. The conjugate gradient method applied to the first system generates a Krylov subspace which can be efficiently recycled, via orthogonal projections, in subsequent systems. A modified conjugate gradient method is then applied with ...
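The baseline method this abstract builds on is plain conjugate gradient for a single SPD system; a minimal sketch is below (the Krylov-subspace recycling across right-hand sides described in the paper is not implemented here).

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Plain conjugate gradient for a symmetric positive definite matrix A.

    Minimal textbook version; in exact arithmetic it converges in at most
    n iterations for an n x n system.
    """
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    r = b - A @ x          # residual
    p = r.copy()           # first search direction
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # A-conjugate update of the direction
        rs_old = rs_new
    return x
```

Each iteration requires only one matrix-vector product with A, which is why the Krylov subspace it builds is cheap to generate and attractive to recycle across systems sharing the same A.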
This paper proposes a line search technique to satisfy a relaxed form of the strong Wolfe conditions in order to guarantee the descent condition at each iteration of the Polak-Ribière-Polyak conjugate gradient algorithm. It is proved that this line search algorithm preserves the usual convergence properties of any descent algorithm. In particular, it is shown that the Zoutendijk condition holds...
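The strong Wolfe conditions referenced here combine a sufficient-decrease (Armijo) test with a curvature bound on the absolute directional derivative; a hypothetical checker (not from the paper, which uses a relaxed form) is sketched below.

```python
import numpy as np

def strong_wolfe_holds(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the strong Wolfe conditions at step length alpha along d.

    Hypothetical helper for illustration: f and grad are the objective and
    its gradient; d is assumed to be a descent direction (grad(x) @ d < 0),
    and 0 < c1 < c2 < 1.
    """
    g0 = grad(x) @ d
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g0       # sufficient decrease
    curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(g0)  # strong curvature
    return armijo and curvature
```

Enforcing these conditions (or a relaxed variant, as in the paper) at every iteration is what lets one guarantee that the Polak-Ribière-Polyak direction remains a descent direction.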
The purpose of this study is to analyze the performance of the backpropagation algorithm with changing training patterns and a second momentum term in feed-forward neural networks. This analysis is conducted on 250 different three-letter lowercase words from the English alphabet. These words are presented to two vertical segmentation programs which are designed in MATLAB and based on portions (1...