Search results for: prp conjugate gradient algorithm

Number of results: 901220

1996
Laurence T. Yang

The performance of CGLS, a basic iterative method whose main idea is to organize the computation of the conjugate gradient method applied to the normal equations for solving least squares problems, is always limited on modern architectures because of the global communication required for inner products. Inner products therefore often present a bottleneck, and it is desirable to reduce or even eliminate ...
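
For context, here is a minimal sketch of the textbook CGLS iteration (conjugate gradients applied implicitly to the normal equations A^T A x = A^T b) in plain NumPy; the two inner products per iteration are the global reductions the abstract refers to. This is the generic serial formulation, not the parallel variant studied in the paper.

```python
import numpy as np

def cgls(A, b, iters=100, tol=1e-10):
    """Standard CGLS: solve min ||A x - b||_2 without forming A^T A."""
    x = np.zeros(A.shape[1])
    r = b - A @ x            # least-squares residual
    s = A.T @ r              # gradient-related vector A^T r
    p = s.copy()
    gamma = s @ s            # inner product #1 (a global reduction in parallel)
    for _ in range(iters):
        q = A @ p
        alpha = gamma / (q @ q)   # inner product #2
        x += alpha * p
        r -= alpha * q
        s = A.T @ r
        gamma_new = s @ s
        if np.sqrt(gamma_new) < tol:
            break
        p = s + (gamma_new / gamma) * p
        gamma = gamma_new
    return x
```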

2014
Bhavna Sharma K. Venugopalan

Classification is one of the most important tasks in application areas of artificial neural networks (ANN). Training neural networks is a complex task in the supervised learning field of research. The main difficulty in adopting ANNs is to find the most appropriate combination of learning, transfer, and training functions for the classification task. We compared the performances of three types of tr...

Journal: :Optimization Letters 2013
Anatoly A. Zhigljavsky Luc Pronzato Elena Bukina

We consider gradient algorithms for minimizing a quadratic function in R^n with large n. We suggest a particular sequence of step-lengths and demonstrate that the resulting gradient algorithm has a convergence rate comparable with that of Conjugate Gradients and other methods based on the use of Krylov spaces. When the problem is large and sparse, the proposed algorithm can be more efficient tha...
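
As a point of reference, this is the generic gradient iteration on a quadratic f(x) = 0.5 x^T A x - b^T x that such step-length sequences plug into; the particular sequence proposed in the paper is not reproduced here, so the sketch falls back to the classical Cauchy (exact line-search) step as a placeholder.

```python
import numpy as np

def gradient_quadratic(A, b, steps=None, iters=200):
    """Gradient method x_{k+1} = x_k - gamma_k g_k for f(x) = 0.5 x'Ax - b'x.

    `steps` may be any sequence of step-lengths gamma_k; if None, the classical
    Cauchy step gamma_k = (g'g)/(g'Ag) is used (a placeholder, not the
    particular sequence proposed in the paper).
    """
    x = np.zeros_like(b)
    for k in range(iters):
        g = A @ x - b                        # gradient of the quadratic
        if np.linalg.norm(g) < 1e-12:
            break
        if steps is None:
            gamma = (g @ g) / (g @ (A @ g))  # exact line search on the quadratic
        else:
            gamma = steps[k]
        x = x - gamma * g
    return x
```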

2008
Neculai Andrei

Conjugate gradient algorithms are very powerful methods for solving large-scale unconstrained optimization problems, characterized by low memory requirements and strong local and global convergence properties. Over 25 variants of different conjugate gradient methods are known. In this paper we propose a fundamentally different method, in which the well-known parameter β_k is computed by an appro...
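
For orientation, the sketch below shows the classical Polak-Ribiere-Polyak (PRP+) choice of β_k, the variant named in the search query; it is the standard textbook update with a simple Armijo backtracking line search, not the new computation of β_k proposed in this paper (the literature's convergence theory usually assumes Wolfe-type steps).

```python
import numpy as np

def prp_cg(f, grad, x0, iters=500, tol=1e-8):
    """Classical PRP+ nonlinear conjugate gradient method.

    beta_k = max(0, g_{k+1}'(g_{k+1} - g_k) / ||g_k||^2)   (PRP+ truncation)
    d_{k+1} = -g_{k+1} + beta_k * d_k
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along d (kept simple for the sketch)
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PRP+ parameter
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```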

2004
Jadranka Skorin-Kapov Wendy Tang

In this paper we explore different strategies to guide the backpropagation algorithm used for training artificial neural networks. Two different variants of the steepest-descent-based backpropagation algorithm and four different variants of the conjugate gradient algorithm are tested. The variants differ in whether or not the time component is used, and whether or not additional gradient information is utili...

Journal: :SIAM Journal on Scientific Computing 2021

On the Convergence Rate of Variants of the Conjugate Gradient Algorithm in Finite Precision Arithmetic

Journal: :Rairo-operations Research 2022

In this paper, we propose a new hybrid conjugate gradient algorithm for solving unconstrained optimization problems as a convex combination of the Dai-Yuan, conjugate-descent, and Hestenes-Stiefel algorithms. The method is globally convergent and satisfies the sufficient descent condition under the strong Wolfe conditions. The numerical results show that the new nonlinear algorithm is efficient and robust.
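
A minimal sketch of the convex-combination idea, using the classical DY, CD and HS formulas; the mixing weights theta1 and theta2 below are placeholders, since the paper's own rule for choosing the combination is not given in the snippet.

```python
import numpy as np

def hybrid_beta(g_new, g_old, d_old, theta1=1/3, theta2=1/3):
    """Convex combination of the classical DY, CD and HS update parameters.

    theta1, theta2 (with theta3 = 1 - theta1 - theta2) are placeholder weights,
    not the combination rule derived in the paper.
    """
    y = g_new - g_old
    beta_dy = (g_new @ g_new) / (d_old @ y)         # Dai-Yuan
    beta_cd = (g_new @ g_new) / (-(d_old @ g_old))  # conjugate descent
    beta_hs = (g_new @ y) / (d_old @ y)             # Hestenes-Stiefel
    theta3 = 1.0 - theta1 - theta2
    return theta1 * beta_dy + theta2 * beta_cd + theta3 * beta_hs
```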

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independent of the line search method, based on an eigenvalue analysis. The globa...
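
For illustration, the sketch below builds a generic three-term direction on top of the classical Liu-Storey parameter; the third-term coefficient `theta` is a placeholder, since the specific extensions (chosen via eigenvalue analysis to guarantee sufficient descent) are not reproduced here.

```python
import numpy as np

def ls_three_term_direction(g_new, g_old, d_old, theta=0.0):
    """Generic three-term search direction built on the classical Liu-Storey beta.

    beta_LS = g_{k+1}' y_k / (-g_k' d_k),  with  y_k = g_{k+1} - g_k
    d_{k+1} = -g_{k+1} + beta_LS * d_k + theta * y_k

    `theta` is a placeholder coefficient, not the paper's derived choice.
    """
    y = g_new - g_old
    beta_ls = (g_new @ y) / (-(g_old @ d_old))   # classical Liu-Storey parameter
    return -g_new + beta_ls * d_old + theta * y
```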

Journal: :Math. Program. 1978
Albert G. Buckley

Although quasi-Newton algorithms generally converge in fewer iterations than conjugate gradient algorithms, they have the disadvantage of requiring substantially more storage. An algorithm will be described which uses an intermediate (and variable) amount of storage and whose convergence is also intermediate, that is, generally better than that observed for conjugate gradient...
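
To illustrate the variable-storage trade-off between conjugate gradient and quasi-Newton methods, here is the later limited-memory BFGS two-loop recursion (not the specific 1978 algorithm of this paper): with m = 0 stored pairs it reduces to steepest descent, and larger m approaches full quasi-Newton behaviour at the cost of more storage.

```python
import numpy as np

def lbfgs_direction(g, memory):
    """L-BFGS two-loop recursion: compute -H_k g from the stored (s, y) pairs.

    `memory` is a list of the m most recent pairs (s_i, y_i), oldest first,
    where s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i.
    """
    q = g.copy()
    alphas = []
    for s, y in reversed(memory):            # first loop: newest pair first
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        alphas.append((rho, a))
    if memory:                                # initial Hessian scaling H_0
        s, y = memory[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), (rho, a) in zip(memory, reversed(alphas)):  # second loop: oldest first
        b = rho * (y @ q)
        q += (a - b) * s
    return -q
```

In practice `memory` would be kept as a bounded buffer (for example a deque with maxlen m), updated with a new (s, y) pair after each accepted step.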
