Search results for: conjugate gradient descent

Number of results: 174860

2012

I compare two common techniques to compute matrix factorizations for recommender systems, specifically using the Netflix prize data set. Accuracy, run-time, and scalability are discussed for stochastic gradient descent and non-linear conjugate gradient.
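As background for the two techniques compared above, here is a minimal, illustrative sketch of SGD-based matrix factorization on explicit ratings (FunkSVD-style). The function name, hyperparameters, and toy data are assumptions made for the example and are not taken from the cited work.

```python
# Minimal sketch of SGD matrix factorization for explicit ratings (FunkSVD-style).
# All names, hyperparameters, and the toy data below are illustrative assumptions.
import numpy as np

def sgd_mf(ratings, n_users, n_items, k=20, lr=0.01, reg=0.05, epochs=10, seed=0):
    """ratings: list of (user, item, value) triples; returns factor matrices P, Q."""
    rng = np.random.default_rng(seed)
    P = rng.normal(scale=0.1, size=(n_users, k))    # user latent factors
    Q = rng.normal(scale=0.1, size=(n_items, k))    # item latent factors
    for _ in range(epochs):
        rng.shuffle(ratings)                        # visit ratings in random order
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]                   # prediction error for this rating
            pu = P[u].copy()                        # keep old user factors for the item update
            P[u] += lr * (err * Q[i] - reg * P[u])  # regularized gradient step, user side
            Q[i] += lr * (err * pu - reg * Q[i])    # regularized gradient step, item side
    return P, Q

# Toy usage on a 3-user x 2-item rating matrix (1-5 scale)
data = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (2, 1, 1.0)]
P, Q = sgd_mf(data, n_users=3, n_items=2, k=4, epochs=100)
print(P @ Q.T)  # reconstructed rating estimates
```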

2008
Neculai Andrei

New accelerated nonlinear conjugate gradient algorithms, which are mainly modifications of Dai and Yuan's method for unconstrained optimization, are proposed. With an exact line search, the algorithm reduces to the Dai and Yuan conjugate gradient computational scheme. With an inexact line search, the algorithm satisfies the sufficient descent condition. Since the step lengths in conjugate gradient algo...
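For reference, the Dai and Yuan scheme and the sufficient descent condition mentioned in this abstract are, in standard notation (the accelerated variants proposed in the paper modify this baseline):
\[
d_{k+1} = -g_{k+1} + \beta_k^{DY} d_k, \qquad
\beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\top}(g_{k+1} - g_k)}, \qquad d_0 = -g_0,
\]
with the sufficient descent condition requiring, for some constant \(c > 0\),
\[
g_k^{\top} d_k \le -c\,\|g_k\|^2 \quad \text{for all } k.
\]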

Journal: Appl. Math. Lett. 2008
Neculai Andrei

A modification of the Dai-Yuan conjugate gradient algorithm is proposed. With an exact line search, the algorithm reduces to the original version of the Dai and Yuan computational scheme. With an inexact line search, the algorithm satisfies both the sufficient descent condition and the conjugacy condition. A global convergence result is proved when the Wolfe line search conditions are used. Computational result...
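The Wolfe line search conditions and the conjugacy condition referred to here are the standard ones: with step length \(\alpha_k\) along direction \(d_k\), gradient \(g_k = \nabla f(x_k)\), and constants \(0 < c_1 < c_2 < 1\),
\[
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k\, g_k^{\top} d_k, \qquad
\nabla f(x_k + \alpha_k d_k)^{\top} d_k \ge c_2\, g_k^{\top} d_k,
\]
while the (pure) conjugacy condition on the search directions is \(d_{k+1}^{\top} y_k = 0\) with \(y_k = g_{k+1} - g_k\).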

2015
Qiong Li Junfeng Yang

Conjugate gradient methods are efficient for smooth optimization problems, but few conjugate gradient based methods exist for solving a possibly nondifferentiable convex minimization problem. In this paper, by making full use of the inherent properties of Moreau-Yosida regularization and the descent property of a modified conjugate gradient method, we propose a modified Fletcher-Reeves-type method f...
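For context, the Moreau-Yosida regularization of a convex function \(f\) with parameter \(\lambda > 0\) and the classical Fletcher-Reeves coefficient are (standard definitions; the paper's specific modification is not shown in this snippet):
\[
F_{\lambda}(x) = \min_{y} \left\{ f(y) + \frac{1}{2\lambda}\|y - x\|^2 \right\},
\qquad
\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2},
\]
where \(F_{\lambda}\) is continuously differentiable even when \(f\) is not, with \(\nabla F_{\lambda}(x) = (x - p_{\lambda}(x))/\lambda\) and \(p_{\lambda}(x)\) the minimizer in the definition (the proximal point).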

2012
N. M. Nawi M. R. Ransing R. S. Ransing

The conjugate gradient optimization algorithm usually used for nonlinear least squares is presented and combined with the modified back propagation algorithm, yielding a new fast training algorithm for multilayer perceptrons (MLP), called CGFR/AG. The approach presented in the paper consists of three steps: (1) modification of the standard back propagation algorithm by introducing a gain variation term of th...
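The sketch below is a generic illustration of driving back propagation with a conjugate gradient direction; it is not the CGFR/AG algorithm itself, and the network size, the XOR data, and the Armijo backtracking line search are assumptions made for the example.

```python
# Generic illustration only (not the CGFR/AG algorithm from the paper):
# a tiny one-hidden-layer MLP trained on XOR, where the plain backprop gradient
# step is replaced by a Fletcher-Reeves conjugate gradient direction.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

theta = rng.normal(scale=0.5, size=2*4 + 4 + 4*1 + 1)  # flattened W1, b1, W2, b2

def unpack(t):
    W1 = t[:8].reshape(2, 4); b1 = t[8:12]
    W2 = t[12:16].reshape(4, 1); b2 = t[16:]
    return W1, b1, W2, b2

def loss_and_grad(t):
    W1, b1, W2, b2 = unpack(t)
    a1 = np.tanh(X @ W1 + b1)               # hidden layer
    out = a1 @ W2 + b2                      # linear output
    err = out - y
    loss = 0.5 * np.mean(err ** 2)
    dout = err / len(X)                     # backpropagation of the squared error
    gW2 = a1.T @ dout; gb2 = dout.sum(0)
    dz1 = (dout @ W2.T) * (1 - a1 ** 2)     # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ dz1; gb1 = dz1.sum(0)
    return loss, np.concatenate([gW1.ravel(), gb1, gW2.ravel(), gb2])

def backtracking(t, d, loss0, g0, alpha=1.0, c1=1e-4, shrink=0.5):
    # Armijo backtracking: shrink alpha until sufficient decrease holds.
    while True:
        new_loss, _ = loss_and_grad(t + alpha * d)
        if new_loss <= loss0 + c1 * alpha * (g0 @ d) or alpha < 1e-10:
            return alpha
        alpha *= shrink

loss, g = loss_and_grad(theta)
d = -g                                      # first direction: steepest descent
for k in range(300):
    alpha = backtracking(theta, d, loss, g)
    theta = theta + alpha * d
    loss, g_new = loss_and_grad(theta)
    beta = (g_new @ g_new) / (g @ g)        # Fletcher-Reeves coefficient
    d = -g_new + beta * d
    if g_new @ d >= 0:                      # restart if the new direction is not downhill
        d = -g_new
    g = g_new

print("final loss:", loss)
```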

2004
Jiang Minghu Zhu

This paper presents a hybrid algorithm for global optimization of the dynamic learning rate for multilayer feedforward neural networks (MLFNN). The effect of inexact line search on conjugacy was studied, and a generalized conjugate gradient method based on this effect was proposed and shown to have global convergence for error backpropagation in MLFNN. The descent property and global convergence were g...
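The standard background here is that an exact line search makes the new gradient orthogonal to the previous search direction, which is what underlies conjugacy:
\[
\alpha_k = \arg\min_{\alpha > 0} f(x_k + \alpha d_k) \;\Longrightarrow\; g_{k+1}^{\top} d_k = 0,
\]
whereas an inexact line search generally leaves \(g_{k+1}^{\top} d_k \neq 0\), presumably the effect on conjugacy referred to above.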

2011
Matthieu Kowalski

This paper proposes an enhancement of the nonlinear conjugate gradient algorithm for some non-smooth problems. We first extend some results on descent algorithms from the smooth case to convex non-smooth functions. We then construct a conjugate descent algorithm based on the proximity operator to obtain a descent direction. We finally provide a convergence analysis of this algorithm, even when ...
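The proximity operator mentioned here is the standard one: for a convex function \(h\) and \(\gamma > 0\),
\[
\operatorname{prox}_{\gamma h}(x) = \arg\min_{u} \left\{ h(u) + \frac{1}{2\gamma}\|u - x\|^2 \right\},
\]
and for a composite objective \(f + h\) with \(f\) smooth, \(d_k = \operatorname{prox}_{\gamma h}\!\big(x_k - \gamma \nabla f(x_k)\big) - x_k\) is a classical descent direction at non-stationary points; the paper builds its conjugate descent direction from the proximity operator, though its exact form is not visible in this snippet.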

1998
Yuhong Dai Jiye Han Guanghui Liu Defeng Sun Hongxia Yin

Recently, important contributions to convergence studies of conjugate gradient methods have been made by Gilbert and Nocedal [6]. They introduce a "sufficient descent condition" to establish global convergence results, whereas this condition is not needed in the convergence analyses of Newton and quasi-Newton methods. [6] hints that the sufficient descent condition, which was enforced by their two-st...

Chart: number of search results per year
