Search results for: conjugate gradient method

Number of results: 1754596

2017
Yuanyuan Huang Changhe Liu

In this paper, Dai-Kou type conjugate gradient methods are developed to solve the optimality condition of an unconstrained optimization problem; they use only gradient information and therefore have a broader application scope. Under suitable conditions, the developed methods are globally convergent. Numerical tests and comparisons with the PRP+ conjugate gradient method, which also uses only gradients, show that the m...
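
As a point of reference for the comparison mentioned above, the following is a minimal sketch of a PRP+ style conjugate gradient loop, not the paper's Dai-Kou type methods. The toy quadratic objective, the simple backtracking (Armijo) line search, and the steepest-descent restart safeguard are all assumptions chosen for illustration.

```python
# Hedged sketch of the PRP+ baseline: nonlinear CG using only gradient
# values, with beta_k = max( g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2 , 0 ).
import numpy as np

def prp_plus_beta(g_new, g_old):
    return max(g_new @ (g_new - g_old) / (g_old @ g_old), 0.0)

def armijo_step(f, x, d, g, t=1.0, c=1e-4, shrink=0.5):
    # simple backtracking line search (an illustrative choice, not the paper's)
    fx, slope = f(x), g @ d
    while f(x + t * d) > fx + c * t * slope:
        t *= shrink
    return t

def prp_plus_cg(f, grad, x0, iters=500, tol=1e-8):
    x, g = x0.copy(), grad(x0)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        t = armijo_step(f, x, d, g)
        x = x + t * d
        g_new = grad(x)
        d = -g_new + prp_plus_beta(g_new, g) * d
        if g_new @ d >= 0:          # safeguard: restart with steepest descent
            d = -g_new
        g = g_new
    return x

# toy strongly convex quadratic test problem
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
Q = M @ M.T / n + np.eye(n)
b = rng.standard_normal(n)
f = lambda x: 0.5 * x @ Q @ x - b @ x
grad = lambda x: Q @ x - b
x = prp_plus_cg(f, grad, np.zeros(n))
print(np.linalg.norm(grad(x)))
```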

Hamed Memarian fard,

The use of artificial neural networks has increased in many areas of engineering. In particular, this method has been applied to many geotechnical engineering problems and has demonstrated some degree of success. A review of the literature reveals that it has been used successfully in modeling soil behavior, site characterization, earth retaining structures, settlement of structures, slope stabilit...

Journal: J. Applied Mathematics 2012
Jin-kui Liu Xianglin Du Kairong Wang

Journal: J. Comput. Physics 2009
Jianke Yang

In this paper, Newton-conjugate-gradient methods are developed for solitary wave computations. These methods are based on Newton iterations, coupled with conjugate-gradient iterations to solve the resulting linear Newton-correction equation. When the linearization operator is self-adjoint, the preconditioned conjugate-gradient method is proposed to solve this linear equation. If the lineariz...
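
Since the abstract only outlines the scheme, the sketch below shows a generic Newton-conjugate-gradient loop under stated assumptions: a toy residual F(u) = A u + u^3 - b whose Jacobian is symmetric positive definite, and a plain (unpreconditioned) inner CG solve. The solitary-wave equations, preconditioner, and stopping rules of the paper are not reproduced.

```python
# Minimal Newton-CG sketch: Newton iterations in which each linear
# Newton-correction equation J(u) du = -F(u) is solved by conjugate gradients.
import numpy as np

def F(u):
    # toy nonlinear residual with a symmetric Jacobian: A u + u**3 - b
    return A @ u + u**3 - b

def J_matvec(u, v):
    # action of the Jacobian J(u) = A + 3 diag(u**2) on a vector v
    return A @ v + 3.0 * u**2 * v

def cg(matvec, rhs, tol=1e-10, maxit=200):
    # standard conjugate gradient for a symmetric positive definite operator
    x = np.zeros_like(rhs)
    r = rhs - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(maxit):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)

u = np.zeros(n)
for it in range(20):                  # outer Newton loop
    res = F(u)
    if np.linalg.norm(res) < 1e-8:
        break
    du = cg(lambda v: J_matvec(u, v), -res)   # inner CG solve
    u += du
print(it, np.linalg.norm(F(u)))
```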

Journal: SIAM Journal on Optimization 1999
Yu-Hong Dai Ya-Xiang Yuan

Conjugate gradient methods are widely used for unconstrained optimization, especially for large-scale problems. However, the strong Wolfe conditions are usually used in the analyses and implementations of conjugate gradient methods. This paper presents a new version of the conjugate gradient method, which converges globally provided the line search satisfies the standard Wolfe conditions. The condit...
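
A minimal sketch of such an iteration, assuming the Dai-Yuan formula beta_k = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)) commonly associated with this paper. The toy quadratic and the use of SciPy's line_search (which enforces the strong Wolfe conditions, and hence the standard ones) are assumptions, not the paper's setup.

```python
# Hedged sketch of a nonlinear CG loop with the Dai-Yuan beta and a Wolfe line search.
import numpy as np
from scipy.optimize import line_search

def f(x):          # toy strongly convex quadratic test function
    return 0.5 * x @ Q @ x - b @ x

def grad(x):
    return Q @ x - b

rng = np.random.default_rng(1)
n = 30
M = rng.standard_normal((n, n))
Q = M @ M.T + np.eye(n)
b = rng.standard_normal(n)

x = np.zeros(n)
g = grad(x)
d = -g                                   # first direction: steepest descent
for k in range(200):
    if np.linalg.norm(g) < 1e-8:
        break
    alpha = line_search(f, grad, x, d, gfk=g)[0]
    if alpha is None:                    # line search failed; crude fallback
        alpha = 1e-3
    x_new = x + alpha * d
    g_new = grad(x_new)
    beta_dy = (g_new @ g_new) / (d @ (g_new - g))   # Dai-Yuan beta
    d = -g_new + beta_dy * d
    x, g = x_new, g_new
print(k, np.linalg.norm(g))
```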

2008
Neculai Andrei

Conjugate gradient algorithms are very powerful methods for solving large-scale unconstrained optimization problems, characterized by low memory requirements and strong local and global convergence properties. Over 25 variants of different conjugate gradient methods are known. In this paper we propose a fundamentally different method, in which the well-known parameter β_k is computed by an appro...

H. Attari S.H. Nasseri,

In this paper, the Chebyshev acceleration technique is used to solve the fuzzy linear system (FLS). This method is discussed in detail, followed by a summary of some other acceleration techniques. Moreover, we show that in some situations where methods such as Jacobi, Gauss-Seidel, SOR, and conjugate gradient diverge, our proposed method is applicable, and the acquired results are illustrate...
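
The abstract does not reproduce the algorithm, so the sketch below shows only the classical Chebyshev acceleration loop for a crisp symmetric positive definite system with known spectral bounds; the extension to fuzzy linear systems and the comparisons mentioned above are not reproduced, and the test matrix is an assumption.

```python
# Hedged sketch of plain Chebyshev acceleration for A x = b with known
# bounds [lmin, lmax] on the spectrum of a symmetric positive definite A.
import numpy as np

def chebyshev_solve(A, b, lmin, lmax, tol=1e-10, maxit=500):
    theta = (lmax + lmin) / 2.0          # center of the spectrum
    delta = (lmax - lmin) / 2.0          # half-width of the spectrum
    sigma = theta / delta
    rho = 1.0 / sigma
    x = np.zeros_like(b)
    r = b - A @ x
    d = r / theta                        # first correction
    for _ in range(maxit):
        x = x + d
        r = r - A @ d
        if np.linalg.norm(r) < tol:
            break
        rho_new = 1.0 / (2.0 * sigma - rho)
        d = rho_new * rho * d + (2.0 * rho_new / delta) * r
        rho = rho_new
    return x

rng = np.random.default_rng(2)
n = 40
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)              # symmetric positive definite
b = rng.standard_normal(n)
eigs = np.linalg.eigvalsh(A)
x = chebyshev_solve(A, b, eigs[0], eigs[-1])
print(np.linalg.norm(A @ x - b))
```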

2005
William W. Hager Hongchao Zhang

This paper reviews the development of different versions of nonlinear conjugate gradient methods, with special attention given to global convergence properties.

Journal: SIAM Journal on Optimization 2013
Yu-Hong Dai Cai-Xia Kou

In this paper, we seek the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method and propose a family of conjugate gradient methods for unconstrained optimization. An improved Wolfe line search is also proposed, which can avoid a numerical drawback of the Wolfe line search and guarantee the global convergence of the conjugate gradient method under mild condi...
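To illustrate the underlying idea (without reproducing the paper's closed-form family or its improved Wolfe line search), the sketch below builds a scaled memoryless BFGS direction explicitly and then picks the beta that makes -g_{k+1} + beta * d_k closest to it in the Euclidean norm. The toy one-iteration data and the self-scaling choice tau_k = s_k^T y_k / s_k^T s_k are assumptions and may differ in detail from the paper's derivation.

```python
# Hedged illustration: project the scaled memoryless BFGS direction onto
# the family of conjugate gradient directions -g_new + beta * d.
import numpy as np

def scaled_memoryless_bfgs_direction(g_new, s, y, tau):
    # H = (1/tau) (I - s y^T/rho)(I - y s^T/rho) + s s^T/rho,  rho = y^T s
    n = g_new.size
    rho = y @ s
    I = np.eye(n)
    H = ((1.0 / tau) * (I - np.outer(s, y) / rho) @ (I - np.outer(y, s) / rho)
         + np.outer(s, s) / rho)
    return -H @ g_new

def closest_cg_beta(g_new, d, d_bfgs):
    # beta minimizing || (-g_new + beta * d) - d_bfgs ||_2
    return d @ (d_bfgs + g_new) / (d @ d)

# toy quadratic model supplying one iteration's worth of data
rng = np.random.default_rng(3)
n = 10
M = rng.standard_normal((n, n))
Q = M @ M.T + np.eye(n)                   # SPD Hessian of 0.5 x^T Q x
x_old = rng.standard_normal(n)
g_old = Q @ x_old                          # gradient at x_old
d = -g_old                                 # previous search direction
alpha = 0.1                                # previous step length
x_new = x_old + alpha * d
g_new = Q @ x_new
s = x_new - x_old                          # s_k = x_{k+1} - x_k
y = g_new - g_old                          # y_k = g_{k+1} - g_k  (= Q s, so y^T s > 0)
tau = (s @ y) / (s @ s)                    # one common self-scaling choice

d_bfgs = scaled_memoryless_bfgs_direction(g_new, s, y, tau)
beta = closest_cg_beta(g_new, d, d_bfgs)
d_next = -g_new + beta * d
print(beta, np.linalg.norm(d_next - d_bfgs))
```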

[Chart: number of search results per year]