Search results for: conjugate gradient methods

Number of results: 2,006,107

Journal: IEEE Transactions on Image Processing, 1999
Jeffrey A. Fessler, Scott D. Booth

Gradient-based iterative methods often converge slowly for tomographic image reconstruction and image restoration problems, but can be accelerated by suitable preconditioners. Diagonal preconditioners offer some improvement in convergence rate, but do not incorporate the structure of the Hessian matrices in imaging problems. Circulant preconditioners can provide remarkable acceleration for inve...
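As a hedged illustration of the preconditioned CG iteration this abstract discusses: the sketch below applies the preconditioner in the Fourier domain, which is what makes circulant preconditioners attractive. The test system, the function names, and the choice of an exact circulant solve are illustrative assumptions, not taken from the paper (where the Hessian is generally not circulant and the circulant M only approximates it).

import numpy as np

def pcg(matvec, b, apply_Minv, tol=1e-8, max_iter=200):
    # Preconditioned conjugate gradient for a symmetric positive definite system.
    x = np.zeros_like(b)
    r = b - matvec(x)
    z = apply_Minv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = apply_Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Illustrative circulant test problem: A = I + 10 * (periodic second difference).
n = 256
stencil = np.zeros(n)
stencil[0], stencil[1], stencil[-1] = 2.0, -1.0, -1.0
eig = 1.0 + 10.0 * np.fft.fft(stencil).real           # eigenvalues of circulant A
matvec = lambda x: np.fft.ifft(eig * np.fft.fft(x)).real
apply_Minv = lambda r: np.fft.ifft(np.fft.fft(r) / eig).real
# Here A is itself circulant, so the "preconditioner" is exact and PCG converges
# in one step; in imaging problems the circulant M only approximates the Hessian.
b = np.random.default_rng(0).standard_normal(n)
x = pcg(matvec, b, apply_Minv)
print(np.linalg.norm(matvec(x) - b))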

Journal: Math. Meth. of OR, 2017
Mehiddin Al-Baali, Andrea Caliciotti, Giovanni Fasano, Massimo Roma

In this paper we propose the use of damped techniques within Nonlinear Conjugate Gradient (NCG) methods. Damped techniques were introduced by Powell and recently re-proposed by Al-Baali; until now, they have only been applied in the framework of quasi-Newton methods. We extend their use to NCG methods in large-scale unconstrained optimization, aiming at possibly improving the efficiency and the robustness of...
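For concreteness, here is a minimal sketch of Powell's damping rule, the technique the abstract says is transferred from the quasi-Newton setting to NCG. The constant c = 0.2, the identity choice for B in the example, and the Hestenes-Stiefel remark are assumptions for illustration, not the paper's specific scheme.

import numpy as np

def powell_damped_y(s, y, Bs, c=0.2):
    # Powell's damping: blend y with B s so that s^T y_damped >= c * s^T B s > 0.
    sBs = s @ Bs
    sy = s @ y
    if sy >= c * sBs:
        return y
    theta = (1.0 - c) * sBs / (sBs - sy)
    return theta * y + (1.0 - theta) * Bs

# Example with B = I: the damped pair keeps the curvature s^T y positive, so a
# CG parameter such as Hestenes-Stiefel beta = g1 @ yd / (d @ yd) stays well defined.
s = np.array([1.0, 0.0])
y = np.array([-0.5, 1.0])                  # s @ y < 0: negative curvature
yd = powell_damped_y(s, y, Bs=s)           # B = I  =>  B s = s
print(s @ y, s @ yd)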

Journal: J. Comput. Physics, 2009
Jianke Yang

In this paper, Newton-conjugate-gradient methods are developed for solitary wave computations. These methods are based on Newton iterations, coupled with conjugate-gradient iterations to solve the resulting linear Newton-correction equation. When the linearization operator is self-adjoint, the preconditioned conjugate-gradient method is proposed to solve this linear equation. If the lineariz...
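A minimal Newton-CG sketch under the stated assumption that the linearization is self-adjoint: each Newton correction is obtained by plain CG from Jacobian-vector products. The cubic test system below is an illustrative stand-in, not the paper's solitary-wave equations, and no preconditioning or globalization is shown.

import numpy as np

def cg(matvec, b, tol=1e-10, max_iter=500):
    # Plain conjugate gradient for a symmetric positive definite operator.
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rr = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rr / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rr_new = r @ r
        if np.sqrt(rr_new) < tol:
            break
        p = r + (rr_new / rr) * p
        rr = rr_new
    return x

def newton_cg(F, Jv, x0, tol=1e-10, max_newton=50):
    # Newton iterations; each correction solves J(x) d = -F(x) by CG,
    # which is valid when the linearization J is self-adjoint.
    x = x0
    for _ in range(max_newton):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        x = x + cg(lambda v: Jv(x, v), -Fx)
    return x

# Illustrative symmetric problem (not the paper's wave equation):
# F(x) = A x + x^3 - b with SPD A; J(x) v = A v + 3 x^2 * v is self-adjoint.
rng = np.random.default_rng(1)
n = 50
Q = rng.standard_normal((n, n))
A = Q @ Q.T / n + np.eye(n)
b = rng.standard_normal(n)
F = lambda x: A @ x + x**3 - b
Jv = lambda x, v: A @ v + 3 * x**2 * v
x = newton_cg(F, Jv, np.zeros(n))
print(np.linalg.norm(F(x)))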

Journal: Math. Comput., 2003
Yu-Hong Dai

Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems. Recently, they have been much studied. This paper proposes a three-parameter family of hybrid conjugate gradient methods. Two important features of the family are that (i) it can avoid the propensity of small steps, namely, if a small step is generated away from the ...
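Dai's three-parameter family is not reproduced here, but the sketch below shows the general shape of a hybrid NCG method, using a classical two-formula hybrid (a Touati-Ahmed/Storey-style clip of beta_PRP by beta_FR) of the kind such families generalize. The line-search constants and the descent safeguard are illustrative assumptions.

import numpy as np

def hybrid_beta(g_new, g_old):
    # Classical hybrid: beta = max(0, min(beta_PRP, beta_FR)).
    gg = g_old @ g_old
    beta_fr = (g_new @ g_new) / gg
    beta_prp = (g_new @ (g_new - g_old)) / gg
    return max(0.0, min(beta_prp, beta_fr))

def ncg(f, grad, x, iters=2000):
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-8:
            break
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5                        # Armijo backtracking line search
        x_new = x + t * d
        g_new = grad(x_new)
        d = -g_new + hybrid_beta(g_new, g) * d
        if g_new @ d >= 0:                  # safeguard: fall back to steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x

rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_g = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                              200 * (x[1] - x[0]**2)])
print(ncg(rosen, rosen_g, np.array([-1.2, 1.0])))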

2014
A. Edelman, S. T. Smith

Numerical analysts, physicists, and signal processing engineers have proposed algorithms that might be called conjugate gradient for problems associated with the computation of eigenvalues. There are many variations, mostly one eigenvalue at a time, though sometimes block algorithms are proposed. Is there a correct conjugate gradient algorithm for the eigenvalue problem? How are the algorithms rela...
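One concrete answer to "what might conjugate gradient mean for eigenvalues" is a locally optimal CG recursion for the Rayleigh quotient, one eigenvalue at a time, in the spirit of Knyazev's LOBPCG with block size one. The sketch below is an assumption-laden illustration of that idea, not an algorithm taken from this survey.

import numpy as np

def rayleigh_cg(A, x, iters=200):
    # Each step minimizes the Rayleigh quotient exactly over span{x, -g, p}
    # via a small projected eigenproblem; p carries the CG-like history.
    x = x / np.linalg.norm(x)
    p = None
    for _ in range(iters):
        rho = x @ (A @ x)
        g = A @ x - rho * x                 # gradient of the Rayleigh quotient (up to 2)
        if np.linalg.norm(g) < 1e-10:
            break
        basis = [x, -g] if p is None else [x, -g, p]
        V, _ = np.linalg.qr(np.column_stack(basis))
        w, U = np.linalg.eigh(V.T @ A @ V)  # projected eigenproblem (2x2 or 3x3)
        x_new = V @ U[:, 0]
        p = x_new - x * (x @ x_new)         # component orthogonal to the old iterate
        x = x_new / np.linalg.norm(x_new)
    return rho, x

rng = np.random.default_rng(2)
B = rng.standard_normal((100, 100))
A = (B + B.T) / 2
rho, x = rayleigh_cg(A, rng.standard_normal(100))
print(rho, np.linalg.eigvalsh(A)[0])        # computed vs. exact smallest eigenvalue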

Journal: Parallel Computing, 1990
Luigi Brugnano, M. Marrone

Block preconditioned conjugate gradient methods are very effective for solving the linear systems arising from the discretization of elliptic PDEs. Nevertheless, the solution of the linear system Ms = r, required to obtain the preconditioned residual, is a bottleneck on vector processors. In this paper, we show how to modify the algorithm in order to obtain better performance on such computers. Numerica...
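To make the bottleneck concrete: applying M^{-1} means one solve Ms = r per PCG iteration. The sketch below uses a block-Jacobi M, whose application reduces to independent small matvecs and so vectorizes well; this stands in for, and is not, the modification the paper actually proposes.

import numpy as np

def block_jacobi_inv(A, nb):
    # Factor the preconditioner once: invert nb equal-sized diagonal blocks of A.
    n = A.shape[0]
    step = n // nb
    return [np.linalg.inv(A[i:i + step, i:i + step]) for i in range(0, n, step)]

def apply_Minv(blocks, r):
    # One preconditioner application: independent block matvecs (vector/parallel
    # friendly), unlike the inherently sequential triangular solves of incomplete
    # factorization preconditioners.
    step = blocks[0].shape[0]
    return np.concatenate([Bi @ r[i * step:(i + 1) * step]
                           for i, Bi in enumerate(blocks)])

rng = np.random.default_rng(3)
C = rng.standard_normal((12, 12))
A = C @ C.T + 12 * np.eye(12)               # illustrative SPD test matrix
blocks = block_jacobi_inv(A, nb=3)
s = apply_Minv(blocks, rng.standard_normal(12))
print(s.shape)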

Journal: Comp. Opt. and Appl., 2016
Massimo Fornasier, Steffen Peter, Holger Rauhut, Stephan Worm

Iteratively Re-weighted Least Squares (IRLS) is a method for solving minimization problems involving non-quadratic cost functions, perhaps non-convex and non-smooth, which, however, can be described as the infimum over a family of quadratic functions. This transformation suggests an algorithmic scheme that solves a sequence of quadratic problems to be tackled efficiently by tools of numerical lin...
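A minimal IRLS sketch for the classic instance, min ||x||_1 subject to Ax = b: each sweep solves one weighted least-squares (quadratic) problem. The dense inner solve, the fixed smoothing eps, and the test sizes are illustrative assumptions; CG-type acceleration of such inner solves, and how the smoothing should be driven, are exactly what the paper studies.

import numpy as np

def irls_l1(A, b, iters=50, eps=1e-6):
    # IRLS for min ||x||_1 s.t. Ax = b with weights w_i = 1/(|x_i| + eps);
    # each quadratic subproblem has the closed form x = D A^T (A D A^T)^{-1} b.
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        d = np.abs(x) + eps                 # D = diag(d) are the inverse weights
        z = np.linalg.solve((A * d) @ A.T, b)
        x = d * (A.T @ z)
    return x

# Sparse recovery test: k-sparse signal from m random measurements.
rng = np.random.default_rng(4)
m, n, k = 40, 100, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x = irls_l1(A, A @ x_true)
print(np.linalg.norm(x - x_true))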

2011
Neculai Andrei

The paper presents some open problems associated with the nonlinear conjugate gradient algorithms for unconstrained optimization. Mainly, these problems concern the initial direction, the conjugacy condition, the step-length computation, new formulas for computing the conjugate gradient parameter based on function values, the influence of the accuracy of the line search procedure, and how we can take the pro...
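Since several of these open problems refer to pieces of the same recursion, here is the standard NCG scheme in textbook notation (a reference point, not material from the paper):

    x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_{k+1} d_k,

with the pure conjugacy condition d_{k+1}^\top y_k = 0, where y_k = g_{k+1} - g_k. The open problems above concern, respectively, the choice of d_0, the form of this condition, the computation of \alpha_k (and how accurately the line search should satisfy it), and the formula for \beta_{k+1}.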

Journal: SIAM Journal on Optimization, 2013
Yu-Hong Dai, Cai-Xia Kou

In this paper, we seek the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method and propose a family of conjugate gradient methods for unconstrained optimization. An improved Wolfe line search is also proposed, which can avoid a numerical drawback of the Wolfe line search and guarantee the global convergence of the conjugate gradient method under mild condi...
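For reference, a sketch of the scaled memoryless BFGS direction d = -H g that the paper's CG directions are designed to stay close to. The self-scaling choice of tau below is a common one assumed here for illustration; the paper's closed-form CG parameter and its improved Wolfe line search are not reproduced.

import numpy as np

def memoryless_bfgs_dir(g, s, y, tau):
    # d = -H g with H the scaled memoryless BFGS matrix built from one pair (s, y):
    # H = tau * (I - s y^T / s^T y)(I - y s^T / s^T y) + s s^T / s^T y.
    sy = s @ y
    gs, gy = g @ s, g @ y
    Hg = tau * (g - (gs / sy) * y
                - ((gy - gs * (y @ y) / sy) / sy) * s) + (gs / sy) * s
    return -Hg

# Check the matrix-free formula against an explicitly formed H.
rng = np.random.default_rng(5)
n = 6
s, y, g = rng.standard_normal((3, n))
if s @ y < 0:
    y = -y                                  # ensure s^T y > 0 so H is positive definite
tau = (s @ y) / (y @ y)                     # a common self-scaling choice (assumption)
sy = s @ y
H = tau * (np.eye(n) - np.outer(s, y) / sy) @ (np.eye(n) - np.outer(y, s) / sy) \
    + np.outer(s, s) / sy
print(np.allclose(-H @ g, memoryless_bfgs_dir(g, s, y, tau)))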

Journal: J. Computational Applied Mathematics, 2010
Saman Babaie-Kafaki, Reza Ghanbari, Nezam Mahdavi-Amiri

Following the approach proposed by Dai and Liao, we introduce two nonlinear conjugate gradient methods for unconstrained optimization problems. One of our proposed methods is based on a modified version of the secant equation proposed by Zhang, Deng and Chen, and Zhang and Xu, and the other is based on the modified BFGS update proposed by Yuan. An interesting feature of our methods is their acco...
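The Dai-Liao parameter that both proposed methods build on is beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k) for a parameter t > 0; the paper's methods replace y_k by modified-secant variants, which are not reproduced here. Below is a direct transcription, with t = 0.1 purely as a placeholder value.

import numpy as np

def dai_liao_beta(g_new, d, s, y, t=0.1):
    # beta = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k); t > 0 is a method parameter.
    return g_new @ (y - t * s) / (d @ y)

# Tiny usage example with made-up iterates.
g_old = np.array([1.0, 0.5])
g_new = np.array([0.3, -1.0])
d = np.array([-1.0, -0.5])                  # previous search direction
s = 0.5 * d                                 # s_k = alpha_k d_k
y = g_new - g_old
print(dai_liao_beta(g_new, d, s, y))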

[Chart: number of search results per year]