Search results for: conjugate gradient methods

Number of results: 2006107

Journal: Mathematical Foundations of Computing, 2023

Conjugate gradient methods are among the most efficient methods for solving optimization problems. In this paper, a newly proposed conjugate gradient method is presented as a convex combination of the Hager-Zhang and Dai-Yuan nonlinear methods, which is capable of producing the sufficient descent condition with global convergence properties under the strong Wolfe conditions. The numerical results demonstrate efficiency on some benchmark...
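The hybrid method in this abstract modifies the direction update of the basic conjugate gradient iteration. As context for that, here is a minimal sketch of the classical linear conjugate gradient solver for a symmetric positive definite system; it is an illustration of the underlying iteration only, not the paper's proposed method, and the test matrix is an assumption for the example.

```python
def cg_solve(A, b, tol=1e-10, max_iter=100):
    """Minimal conjugate gradient for a symmetric positive definite A.

    A is a list of rows, b a list. Illustrative sketch only.
    """
    n = len(b)
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    matvec = lambda M, v: [dot(row, v) for row in M]

    x = [0.0] * n
    r = b[:]                  # residual r = b - A x (x = 0 initially)
    d = r[:]                  # first search direction: steepest descent
    rs_old = dot(r, r)
    for _ in range(max_iter):
        Ad = matvec(A, d)
        alpha = rs_old / dot(d, Ad)          # exact step length along d
        x = [xi + alpha * di for xi, di in zip(x, d)]
        r = [ri - alpha * adi for ri, adi in zip(r, Ad)]
        rs_new = dot(r, r)
        if rs_new < tol ** 2:
            break
        beta = rs_new / rs_old               # Fletcher-Reeves-type beta
        d = [ri + beta * di for ri, di in zip(r, d)]
        rs_old = rs_new
    return x

# Example 2x2 SPD system; the exact solution is [1/11, 7/11].
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = cg_solve(A, b)
```

Hybrid methods such as the one described above keep this outer loop and replace the single beta formula with a convex combination of two formulas (here Hager-Zhang and Dai-Yuan).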

2017
Mohammad Emtiyaz Khan, Wu Lin

Variational inference is computationally challenging in models that contain both conjugate and non-conjugate terms. Methods specifically designed for conjugate models, even though computationally efficient, find it difficult to deal with non-conjugate terms. On the other hand, stochastic-gradient methods can handle the non-conjugate terms, but they usually ignore the conjugate structure of the mo...

Journal: J. Computational Applied Mathematics, 2010
Neculai Andrei

New accelerated nonlinear conjugate gradient algorithms, which are mainly modifications of Dai and Yuan's method for unconstrained optimization, are proposed. With an exact line search, the algorithm reduces to the Dai and Yuan conjugate gradient computational scheme. For an inexact line search, the algorithm satisfies the sufficient descent condition. Since the step lengths in conjugate gradient algo...

Journal: Computational Methods for Differential Equations
Reza Khoshsiar Ghaziani (Shahrekord University), Mojtaba Fardi (Shahrekord University), Mehdi Ghasemi (Shahrekord University)

This study develops and analyzes preconditioned Krylov subspace methods to solve linear systems arising from the discretization of time-independent space-fractional models. First, we apply shifted Grünwald formulas to obtain a stable finite difference approximation to fractional advection-diffusion equations. Then, we employ two preconditioned iterative methods, namely, the preconditioned gene...

2017
ERIN C. CARSON

Algebraic solvers based on preconditioned Krylov subspace methods are among the most powerful tools for large scale numerical computations in applied mathematics, sciences, technology, as well as in emerging applications in social sciences. As the name suggests, Krylov subspace methods can be viewed as a sequence of projections onto nested subspaces of increasing dimension. They are therefore b...

Journal: Numerical Lin. Alg. with Applic., 2003
Yu-Hong Dai, José Mario Martínez, Jin Yun Yuan

The search direction in unconstrained minimization algorithms for large-scale problems is usually computed as an iterate of the (preconditioned) conjugate gradient method applied to the minimization of a local quadratic model. In line-search procedures this direction is required to satisfy an angle condition, which states that the angle between the negative gradient at the current point and the d...
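The angle condition mentioned in this abstract can be made concrete: the cosine of the angle between the search direction d and the negative gradient -g must stay above some threshold δ, which keeps d a genuine descent direction. A small sketch (the threshold value 0.01 is an assumption for illustration):

```python
import math

def satisfies_angle_condition(g, d, delta=0.01):
    """Check cos(angle(-g, d)) >= delta for gradient g and direction d.

    Guarantees d is a descent direction not too close to orthogonal
    to the steepest-descent direction. delta in (0, 1) is illustrative.
    """
    dot = sum(gi * di for gi, di in zip(g, d))
    norm_g = math.sqrt(sum(gi * gi for gi in g))
    norm_d = math.sqrt(sum(di * di for di in d))
    return -dot / (norm_g * norm_d) >= delta

g = [2.0, -1.0]
ok_steepest = satisfies_angle_condition(g, [-2.0, 1.0])  # d = -g: cosine is 1
ok_orthogonal = satisfies_angle_condition(g, [1.0, 2.0])  # cosine is 0
```

When the condition fails, line-search frameworks typically restart with the steepest-descent direction d = -g, for which the cosine equals 1.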

2015
Sukanya M

ABSTRACT: We propose a new algorithm for removing fast motion blur from an image. The term "iterative method" refers to a wide range of techniques that use successive approximations to obtain more accurate solutions. In this paper we attempt to solve systems of linear equations of the form ...

2013
Xiangge Li, Ye Duan, Jianlin Cheng, Zhihai He

[Table-of-contents excerpt] Chapter 1. Introduction; Chapter 2. Background; 2.1. Matrix Compu...

2003
Diederik R. Fokkema, Gerard L.G. Sleijpen, Henk A. Van der Vorst

The Conjugate Gradient Squared (CGS) method is an iterative method for solving nonsymmetric linear systems of equations. However, during the iteration large residual norms may appear, which may lead to inaccurate approximate solutions or may even deteriorate the convergence rate. Instead of squaring the Bi-CG polynomial as in CGS, we propose to consider products of two nearby Bi-CG polynomials which l...
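For reference, the CGS iteration this abstract builds on can be sketched as follows. This is a minimal dense-matrix version without the breakdown safeguards a production solver needs, and the test system is an assumption for illustration:

```python
def cgs_solve(A, b, tol=1e-12, max_iter=50):
    """Conjugate Gradient Squared for a (possibly nonsymmetric) A x = b.

    Minimal sketch: A is a list of rows, b a list; no breakdown checks.
    """
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    matvec = lambda M, v: [dot(row, v) for row in M]
    axpy = lambda a, u, v: [a * ui + vi for ui, vi in zip(u, v)]  # a*u + v

    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual for x = 0
    r_star = r[:]                 # fixed shadow residual of Bi-CG
    u = r[:]
    p = r[:]
    rho = dot(r, r_star)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rho / dot(Ap, r_star)
        q = axpy(-alpha, Ap, u)               # q = u - alpha * A p
        uq = [ui + qi for ui, qi in zip(u, q)]
        x = axpy(alpha, uq, x)                # x += alpha * (u + q)
        r = axpy(-alpha, matvec(A, uq), r)    # r -= alpha * A (u + q)
        if dot(r, r) < tol ** 2:
            break
        rho_new = dot(r, r_star)
        beta = rho_new / rho
        u = axpy(beta, q, r)                  # u = r + beta * q
        p = axpy(beta, axpy(beta, p, q), u)   # p = u + beta*(q + beta*p)
        rho = rho_new
    return x

# Nonsymmetric 2x2 example; the exact solution is [1.0, 2.0].
A = [[3.0, 1.0], [0.0, 2.0]]
b = [5.0, 4.0]
x = cgs_solve(A, b)
```

The squared residual polynomial is what makes CGS fast but also what amplifies intermediate residual norms; the product-of-polynomials idea in the abstract (the basis of BiCGSTAB-like methods) smooths exactly this behavior.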

2015
Peter Sonneveld

Krylov subspace methods have been applied successfully to solve various problems in Numerical Linear Algebra. The Netherlands have been a pioneer country in the development of Krylov methods over the past years. Methods like the Conjugate Gradient Squared (CGS), Bi-Conjugate Gradient Stabilized (BiCGSTAB), Nested GMRES (GMRESR), and the Induced Dimension Reduction method (IDR) are examples of K...
