The Performance Analysis of a New Modification of Conjugate Gradient Parameter for Unconstrained Optimization Models
Authors
Abstract
The Conjugate Gradient (CG) method is one of the most prominent iterative techniques for the optimization of both linear and nonlinear systems, owing to its simplicity, low memory requirement, low computational cost, and global convergence properties. However, some classical CG methods have drawbacks, including weak convergence and poor numerical performance in terms of the number of iterations and CPU time. To overcome these drawbacks, researchers have proposed new variants of the CG parameter with efficient numerical results and nice convergence properties; these include scaled, hybrid, spectral, and three-term methods, among many others. The conjugate gradient algorithm proposed here belongs to the class of variants mentioned above; an interesting feature of such modifications is that they inherit properties of existing methods. In this paper, we propose a CG parameter that inherits features of the methods of Rivaie et al. (RMIL*) and Dai (RMIL+) and generates a descent direction under the strong Wolfe line search conditions. Preliminary results on benchmark problems show that the proposed method is promising.
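To illustrate the family of methods the abstract describes, the following is a minimal sketch of a nonlinear CG iteration with an RMIL-style parameter. The formula beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||d_k||^2, truncated at zero as in the non-negative RMIL+ variant, is used here as an assumption about the general form; the paper's own modified parameter may differ. A simple backtracking (Armijo) line search stands in for the strong Wolfe search used in the paper; `cg_rmil` is an illustrative name, not from the source.

```python
import numpy as np

def cg_rmil(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG sketch with an RMIL-style parameter (illustrative only).

    beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||d_k||^2, truncated at zero
    (RMIL+-style); backtracking Armijo search replaces the strong Wolfe
    search assumed by the convergence theory in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # first direction is steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search: halve alpha until Armijo holds.
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # RMIL-style parameter, truncated at zero as in RMIL+.
        beta = max(0.0, g_new.dot(g_new - g) / d.dot(d))
        d = -g_new + beta * d   # new search direction
        x, g = x_new, g_new
    return x
```

On a simple convex quadratic, for example f(x) = sum((x - 1)^2) started from the origin, the iteration converges to the minimizer at x = (1, ..., 1).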
Similar Resources
A new hybrid conjugate gradient algorithm for unconstrained optimization
In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. This new method can generate sufficient descent directions unrelated to any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems
In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey ({tt LS}) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independent of line search method, based on eigenvalue analysis. The globa...
An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems
In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem, and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...
New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization
New accelerated nonlinear conjugate gradient algorithms, which are mainly modifications of Dai and Yuan's scheme for unconstrained optimization, are proposed. Under the exact line search, the algorithm reduces to the Dai and Yuan conjugate gradient computational scheme. For inexact line search, the algorithm satisfies the sufficient descent condition. Since the step lengths in conjugate gradient algo...
A New Descent Nonlinear Conjugate Gradient Method for Unconstrained Optimization
In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line search. We use a steplength technique that ensures the Zoutendijk condition holds, and the method is proved to be globally convergent. Finally, we improve it and carry out further analysis.
Journal
Journal title: Mathematics and Statistics
Year: 2021
ISSN: 2332-2144, 2332-2071
DOI: https://doi.org/10.13189/ms.2021.090103