Search results for: sufficient descent condition
Number of results: 490,700
In this paper, a new conjugate gradient method is proposed by applying Powell's symmetrical technique to conjugate gradient methods; it satisfies the sufficient descent property for any line search. Under Wolfe line searches, the global convergence of the method is derived from the spectral analysis of the conjugate gradient iteration matrix and Zoutendijk's condition. Based on this, two con...
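For reference, the sufficient descent property and the Wolfe line-search conditions invoked in this abstract are the standard ones; the constant $c$ is method-dependent:
$$ g_k^{\top} d_k \le -c\,\|g_k\|^2 \quad \text{for some fixed } c>0 \text{ and all } k, $$
$$ f(x_k+\alpha_k d_k) \le f(x_k)+c_1\,\alpha_k\, g_k^{\top}d_k, \qquad g(x_k+\alpha_k d_k)^{\top}d_k \ge c_2\, g_k^{\top}d_k, \qquad 0<c_1<c_2<1. $$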
Cubic-regularization and trust-region methods with worst-case first-order complexity O(ε−3/2) and worst-case second-order complexity O(ε−3) have been developed in the last few years. In this paper it is proved that the same complexities are achieved by means of a quadratic-regularization method with a cubic sufficient-descent condition instead of the more usual predicted-reduction based descent...
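A minimal first-order sketch of the acceptance mechanism described above, assuming a pure gradient model (the paper's method uses richer models to reach the stated complexities; the names quad_reg_step, sigma, and alpha are illustrative, not the paper's):

import numpy as np

def quad_reg_step(f, grad, x, sigma, alpha=0.1):
    # One quadratic-regularization step with a cubic sufficient-descent
    # acceptance test: accept s only if the actual decrease in f is at
    # least alpha * ||s||^3 (constants here are illustrative assumptions).
    g = grad(x)
    for _ in range(60):                  # bounded rejection loop
        s = -g / sigma                   # minimizer of g.T s + (sigma/2)*||s||^2
        if f(x) - f(x + s) >= alpha * np.linalg.norm(s) ** 3:
            return x + s, max(sigma / 2.0, 1e-8)  # accept; relax regularization
        sigma *= 2.0                     # reject; strengthen regularization
    return x, sigma                      # no acceptable step found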
The classic equation for decomposing the wavefront aberrations of axis-symmetrical optical systems has the form
$$ W(h_0,\rho,\phi)=\sum_{j=0}^{\infty}\sum_{p=0}^{\infty}\sum_{m=0}^{\infty} C_{(2j+m)(2p+m)m}\,(h_0)^{2j+m}\,(\rho)^{2p+m}\,(\cos\phi)^m, $$
where $j$, $p$ and $m$ are non-negative integers, $\rho$ and $\phi$ are the polar coordinates of the pupil, and $h_0$ is the object height. However, o...
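A small sketch evaluating a truncated version of this expansion, directly implementing the reconstructed formula (the helper name wavefront and the dictionary layout of the coefficients are illustrative assumptions):

import numpy as np

def wavefront(coeffs, h0, rho, phi):
    # W = sum over (j, p, m) of C_{(2j+m)(2p+m)m} * h0^(2j+m) * rho^(2p+m) * cos(phi)^m
    return sum(c * h0 ** (2 * j + m) * rho ** (2 * p + m) * np.cos(phi) ** m
               for (j, p, m), c in coeffs.items())

# e.g. the primary spherical aberration coefficient C_040 corresponds to
# j = 0, p = 2, m = 0 (indices 2j+m = 0, 2p+m = 4, m = 0)
W = wavefront({(0, 2, 0): 0.5}, h0=1.0, rho=0.7, phi=0.0)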
In this paper, we derive a search direction for the conjugate gradient method based on the use of a self-scaling quasi-Newton method; the new direction is useful for solving unconstrained optimization problems with large dimensions. In order to clarify the importance of the proposed method, we show its characteristics in terms of the sufficient descent condition and the theoretical global convergence condition. Numerically, we applied var...
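The abstract does not give the direction formula itself, so the sketch below only illustrates the surrounding machinery: a generic nonlinear CG loop that checks the sufficient descent condition at every iteration and falls back to steepest descent when it fails (backtracking, cg_descent_safeguard, beta_rule, and the safeguard itself are assumptions; a direction that satisfies the condition by construction never triggers the restart):

import numpy as np

def backtracking(f, grad, x, d, alpha=1.0, c1=1e-4, shrink=0.5):
    # simple Armijo backtracking line search (stand-in for a Wolfe search)
    fx, slope = f(x), grad(x) @ d
    while f(x + alpha * d) > fx + c1 * alpha * slope:
        alpha *= shrink
    return alpha

def cg_descent_safeguard(f, grad, x, beta_rule, c=1e-4, tol=1e-6, max_iter=500):
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking(f, grad, x, d)
        x, g_old = x + alpha * d, g
        g = grad(x)
        d = -g + beta_rule(g_old, g, d) * d
        if g @ d > -c * (g @ g):   # sufficient descent condition violated
            d = -g                 # restart with steepest descent
    return x

# usage: Fletcher-Reeves rule on a simple quadratic
fr = lambda g_old, g, d: (g @ g) / (g_old @ g_old)
x_star = cg_descent_safeguard(lambda x: x @ x, lambda x: 2 * x,
                              np.array([3.0, -4.0]), fr)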
Conjugate gradient methods are among the most efficient methods for solving optimization models. In this paper, a newly proposed conjugate gradient method is presented for such problems as a convex combination of the Hager-Zhang and Dai-Yuan nonlinear methods, which is capable of producing the sufficient descent condition with global convergence properties under the strong Wolfe conditions. The numerical results demonstrate its efficiency on some benchmark...
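For context, the two ingredient parameters are the standard Dai-Yuan and Hager-Zhang updates, with $y_k=g_{k+1}-g_k$; the rule for choosing the combination weight $\theta_k\in[0,1]$ is the paper's contribution and is not reproduced here:
$$ \beta_k^{DY}=\frac{\|g_{k+1}\|^2}{d_k^{\top}y_k}, \qquad \beta_k^{HZ}=\frac{1}{d_k^{\top}y_k}\left(y_k-2\,d_k\,\frac{\|y_k\|^2}{d_k^{\top}y_k}\right)^{\top} g_{k+1}, \qquad \beta_k=(1-\theta_k)\,\beta_k^{DY}+\theta_k\,\beta_k^{HZ}. $$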