Search results for: sufficient descent directions

Number of results: 286,567

2015
Xiaoli Sun

In this paper, a new spectral PRP conjugate gradient method is proposed, which always generates sufficient descent directions without any line search. Under standard conditions, global convergence of the proposed method is established when the standard Armijo or weak Wolfe line search is used. Moreover, we extend these results to the HS method. Numerical comparisons are reported with som...
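
The sufficient descent property claimed here can be illustrated with a small sketch. The snippet does not show the paper's exact update, so the sketch below uses a classical construction from the same family, the three-term PRP direction (Zhang-Zhou-Li style), which satisfies g_k^T d_k = -||g_k||^2 by construction, independently of any line search:

```python
import numpy as np

def three_term_prp_direction(g, g_prev, d_prev):
    """Three-term PRP search direction:
        d_k = -g_k + beta_k * d_{k-1} - theta_k * y_{k-1},
    with beta_k  = g_k^T y_{k-1} / ||g_{k-1}||^2 and
         theta_k = g_k^T d_{k-1} / ||g_{k-1}||^2.
    The beta and theta terms cancel in g_k^T d_k, so
    g_k^T d_k = -||g_k||^2 holds regardless of the line search."""
    y = g - g_prev
    denom = g_prev @ g_prev
    beta = (g @ y) / denom
    theta = (g @ d_prev) / denom
    return -g + beta * d_prev - theta * y
```

Because the identity holds algebraically, the sufficient descent condition g_k^T d_k <= -c * ||g_k||^2 is satisfied with c = 1 at every iterate.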

Journal: Appl. Math. Lett. 2008
Neculai Andrei

A modification of the Dai-Yuan conjugate gradient algorithm is proposed. With an exact line search, the algorithm reduces to the original version of the Dai and Yuan computational scheme. With inexact line searches, the algorithm satisfies both the sufficient descent and the conjugacy conditions. A global convergence result is proved when the Wolfe line search conditions are used. Computational result...
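
The reduction to the original Dai-Yuan scheme under exact line searches can be checked numerically. A small sketch for the standard DY parameter (Andrei's modified formula is not given in the snippet, so it is not reproduced): on a quadratic minimized with exact line searches, beta_k^DY = ||g_k||^2 / (d_{k-1}^T y_{k-1}) coincides with the Fletcher-Reeves value ||g_k||^2 / ||g_{k-1}||^2, since exact line searches force g_k^T d_{k-1} = 0:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # SPD Hessian of f(x) = 0.5 x^T A x - b^T x
b = rng.standard_normal(n)

x = np.zeros(n)
g = A @ x - b                      # gradient of the quadratic
d = -g
pairs = []
for _ in range(5):
    alpha = -(g @ d) / (d @ A @ d)                  # exact line search along d
    x = x + alpha * d
    g_new = A @ x - b
    beta_dy = (g_new @ g_new) / (d @ (g_new - g))   # Dai-Yuan parameter
    beta_fr = (g_new @ g_new) / (g @ g)             # Fletcher-Reeves parameter
    pairs.append((beta_dy, beta_fr))
    d = -g_new + beta_dy * d
    g = g_new
```

Every recorded pair agrees, which is exactly the exact-line-search degeneracy the abstract refers to.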

2008
R.Dennis Cook Liliana Forzani

We obtain the maximum likelihood estimator of the central subspace under conditional normality of the predictors given the response. Analytically and in simulations, we found that our new estimator can perform much better than sliced inverse regression, sliced average variance estimation, and directional regression, and that it seems quite robust to deviations from normality.

2012
Zhenghui Feng Xuerong Meggie Wen Zhou Yu Lixing Zhu

Partial dimension reduction is a general method to seek informative convex combinations of predictors of primary interest, which includes dimension reduction as its special case when the predictors in the remaining part are constants. In this paper, we propose a novel method to conduct partial dimension reduction estimation for predictors of primary interest without assuming that the remaining ...

Journal: Algorithms 2021

The conjugate gradient method is one of the most popular methods for solving large-scale unconstrained optimization problems, since it does not require second derivatives, as Newton's method or its approximations do. Moreover, it can be applied in many fields such as neural networks, image restoration, etc. Many complicated methods with two or three terms have been proposed for such problems. In this paper, we propose a simple, easy, efficien...
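
As the abstract notes, conjugate gradient methods need only first derivatives. For context, a minimal sketch of a generic nonlinear CG loop, using a PRP+ update and a backtracking Armijo line search (illustrative only; this is not the method proposed in the paper):

```python
import numpy as np

def nonlinear_cg(f, grad, x0, iters=200, c1=1e-4):
    """Minimal PRP+ nonlinear conjugate gradient with Armijo backtracking.
    Only f and its gradient are needed -- no Hessian."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-8:
            break
        fx, gd = f(x), g @ d
        if gd >= 0:                       # safeguard: restart on non-descent
            d, gd = -g, -(g @ g)
        t = 1.0
        while f(x + t * d) > fx + c1 * t * gd:   # Armijo backtracking
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PRP+ parameter
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a small strongly convex quadratic this recovers the minimizer to high accuracy in well under the iteration budget.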

Journal: Categories and General Algebraic Structures with Applications
Maurice Kianpi, Laboratory of Algebra, Geometry and Applications, Department of Mathematics, Faculty of Science, University of Yaounde 1, P.O. Box 812, Yaounde, Republic of Cameroon

We find a criterion for a morphism of coalgebras over a Barr-exact category to be effective descent and determine (effective) descent morphisms for coalgebras over toposes in some cases. Also, we study some exactness properties of endofunctors of arbitrary categories in connection with natural transformations between them, as well as those of functors that these transformations induce between co...

Journal: CoRR 2018
Fanhua Shang Yuanyuan Liu Kaiwen Zhou James Cheng Kelvin K. W. Ng Yuichi Yoshida

In this paper, we propose a novel sufficient decrease technique for stochastic variance reduced gradient descent methods such as SVRG and SAGA. In order to make sufficient decrease for stochastic optimization, we design a new sufficient decrease criterion, which yields sufficient decrease versions of stochastic variance reduction algorithms such as SVRG-SD and SAGA-SD as a byproduct. We introdu...
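
For context, the SVRG estimator that SVRG-SD builds on replaces the plain stochastic gradient with a variance-reduced one anchored at a periodic full-gradient snapshot. A minimal sketch on a least-squares problem (the paper's sufficient decrease criterion itself is not reproduced here, and the step size and epoch counts below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 50, 5
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true                      # noiseless targets, so the optimal loss is 0

def loss(w):
    r = X @ w - y
    return 0.5 * np.mean(r ** 2)

def grad_i(w, i):
    """Gradient of the i-th component loss 0.5*(x_i^T w - y_i)^2."""
    return (X[i] @ w - y[i]) * X[i]

w = np.zeros(d)
eta = 0.02
for epoch in range(20):
    w_snap = w.copy()
    full = X.T @ (X @ w_snap - y) / n           # full gradient at the snapshot
    for _ in range(n):
        i = rng.integers(n)
        # variance-reduced gradient: unbiased, and its variance vanishes
        # as both w and w_snap approach the optimum
        v = grad_i(w, i) - grad_i(w_snap, i) + full
        w -= eta * v
```

The correction term grad_i(w_snap, i) - full has zero mean over i, so v stays unbiased while its variance shrinks near the solution, which is what permits a constant step size.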

Journal: Mathematical Foundations of Computing 2023

Conjugate gradient methods are among the most efficient for solving optimization models. In this paper, a newly proposed conjugate gradient method for unconstrained optimization problems is presented as a convex combination of the Hager-Zhang and Dai-Yuan nonlinear methods, which is capable of producing the sufficient descent condition with global convergence properties under the strong Wolfe conditions. The numerical results demonstrate efficiency on some benchmark...
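
A convex-combination direction of the kind described can be sketched with the standard Hager-Zhang and Dai-Yuan parameters. The paper's rule for choosing the mixing weight is not shown in the snippet, so a fixed theta is used purely for illustration:

```python
import numpy as np

def beta_hz(g_new, g, d):
    """Hager-Zhang parameter:
    beta = (y - 2 d ||y||^2 / d^T y)^T g_new / d^T y, with y = g_new - g."""
    y = g_new - g
    dy = d @ y
    return (y - 2.0 * d * (y @ y) / dy) @ g_new / dy

def beta_dy(g_new, g, d):
    """Dai-Yuan parameter: beta = ||g_new||^2 / d^T y."""
    y = g_new - g
    return (g_new @ g_new) / (d @ y)

def hybrid_direction(g_new, g, d, theta=0.5):
    """d_{k+1} = -g_{k+1} + ((1 - theta) * beta_HZ + theta * beta_DY) * d_k,
    a convex combination of the two parameters (theta fixed here;
    the paper's adaptive choice of theta is not reproduced)."""
    beta = (1.0 - theta) * beta_hz(g_new, g, d) + theta * beta_dy(g_new, g, d)
    return -g_new + beta * d
```

By construction the hybrid beta always lies between the HZ and DY values, which is what lets the combined method inherit properties of both parents.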

2016
Hui Zhang

The subject of linear convergence of gradient-type methods on non-strongly convex optimization has been widely studied by introducing several notions as sufficient conditions. Influential examples include the error bound property, the restricted strong convexity property, the quadratic growth property, and the Kurdyka-Łojasiewicz property. In this paper, we first define a group of error bound con...
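
The linear (geometric) convergence these conditions guarantee is easy to see in the simplest setting. A sketch for gradient descent with step size 1/L on a strongly convex quadratic (strong convexity implies every condition in the list above), where the objective gap contracts by at least the factor 1 - mu/L per step:

```python
import numpy as np

mu, L = 1.0, 10.0
eigs = np.linspace(mu, L, 6)
A = np.diag(eigs)                  # f(x) = 0.5 x^T A x, minimizer x* = 0, f* = 0

f = lambda z: 0.5 * z @ (A @ z)
x = np.ones(6)
gaps = [f(x)]                      # objective gaps f(x_k) - f*
for _ in range(30):
    x = x - (1.0 / L) * (A @ x)    # gradient step with step size 1/L
    gaps.append(f(x))
```

Each eigencomponent x_i is scaled by (1 - eig_i/L) per step, and (1 - eig_i/L)^2 <= 1 - mu/L for mu <= eig_i <= L, so the gap sequence decays geometrically with rate at most 1 - mu/L.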
