Search results for: sufficient descent directions
Number of results: 286567
In this paper, a new spectral PRP conjugate gradient method is proposed, which always generates a sufficient descent direction without any line search. Under some standard conditions, global convergence of the proposed method is established when the standard Armijo or weak Wolfe line search is used. Moreover, we extend these results to the HS method. Numerical comparisons are reported with som...
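For orientation, the sufficient descent condition and the classical PRP parameter referred to throughout these abstracts are usually stated as follows (the constant c is generic, and the spectral scaling of this particular paper is not reproduced here):

g_k^\top d_k \le -c\,\|g_k\|^2 \;\; (c > 0), \qquad \beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^\top (g_{k+1} - g_k)}{\|g_k\|^2}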
A modification of the Dai-Yuan conjugate gradient algorithm is proposed. Under the exact line search, the algorithm reduces to the original version of the Dai and Yuan computational scheme. For an inexact line search, the algorithm satisfies both the sufficient descent condition and the conjugacy condition. A global convergence result is proved when the Wolfe line search conditions are used. Computational result...
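For reference, the original Dai-Yuan parameter that this modification reduces to under the exact line search is commonly written as (this is the textbook formula, not a line taken from the paper):

\beta_k^{\mathrm{DY}} = \frac{\|g_{k+1}\|^2}{d_k^\top y_k}, \qquad y_k = g_{k+1} - g_k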
We obtain the maximum likelihood estimator of the central subspace under conditional normality of the predictors given the response. Analytically and in simulations we found that our new estimator can perform much better than sliced inverse regression, sliced average variance estimation and directional regression, and that it seems quite robust to deviations from normality.
Partial dimension reduction is a general method to seek informative convex combinations of predictors of primary interest, which includes dimension reduction as its special case when the predictors in the remaining part are constants. In this paper, we propose a novel method to conduct partial dimension reduction estimation for predictors of primary interest without assuming that the remaining ...
The conjugate gradient method is one of the most popular methods for solving large-scale unconstrained optimization problems, since it does not require second derivatives, unlike Newton's method or its approximations. Moreover, it can be applied in many fields, such as neural networks, image restoration, etc. Many complicated methods with two or three terms have been proposed for these problems. In this paper, we propose a simple, easy, efficien...
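As background, the generic nonlinear conjugate gradient iteration that all of these methods build on is (alpha_k comes from a line search and beta_k is the method-specific parameter; standard notation, not taken from this paper):

x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k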
We find a criterion for a morphism of coalgebras over a Barr-exact category to be effective descent and determine (effective) descent morphisms for coalgebras over toposes in some cases. Also, we study some exactness properties of endofunctors of arbitrary categories in connection with natural transformations between them, as well as those of functors that these transformations induce between co...
In this paper, we propose a novel sufficient decrease technique for stochastic variance reduced gradient descent methods such as SVRG and SAGA. In order to guarantee sufficient decrease in stochastic optimization, we design a new sufficient decrease criterion, which yields sufficient decrease versions of stochastic variance reduction algorithms such as SVRG-SD and SAGA-SD as a byproduct. We introdu...
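For context, the standard SVRG inner update that such "-SD" variants start from is (\tilde{x} is the snapshot point and \tilde{\mu} = \nabla f(\tilde{x}) the full gradient at the snapshot; standard SVRG notation, not from this abstract):

v_t = \nabla f_{i_t}(x_{t-1}) - \nabla f_{i_t}(\tilde{x}) + \tilde{\mu}, \qquad x_t = x_{t-1} - \eta\, v_t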
Conjugate gradient methods are among the most efficient methods for solving optimization models. In this paper, a newly proposed conjugate gradient method for unconstrained optimization problems is given as a convex combination of the Hager-Zhang and Dai-Yuan nonlinear methods, which is capable of producing the sufficient descent condition with global convergence properties under the strong Wolfe conditions. The numerical results demonstrate its efficiency on some benchmark...
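A convex-combination scheme of this kind, together with the strong Wolfe conditions it is analyzed under, is typically written as (theta_k in [0,1] and the constants 0 < c_1 < c_2 < 1 are generic; the paper's specific choice of theta_k is not reproduced here):

\beta_k = \theta_k\, \beta_k^{\mathrm{HZ}} + (1 - \theta_k)\, \beta_k^{\mathrm{DY}}

f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k\, g_k^\top d_k, \qquad |g(x_k + \alpha_k d_k)^\top d_k| \le c_2\, |g_k^\top d_k|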
The subject of linear convergence of gradient-type methods on non-strongly convex optimization has been widely studied by introducing several notions as sufficient conditions. Influential examples include the error bound property, the restricted strong convexity property, the quadratic growth property, and the Kurdyka-Łojasiewicz property. In this paper, we first define a group of error bound con...
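Two of the named conditions have short standard statements, which may help fix ideas (X^* denotes the solution set, f^* the optimal value, and \kappa, \mu generic constants; these are the textbook forms, not the paper's unified definitions):

\text{error bound:}\;\; \operatorname{dist}(x, X^*) \le \kappa\, \|\nabla f(x)\|, \qquad \text{quadratic growth:}\;\; f(x) - f^* \ge \frac{\mu}{2}\, \operatorname{dist}(x, X^*)^2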