Search results for: sufficient descent directions
Number of results: 286,567
A new conjugate gradient method is proposed in this paper by applying Powell's symmetrization technique to conjugate gradient methods; it satisfies the sufficient descent property for any line search. Under Wolfe line searches, the global convergence of the method is derived from a spectral analysis of the conjugate gradient iteration matrix together with Zoutendijk's condition. Based on this, two con...
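The sufficient descent property that recurs throughout these abstracts is the condition g_k^T d_k ≤ -c‖g_k‖², which can be checked numerically. A minimal sketch (the constant c and the test vectors are illustrative assumptions, not taken from any of the papers above):

```python
import numpy as np

def is_sufficient_descent(g, d, c=1e-4):
    """Check the sufficient descent condition g^T d <= -c * ||g||^2."""
    return float(g @ d) <= -c * float(g @ g)

# The steepest descent direction d = -g satisfies it for any c <= 1.
g = np.array([3.0, -4.0])
print(is_sufficient_descent(g, -g))  # True
print(is_sufficient_descent(g, g))   # False: an ascent direction
```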
In this paper, the effect of different automated aircraft descent guidance strategies on fuel burn and on the temporal predictability of the executed trajectory is investigated. The paper aims to provide an understanding of how airborne automation can be permitted by Air Traffic Control to remain in control of the descent in the presence of disturbances while still providing sufficient predictability. Si...
In the first part of the tutorial, we introduced the problem of unconstrained optimization, provided necessary and sufficient conditions for optimality of a solution to this problem, and described the gradient descent method for finding a (locally) optimal solution to a given unconstrained optimization problem. We now describe another method for unconstrained optimization, namely Newton’s metho...
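On a quadratic objective, the Newton's method this tutorial snippet introduces reaches the minimizer in a single step, which contrasts nicely with gradient descent's many small steps. A minimal sketch (the quadratic f(x) = ½xᵀAx − bᵀx is an illustrative example, not from the tutorial):

```python
import numpy as np

def newton_step(grad, hess, x):
    """One Newton step: solve H(x) p = -g(x), then move to x + p."""
    return x + np.linalg.solve(hess(x), -grad(x))

# Quadratic example: gradient A x - b, constant Hessian A.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad = lambda x: A @ x - b
hess = lambda x: A

x = newton_step(grad, hess, np.zeros(2))  # one step lands on the minimizer
print(np.allclose(A @ x, b))  # True: first-order condition A x = b holds
```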
A new nonlinear conjugate gradient method, based on Perry's idea, is presented. It is shown that its sufficient descent property is independent of any line search and that the eigenvalues of P_{k+1}^T P_{k+1} are bounded above, where P_{k+1} is the iteration matrix of the new method. Thus, global convergence is proven by spectral analysis for nonconvex functions when the line search fulfil...
Cubic-regularization and trust-region methods with worst-case first-order complexity O(ε−3/2) and worst-case second-order complexity O(ε−3) have been developed in the last few years. In this paper it is proved that the same complexities are achieved by means of a quadratic-regularization method with a cubic sufficient-descent condition instead of the more usual predicted-reduction based descent...
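The core subproblem of a quadratic-regularization method of the kind this abstract refers to is a shifted Newton system. A minimal sketch of that subproblem only (the σ update rule and the cubic sufficient-descent acceptance test from the paper are omitted; the matrices below are illustrative):

```python
import numpy as np

def quad_reg_step(g, H, sigma):
    """Quadratic-regularization step: minimize g^T p + 0.5 p^T H p + 0.5*sigma*||p||^2,
    i.e. solve (H + sigma*I) p = -g, with sigma large enough for positive definiteness."""
    return np.linalg.solve(H + sigma * np.eye(len(g)), -g)

H = np.array([[2.0, 0.0], [0.0, -1.0]])  # indefinite Hessian
g = np.array([1.0, 1.0])
p = quad_reg_step(g, H, sigma=2.0)
print(float(g @ p) < 0)  # True: the regularized step is a descent direction
```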
An algorithm is proposed for performing linear discriminant analysis using a single-layered feedforward network. The algorithm follows successive steepest descent directions with respect to the perceptron cost function, taking care not to increase the number of misclassified patterns. The algorithm has no free parameters and therefore no heuristics are involved in its application. Its efficienc...
In this work we give explicit formulae for the application of Rosen’s gradient projection method to SVM training that lead to a very simple implementation. We show experimentally that the method provides good descent directions that result in fewer training iterations, particularly when high precision is required. However, a naive kernelization may result in a procedure requiring more...
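The heart of Rosen's gradient projection method is projecting the negative gradient onto the null space of the active linear constraints, so the step stays feasible. A minimal sketch for equality constraints A x = b (the constraint and gradient below are illustrative, not the SVM formulation from the paper):

```python
import numpy as np

def projected_descent_direction(g, A):
    """Project -g onto the null space of A via P = I - A^T (A A^T)^{-1} A."""
    P = np.eye(A.shape[1]) - A.T @ np.linalg.solve(A @ A.T, A)
    return -P @ g

A = np.array([[1.0, 1.0]])   # single constraint x1 + x2 = const
g = np.array([3.0, 1.0])
d = projected_descent_direction(g, A)
print(np.allclose(A @ d, 0.0))  # True: the direction preserves feasibility
```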
Figure 2. The trajectories of CM, NAG, and SGD are shown. Although the value of the momentum is identical for both experiments, CM exhibits oscillations along the high-curvature directions, while NAG exhibits no such oscillations. The global minimizer of the objective is at (0,0). The red curve shows gradient descent with the same learning rate as NAG and CM, the blue curve shows NAG, and the g...
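The difference between classical momentum (CM) and Nesterov's accelerated gradient (NAG) that this figure illustrates comes down to where the gradient is evaluated: at the current point for CM, at a look-ahead point for NAG. A minimal sketch on an ill-conditioned quadratic with minimizer at (0, 0) (the learning rate, momentum value, and curvatures are illustrative assumptions):

```python
import numpy as np

def cm_step(x, v, grad, lr=0.01, mu=0.9):
    """Classical momentum: gradient evaluated at the current point x."""
    v = mu * v - lr * grad(x)
    return x + v, v

def nag_step(x, v, grad, lr=0.01, mu=0.9):
    """Nesterov: gradient evaluated at the look-ahead point x + mu*v."""
    v = mu * v - lr * grad(x + mu * v)
    return x + v, v

# Quadratic with curvatures 1 and 100; the high-curvature axis is where CM oscillates.
grad = lambda x: np.array([1.0, 100.0]) * x
x, v = np.array([1.0, 1.0]), np.zeros(2)
for _ in range(500):
    x, v = nag_step(x, v, grad)
print(np.allclose(x, 0.0, atol=1e-3))  # True: NAG settles near the minimizer
```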