Search results for: backtracking armijo line search
Number of results: 694,472
Abstract Monotonicity and nonmonotonicity play a key role in studying the global convergence and the efficiency of iterative schemes employed in the field of nonlinear optimization, where globally convergent and computationally efficient schemes are explored. This paper addresses some features of descent schemes and the motivation behind nonmonotone strategies and investigates the efficiency of...
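The contrast between monotone descent and nonmonotone strategies can be illustrated with a small sketch. This is a generic backtracking Armijo routine, not the specific schemes studied in the abstract above: passing a single objective value gives the classical monotone rule, while passing the last few values gives a max-based nonmonotone rule in the style of Grippo, Lampariello, and Lucidi.

```python
import numpy as np

def backtracking_armijo(f, grad_f, x, d, f_history, c=1e-4, rho=0.5, t=1.0, max_iter=50):
    """Return a step size t satisfying a (non)monotone Armijo condition.

    f_history: recent objective values. Pass [f(x)] for the classical
    monotone rule, or the last M values for the nonmonotone variant,
    which only requires improvement over their maximum.
    """
    g_dot_d = grad_f(x) @ d          # directional derivative; must be < 0
    ref = max(f_history)             # reference value (monotone if len == 1)
    for _ in range(max_iter):
        if f(x + t * d) <= ref + c * t * g_dot_d:
            return t                 # sufficient decrease achieved
        t *= rho                     # shrink the step and try again
    return t
```

A nonmonotone rule accepts more trial steps, which can help on problems with narrow curved valleys where strict monotone descent forces very short steps.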
We investigate the BFGS algorithm with an inexact line search when applied to nonsmooth functions, not necessarily convex. We define a suitable line search and show that it generates a sequence of nested intervals containing points satisfying the Armijo and weak Wolfe conditions, assuming only absolute continuity. We also prove that the line search terminates for all semi-algebraic functions. T...
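The nested-interval construction described above can be sketched as a bracketing line search in the spirit of the Lewis–Overton weak Wolfe scheme (a generic illustration, not the paper's exact procedure): the bracket [lo, hi] shrinks around step sizes satisfying both the Armijo and weak Wolfe conditions.

```python
import numpy as np

def weak_wolfe_search(f, grad_f, x, d, c1=1e-4, c2=0.9, max_iter=60):
    """Expansion/bisection line search maintaining a nested bracket [lo, hi]
    containing step sizes that satisfy Armijo and weak Wolfe conditions."""
    f0 = f(x)
    g0 = grad_f(x) @ d               # must be negative for a descent direction
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0:      # Armijo fails: step too long
            hi = t
        elif grad_f(x + t * d) @ d < c2 * g0:    # weak Wolfe fails: step too short
            lo = t
        else:
            return t                             # both conditions hold
        t = (lo + hi) / 2 if np.isfinite(hi) else 2 * t
    return t
```

Each pass either halves the bracket or doubles the trial step, so the intervals are nested by construction, which is the property the abstract's analysis exploits for nonsmooth functions.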
This article focuses on gradient-based backpropagation algorithms that use either a common adaptive learning rate for all weights or an individual adaptive learning rate for each weight and apply the Goldstein/Armijo line search. The learning-rate adaptation is based on descent techniques and estimates of the local Lipschitz constant that are obtained without additional error function and gradi...
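A local Lipschitz constant of the gradient can be estimated from two consecutive iterates at no extra cost, and its inverse used as a learning rate. This is a standard secant-style estimate consistent with the abstract's description, not necessarily the article's exact adaptation scheme:

```python
import numpy as np

def lipschitz_adaptive_step(x_prev, x_curr, g_prev, g_curr):
    """Estimate a local Lipschitz constant L of the gradient from two
    consecutive iterates and return the learning rate 1/L. Requires no
    additional function or gradient evaluations beyond those already made."""
    L = np.linalg.norm(g_curr - g_prev) / np.linalg.norm(x_curr - x_prev)
    return 1.0 / L
```

In practice such a step is then screened by a Goldstein/Armijo-type acceptance test before the weight update is committed.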
Two new nonlinear spectral conjugate gradient methods for solving unconstrained optimization problems are proposed. One is based on the Hestenes and Stiefel (HS) method and the spectral conjugate gradient method. The other is based on a mixed spectral HS-CD conjugate gradient method, which combines the advantages of the spectral conjugate gradient method, the HS method, and the CD method. The d...
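The ingredients named above can be combined in a generic sketch: a Birgin–Martínez-style spectral direction with the Hestenes–Stiefel beta, globalized by backtracking Armijo. The spectral parameter here is the Barzilai–Borwein quotient; this is an illustrative assembly of the standard pieces, not the two methods proposed in the abstract:

```python
import numpy as np

def spectral_hs_cg(f, grad_f, x, tol=1e-8, max_iter=200):
    """Spectral conjugate gradient sketch: d = -theta*g + beta_HS*d_prev,
    with a BB spectral parameter theta and backtracking Armijo steps."""
    g = grad_f(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        gd = g @ d
        if gd >= 0:                      # safeguard: fall back to steepest descent
            d, gd = -g, -(g @ g)
        t, c, rho = 1.0, 1e-4, 0.5       # backtracking Armijo line search
        while f(x + t * d) > f(x) + c * t * gd and t > 1e-12:
            t *= rho
        x_new = x + t * d
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        theta = (s @ s) / (s @ y) if s @ y > 0 else 1.0            # BB quotient
        beta = (g_new @ y) / (d @ y) if abs(d @ y) > 1e-16 else 0.0  # HS formula
        d = -theta * g_new + beta * d
        x, g = x_new, g_new
    return x
```

The descent safeguard matters: spectral CG directions are not automatically descent directions, so restarting along the negative gradient keeps the Armijo loop well defined.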
This work is concerned with the classical problem of finding a zero of a sum of maximal monotone operators. For the projective splitting framework recently proposed by Combettes and Eckstein, we show how to replace the fundamental subproblem calculation using a backward step with one based on two forward steps. The resulting algorithms have the same kind of coordination procedure and can be imp...
Nowadays, the major challenge in machine learning is the 'Big Data' challenge. In big data problems, due to the large number of data points, the large number of features per data point, or both, the training of models has become very slow. The training time has two major components: the time to access the data and the time to process the data. In this paper, we have proposed one possible solution to h...
This paper studies convergence properties of regularized Newton methods for minimizing a convex function whose Hessian matrix may be singular everywhere. We show that if the objective function is LC², then the methods possess local quadratic convergence under a local error bound condition without the requirement of isolated nonsingular solutions. By using a backtracking line search, we globaliz...
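A regularized Newton iteration of the kind discussed above can be sketched as follows. The Hessian is shifted by mu*I with mu proportional to the gradient norm, so the regularization vanishes at a solution; this mu-choice is one standard option for such methods, not necessarily the paper's exact rule, and the step is globalized by backtracking Armijo:

```python
import numpy as np

def regularized_newton(f, grad_f, hess_f, x, mu0=1.0, tol=1e-10, max_iter=100):
    """Regularized Newton sketch: solve (H + mu*I) d = -g with mu ~ ||g||,
    then take a backtracking Armijo step. Works even when H is singular."""
    for _ in range(max_iter):
        g = grad_f(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        H = hess_f(x)
        mu = mu0 * gnorm                  # regularization vanishes at a solution
        d = np.linalg.solve(H + mu * np.eye(len(x)), -g)
        t, c, rho = 1.0, 1e-4, 0.5        # backtracking Armijo line search
        while f(x + t * d) > f(x) + c * t * (g @ d) and t > 1e-12:
            t *= rho
        x = x + t * d
    return x
```

On f(x) = x1^2 in two variables the Hessian is singular everywhere and the solution set (the line x1 = 0) contains no isolated point, yet the shifted system stays solvable and the iterates converge, which is the situation the error-bound analysis covers.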
The fluence map optimization (FMO) problem is a core problem in intensity modulated radiation therapy (IMRT) treatment planning. Although it has been studied extensively for site-specific treatment planning, few studies have examined efficient computational methods for solving it in intensity modulated total marrow irradiation (IM-TMI) planning, and few have looked at exploiting prior...