Search results for: nonmonotone line search
Number of results: 693,223
In this paper, a three-dimensional subspace method is proposed, in which the search direction is generated by minimizing an approximation model of the objective function in the subspace. The model is not unique, and alternatives can be chosen between symmetric quadratic and conic models according to specific criteria. Moreover, the idea of the WLY conjugate gradient method is applied to characterize the change between adjacent iteration points. A strategy for the initial stepsize and a nonmo...
In this paper we study a class of derivative-free unconstrained minimization algorithms employing nonmonotone inexact linesearch techniques along a set of suitable search directions. In particular, we define globally convergent nonmonotone versions of some well-known derivative-free methods, and we propose a new algorithm combining coordinate rotations with approximate simplex gradients. Through ...
In this paper we study nonmonotone globalization techniques, in connection with finite-difference inexact Newton-GMRES methods, for solving large-scale systems of nonlinear equations in the case that the Jacobian matrix is not available. We first define a globalization scheme, which combines nonmonotone watchdog rules and nonmonotone derivative-free line searches, and we prove its global conver...
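The nonmonotone line searches referred to in these abstracts build on the max-type rule of Grippo, Lampariello, and Lucidi, where the Armijo test compares against the maximum of the last few objective values rather than the current one. A minimal gradient-based sketch follows (the derivative-free variants above replace the directional-derivative term; the quadratic test problem and parameter values are illustrative):

```python
import numpy as np

def nonmonotone_armijo(f, grad_f, x, d, f_hist, gamma=1e-4, beta=0.5, max_iter=50):
    """Max-type (GLL) nonmonotone Armijo backtracking.

    Accepts a stepsize alpha once
        f(x + alpha*d) <= max(f_hist) + gamma * alpha * grad_f(x).dot(d),
    where f_hist holds the objective values of the last M iterates.
    """
    f_ref = max(f_hist)            # reference value: max over recent iterates
    slope = grad_f(x).dot(d)       # directional derivative (negative for descent d)
    alpha = 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) <= f_ref + gamma * alpha * slope:
            return alpha
        alpha *= beta              # backtrack
    return alpha

# usage: steepest descent on the toy objective f(x) = ||x||^2
f = lambda x: x.dot(x)
g = lambda x: 2.0 * x
x0 = np.array([3.0, -4.0])
hist = [f(x0)]
for _ in range(20):
    d = -g(x0)                                       # steepest descent direction
    a = nonmonotone_armijo(f, g, x0, d, hist[-5:])   # memory M = 5
    x0 = x0 + a * d
    hist.append(f(x0))
```

Because the reference value is a maximum over a window, individual steps may increase the objective, which helps methods such as spectral gradient and Newton-type schemes accept their natural (often long) trial steps.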
We address composite optimization problems, which consist in minimizing the sum of a smooth function and a merely lower semicontinuous function, without any convexity assumptions. Numerical solutions to these problems can be obtained by proximal gradient methods, which often rely on a line search procedure as a globalization mechanism. We consider an adaptive nonmonotone scheme based on an averaged merit function and establish asy...
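An averaged merit function of this kind is in the spirit of the Zhang–Hager nonmonotone rule, where the Armijo reference value is a weighted average of past objective values rather than their maximum. A minimal sketch of that recursion (names and seed values are illustrative):

```python
def update_average(C, Q, f_new, eta=0.85):
    """Zhang-Hager averaging: the nonmonotone reference value C_k is a
    weighted average of all past objective values.
        Q_{k+1} = eta * Q_k + 1
        C_{k+1} = (eta * Q_k * C_k + f(x_{k+1})) / Q_{k+1}
    eta = 0 recovers the monotone Armijo rule (C_k = f(x_k));
    eta -> 1 approaches the plain running average of all f-values.
    """
    Q_new = eta * Q + 1.0
    C_new = (eta * Q * C + f_new) / Q_new
    return C_new, Q_new

# seed with C_0 = f(x_0), Q_0 = 1, then update once per accepted step
C, Q = 10.0, 1.0
for f_val in [8.0, 11.0, 7.0]:   # objective values along the iterates
    C, Q = update_average(C, Q, f_val)
```

The line search then accepts a step when the new objective value falls below `C` minus the usual sufficient-decrease term, so occasional increases are tolerated while the average still drifts downward.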
This paper analyzes and improves the linearized Bregman method for solving the basis pursuit and related sparse optimization problems. The analysis shows that the linearized Bregman method has the exact regularization property; namely, it converges to an exact solution of the basis pursuit problem whenever its smooth parameter α is greater than a certain value. The analysis is based on showing ...
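One common form of the linearized Bregman iteration for basis pursuit alternates a gradient step on a dual variable with soft-thresholding scaled by the smoothing parameter. The sketch below uses that form with an illustrative toy matrix `A`, data `b`, and parameter values, not the paper's own setup:

```python
import numpy as np

def shrink(v, mu):
    """Soft-thresholding: componentwise sign(v) * max(|v| - mu, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)

# linearized Bregman for basis pursuit: min ||u||_1  s.t.  A u = b
# one common form:  v <- v + tau * A.T @ (b - A @ u);  u <- alpha * shrink(v, 1)
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # toy measurement matrix
b = np.array([1.0, 0.5])          # observations; sparsest solution is (1, 0.5, 0)
alpha, tau = 5.0, 0.2             # smoothing parameter and stepsize
u = np.zeros(3)
v = np.zeros(3)
for _ in range(50):
    v += tau * A.T @ (b - A @ u)
    u = alpha * shrink(v, 1.0)
```

This is consistent with the exact-regularization property described above: for a sufficiently large smoothing parameter the iteration recovers an exact basis pursuit solution, not merely a smoothed approximation of it.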
A multivariate spectral gradient method is proposed for solving unconstrained optimization problems. Combined with a quasi-Newton property, the multivariate spectral gradient method allows an individual adaptive stepsize along each coordinate direction, which guarantees that the method is finitely convergent for positive definite quadratics. In particular, it converges in no more than two steps for posit...
We develop an affine-scaling algorithm for box-constrained optimization which has the property that each iterate is a scaled cyclic Barzilai–Borwein (CBB) gradient iterate that lies in the interior of the feasible set. Global convergence is established for a nonmonotone line search, while there is local R-linear convergence at a nondegenerate local minimizer where the second-order sufficient op...
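The cyclic Barzilai–Borwein (CBB) strategy reuses a BB stepsize for several consecutive iterations; the basic BB1 step it builds on can be sketched as follows (the affine scaling and box constraints of the paper are omitted, and the quadratic test problem is illustrative):

```python
import numpy as np

def bb_stepsize(s, y, fallback=1.0):
    """Barzilai-Borwein (BB1) stepsize  alpha = (s's)/(s'y),
    with s = x_k - x_{k-1} and y = g_k - g_{k-1}.
    Falls back to a fixed step when the curvature estimate s'y is not positive."""
    sy = s.dot(y)
    return s.dot(s) / sy if sy > 0 else fallback

# gradient descent with BB steps on f(x) = 0.5 * x'Ax
A = np.diag([1.0, 10.0])
g = lambda x: A @ x
x = np.array([1.0, 1.0])
alpha = 0.1                       # initial stepsize for the very first move
g_old, x_old = g(x), x.copy()
x = x - alpha * g_old
for _ in range(30):
    gx = g(x)
    alpha = bb_stepsize(x - x_old, gx - g_old)
    x_old, g_old = x.copy(), gx
    x = x - alpha * gx
```

BB steps are typically nonmonotone in the objective, which is exactly why they are paired with a nonmonotone line search for the global convergence result mentioned above.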
We consider the Spectral Projected Gradient method for solving constrained optimization problems with the objective function in the form of a mathematical expectation. It is assumed that the feasible set is convex, closed, and easy to project onto. The objective function is approximated by a sequence of Sample Average Approximation functions with different sample sizes. The sample size update is bas...
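The core Spectral Projected Gradient step projects a spectral (BB) gradient trial point back onto the feasible set. A minimal sketch on a box constraint, with a fixed toy objective standing in for the Sample Average Approximation functions (the sample-size update and nonmonotone line search are omitted):

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi] -- an easy-to-project set."""
    return np.clip(x, lo, hi)

# projected gradient with a spectral (BB) stepsize on
#   min 0.5 * ||x - c||^2   s.t.  x in [0, 1]^2
c = np.array([2.0, -1.0])
grad = lambda x: x - c
x = np.array([0.5, 0.5])
alpha = 1.0
for _ in range(50):
    x_new = project_box(x - alpha * grad(x), 0.0, 1.0)
    s, y = x_new - x, grad(x_new) - grad(x)
    sy = s.dot(y)
    alpha = s.dot(s) / sy if sy > 1e-12 else 1.0   # spectral stepsize for next step
    x = x_new
```

In the stochastic setting described above, `grad` would be the gradient of the current Sample Average Approximation, re-evaluated as the sample size grows.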