Search results for: descent method

Number of results: 1645212

2014
Liu Jing

To solve a special class of variational inequalities with separable structure, this paper proposes a descent alternating direction method based on a new residual function. The most prominent characteristic of the method is that it is easy to perform: only some orthogonal projections and function evaluations are involved at each iteration, so its computational load is very small. Under...
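
A minimal sketch of the general idea, assuming a generic projection method for a monotone variational inequality with the natural residual r(x) = x - P_C(x - F(x)) as a stopping measure; the affine operator F, the nonnegative-orthant constraint set, and the stepsize below are illustrative assumptions, not the paper's alternating direction scheme or its residual function.

```python
# Sketch only: basic projection method for VI(F, C), not the paper's method.
import numpy as np

rng = np.random.default_rng(7)
n = 20
G = rng.standard_normal((n, n))
M = G @ G.T / n + np.eye(n)             # makes F(x) = Mx + q strongly monotone (assumption)
q = rng.standard_normal(n)

def F(x):
    return M @ x + q                    # affine monotone operator

def project(x):
    return np.clip(x, 0.0, None)        # orthogonal projection onto C = nonnegative orthant

x = np.zeros(n)
gamma = 1.0 / np.linalg.norm(M, 2)      # conservative stepsize
for k in range(500):
    x = project(x - gamma * F(x))                  # one projection step per iteration
    res = np.linalg.norm(x - project(x - F(x)))    # natural residual function
    if res < 1e-8:
        break

print(f"stopped at iteration {k} with residual {res:.1e}")
```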

2001
Mile Milisavljevic

This paper discusses methods for using prior information about multiple operating environments to improve adaptive filter convergence properties. More concretely, gain selection, profiling, and scheduling in steepest descent algorithms are treated in detail. The work presented in this paper is an extension of [1]. Two flavors of optimization are discussed: average descent rate op...
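
A minimal sketch of a steepest-descent (LMS-style) adaptive filter whose gain is scheduled over time, which is the kind of update the paper analyzes; the decaying schedule, signals, and filter length below are illustrative assumptions rather than the gain-profiling procedure from the paper.

```python
# Sketch only: LMS adaptive filter with a simple decaying gain schedule.
import numpy as np

rng = np.random.default_rng(0)
n_taps, n_samples = 4, 2000
w_true = np.array([0.5, -0.3, 0.2, 0.1])                 # unknown system to identify (assumption)
x = rng.standard_normal(n_samples)                        # input signal
d = np.convolve(x, w_true)[:n_samples] + 0.01 * rng.standard_normal(n_samples)

w = np.zeros(n_taps)                                      # adaptive filter weights
mu0 = 0.05                                                # initial gain

for n in range(n_taps, n_samples):
    u = x[n - n_taps + 1:n + 1][::-1]                     # most recent input vector
    e = d[n] - w @ u                                      # a priori error
    mu = mu0 / (1.0 + n / 500.0)                          # illustrative decaying gain schedule
    w = w + mu * e * u                                    # steepest-descent (LMS) update

print("estimated weights:", np.round(w, 3))
```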

2009
Thord Andersson Gunnar Läthén Reiner Lenz Magnus Borga

Level set methods are a popular way to solve the image segmentation problem in computer image analysis. A contour is implicitly represented by the zero level of a signed distance function, and evolved according to a motion equation in order to minimize a cost function. This function defines the objective of the segmentation problem and also includes regularization constraints. Gradient descent ...
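
A minimal sketch of gradient-descent evolution of a level set function for a toy region-based energy, assuming a synthetic image, a crude Chan-Vese-style force term, and an explicit Euler step; none of these modeling choices come from the paper, which studies the gradient descent search itself.

```python
# Sketch only: explicit gradient-descent steps on a level set function.
import numpy as np

n = 64
yy, xx = np.mgrid[0:n, 0:n]
image = ((xx - 32) ** 2 + (yy - 32) ** 2 < 15 ** 2).astype(float)   # synthetic object

# Signed distance to an initial circle of radius 25 (negative inside the contour).
phi = np.sqrt((xx - 32.0) ** 2 + (yy - 32.0) ** 2) - 25.0

dt = 0.5
for _ in range(150):
    inside = phi < 0
    c_in = image[inside].mean() if inside.any() else 0.0
    c_out = image[~inside].mean() if (~inside).any() else 0.0
    # Region-based force pushing the zero level toward the object boundary.
    force = (image - c_out) ** 2 - (image - c_in) ** 2

    gy, gx = np.gradient(phi)
    grad_norm = np.sqrt(gx ** 2 + gy ** 2) + 1e-8
    phi = phi - dt * force * grad_norm                    # explicit gradient-descent step

print("pixels inside final contour:", int((phi < 0).sum()))
```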

2015
Walid Krichene Alexandre M. Bayen Peter L. Bartlett

We study accelerated mirror descent dynamics in continuous and discrete time. Combining the original continuous-time motivation of mirror descent with a recent ODE interpretation of Nesterov’s accelerated method, we propose a family of continuous-time descent dynamics for convex functions with Lipschitz gradients, such that the solution trajectories converge to the optimum at a O(1/t^2) rate. We...
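
A minimal sketch of the discrete-time counterpart under the simplest possible assumptions: with a Euclidean mirror map, accelerated mirror descent reduces to Nesterov's accelerated gradient method. The least-squares objective and step sizes below are illustrative, not the continuous-time dynamics proposed in the paper.

```python
# Sketch only: Nesterov-style acceleration on a smooth convex quadratic.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)

def f(x):
    return 0.5 * np.linalg.norm(A @ x - b) ** 2

def grad(x):
    return A.T @ (A @ x - b)

L = np.linalg.norm(A, 2) ** 2                      # Lipschitz constant of the gradient
x = y = np.zeros(20)
for k in range(1, 201):
    x_new = y - grad(y) / L                        # gradient step from the extrapolated point
    y = x_new + (k - 1) / (k + 2) * (x_new - x)    # momentum/extrapolation step
    x = x_new

x_star = np.linalg.lstsq(A, b, rcond=None)[0]
print("suboptimality f(x) - f(x*):", f(x) - f(x_star))
```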

2014
Frank E. Curtis Wei Guo

We propose a limited memory steepest descent method for solving unconstrained optimization problems. As a steepest descent method, the step computation in each iteration only requires the evaluation of a gradient of the objective function and the calculation of a scalar stepsize. When employed to solve certain convex problems, our method reduces to a variant of the limited memory steepest desce...
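
A minimal sketch of a steepest descent iteration whose scalar stepsize is computed from recent gradient information; here a Barzilai-Borwein stepsize, a one-step-memory special case of the limited memory steepest descent idea, is applied to an illustrative convex quadratic. The problem data and safeguards are assumptions, not the method from the paper.

```python
# Sketch only: steepest descent with a Barzilai-Borwein (memory-1) stepsize.
import numpy as np

rng = np.random.default_rng(2)
G = rng.standard_normal((30, 30))
Q = G @ G.T + np.eye(30)                  # SPD Hessian of a convex quadratic (assumption)
c = rng.standard_normal(30)

def grad(x):
    return Q @ x - c

x = np.zeros(30)
g = grad(x)
alpha = 1.0 / np.linalg.norm(Q, 2)        # safe initial stepsize
for _ in range(200):
    x_new = x - alpha * g                 # steepest-descent step
    g_new = grad(x_new)
    s, y = x_new - x, g_new - g
    alpha = (s @ s) / (s @ y)             # BB stepsize built from one step of memory
    x, g = x_new, g_new

print("final gradient norm:", np.linalg.norm(g))
```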

Journal: Categories and General Algebraic Structures with Applications
Maurice Kianpi, Laboratory of Algebra, Geometry and Applications, Department of Mathematics, Faculty of Science, University of Yaounde 1, P.O. Box 812, Yaounde, Republic of Cameroon

We find a criterion for a morphism of coalgebras over a Barr-exact category to be effective descent and determine (effective) descent morphisms for coalgebras over toposes in some cases. Also, we study some exactness properties of endofunctors of arbitrary categories in connection with natural transformations between them, as well as those of functors that these transformations induce between co...

2012
Thanh T. Ngo Yousef Saad

This paper describes gradient methods based on a scaled metric on the Grassmann manifold for low-rank matrix completion. The proposed methods significantly improve canonical gradient methods, especially on ill-conditioned matrices, while maintaining established global convergence and exact recovery guarantees. A connection between a form of subspace iteration for matrix completion and the scaled...
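
A minimal sketch of the canonical baseline for this problem class: plain gradient descent on a low-rank factorization with spectral initialization. The dimensions, sampling rate, and stepsize heuristic below are assumptions for illustration; the paper's methods instead work with a scaled metric on the Grassmann manifold.

```python
# Sketch only: factored gradient descent for matrix completion (generic baseline).
import numpy as np

rng = np.random.default_rng(3)
m, n, r = 40, 30, 3
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))   # ground-truth low-rank matrix
mask = rng.random((m, n)) < 0.5                                  # observation pattern (~50% of entries)

# Spectral initialization from the zero-filled, rescaled observations.
U0, s, Vt = np.linalg.svd((mask * M) / 0.5, full_matrices=False)
U = U0[:, :r] * np.sqrt(s[:r])
V = Vt[:r].T * np.sqrt(s[:r])

lr = 0.5 / s[0]                              # conservative stepsize heuristic (assumption)
for _ in range(2000):
    R = mask * (U @ V.T - M)                 # residual on observed entries only
    U, V = U - lr * R @ V, V - lr * R.T @ U  # gradient steps on both factors

print("relative error:", np.linalg.norm(U @ V.T - M) / np.linalg.norm(M))
```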

Journal: SIAM Journal on Optimization, 2015
Cong D. Dang Guanghui Lan

In this paper, we present a new stochastic algorithm, namely the stochastic block mirror descent (SBMD) method for solving large-scale nonsmooth and stochastic optimization problems. The basic idea of this algorithm is to incorporate the block-coordinate decomposition and an incremental block averaging scheme into the classic (stochastic) mirror-descent method, in order to significantly reduce ...
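
A minimal sketch under simplifying assumptions: a block-coordinate stochastic gradient iteration in the spirit of SBMD, using the Euclidean distance-generating function so that each prox/mirror step is just a gradient step on one randomly chosen block of coordinates. The least-squares objective, block partition, and stepsizes are illustrative, not the paper's scheme or its incremental averaging step.

```python
# Sketch only: block-coordinate stochastic gradient with a Euclidean prox step.
import numpy as np

rng = np.random.default_rng(4)
d, n_blocks = 100, 5
blocks = np.array_split(np.arange(d), n_blocks)            # fixed block partition of coordinates

A = rng.standard_normal((500, d))
b = rng.standard_normal(500)

def stochastic_block_grad(x, block, batch=32):
    idx = rng.integers(0, A.shape[0], size=batch)           # sample a minibatch of rows
    Ai, bi = A[idx], b[idx]
    return Ai[:, block].T @ (Ai @ x - bi) / batch           # stochastic gradient w.r.t. one block

x = np.zeros(d)
for k in range(1, 5001):
    blk = blocks[rng.integers(n_blocks)]                    # pick one block at random
    step = 0.2 / np.sqrt(k)                                 # diminishing stepsize
    x[blk] -= step * stochastic_block_grad(x, blk)          # update only that block

print("objective:", 0.5 * np.mean((A @ x - b) ** 2))
```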

2000
M. Onder Efe Okyay Kaynak

This paper presents a method for stabilizing and robustifying artificial neural networks trained by gradient descent. The proposed method constructs a dynamic model of the conventional update mechanism and derives stabilizing values of the learning rate. Stability in this context corresponds to convergence of the adjustable parameters of the neural network structure. I...
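
A minimal sketch of the underlying stability idea on a quadratic surrogate of the training loss: gradient descent converges when the learning rate stays below 2/L, where L is the largest curvature, and diverges above it. This is a textbook illustration with assumed data, not the dynamic update model derived in the paper.

```python
# Sketch only: learning-rate stability threshold for gradient descent on a quadratic.
import numpy as np

rng = np.random.default_rng(5)
G = rng.standard_normal((10, 10))
H = G @ G.T + np.eye(10)                     # surrogate Hessian (curvature) of the loss
L = np.linalg.eigvalsh(H).max()              # largest curvature

def run(lr, steps=100):
    w = rng.standard_normal(10)
    for _ in range(steps):
        w = w - lr * (H @ w)                 # gradient-descent update on 0.5 * w' H w
    return np.linalg.norm(w)

print("stable   (lr = 1.5/L):", run(1.5 / L))
print("unstable (lr = 2.5/L):", run(2.5 / L))
```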

Journal: SIAM Journal on Optimization, 2013
William W. Hager Hongchao Zhang

In theory, the successive gradients generated by the conjugate gradient method applied to a quadratic should be orthogonal. However, for some ill-conditioned problems, orthogonality is quickly lost due to rounding errors, and convergence is much slower than expected. A limited memory version of the nonlinear conjugate gradient method is developed. The memory is used to both detect the loss of o...
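
A minimal sketch of the phenomenon described: running conjugate gradient on an ill-conditioned quadratic and monitoring the inner product of successive gradients (residuals), which should be exactly zero in exact arithmetic but drifts in floating point. The synthetic matrix and monitoring interval are assumptions; detecting and repairing this loss is what the paper's limited memory variant addresses.

```python
# Sketch only: CG on an ill-conditioned SPD system, watching orthogonality decay.
import numpy as np

rng = np.random.default_rng(6)
n = 200
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(np.logspace(0, 8, n)) @ Q.T        # SPD matrix with condition number ~1e8
b = rng.standard_normal(n)

x = np.zeros(n)
r = b - A @ x                                      # residual = negative gradient
p = r.copy()
for k in range(150):
    Ap = A @ p
    alpha = (r @ r) / (p @ Ap)
    x += alpha * p
    r_new = r - alpha * Ap
    beta = (r_new @ r_new) / (r @ r)
    if k % 30 == 0:
        # Cosine between successive residuals; zero in exact arithmetic.
        cos = (r_new @ r) / (np.linalg.norm(r_new) * np.linalg.norm(r))
        print(f"iter {k:3d}  |cos(g_k, g_k+1)| = {abs(cos):.2e}")
    p = r_new + beta * p
    r = r_new
```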
