Search results for: descent method

Number of results: 1645212

Journal: :CoRR 2013
Katya Scheinberg Xiaocheng Tang

Recently, several so-called proximal Newton methods were proposed for sparse optimization [6, 11, 8, 3]. These methods construct a composite quadratic approximation using Hessian information, optimize this approximation using a first-order method such as coordinate descent, and employ a line search to ensure sufficient descent. Here we propose a general framework, which includes slightly modif...
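The two-phase structure described above (a composite quadratic model minimized by coordinate descent, followed by a backtracking line search for sufficient descent) can be sketched for an ℓ1-regularized least-squares problem. This is an illustrative toy under simplifying assumptions (exact Hessian, cyclic coordinate descent), not the paper's exact framework; all function names are hypothetical.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*|.|."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def proximal_newton_step(x, grad, H, lam, inner_iters=50):
    """Approximately minimize the composite quadratic model
    g^T d + 0.5 d^T H d + lam*||x + d||_1 by cyclic coordinate descent."""
    u = x.copy()  # u = x + d
    for _ in range(inner_iters):
        for j in range(len(x)):
            # gradient of the smooth model part w.r.t. u_j, excluding its own quadratic term
            r = grad[j] + H[j] @ (u - x) - H[j, j] * (u[j] - x[j])
            u[j] = soft_threshold(x[j] - r / H[j, j], lam / H[j, j])
    return u - x  # descent direction d

def prox_newton(A, b, lam, iters=20):
    """Toy proximal Newton loop for 0.5*||Ax - b||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    H = A.T @ A  # exact Hessian of the smooth part
    F = lambda z: 0.5 * np.sum((A @ z - b) ** 2) + lam * np.sum(np.abs(z))
    for _ in range(iters):
        d = proximal_newton_step(x, A.T @ (A @ x - b), H, lam)
        t = 1.0
        # backtracking line search to ensure sufficient descent
        while F(x + t * d) > F(x) - 1e-4 * t * np.dot(d, d) and t > 1e-10:
            t *= 0.5
        x = x + t * d
    return x
```

Because the smooth part here is quadratic and the exact Hessian is used, the model matches the objective, so the unit step is typically accepted; for general smooth losses the line search does real work.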

Journal: :SIAM Journal on Optimization 2010
Coralia Cartis Nicholas I. M. Gould Philippe L. Toint

It is shown that the steepest descent and Newton's methods for unconstrained nonconvex optimization under standard assumptions may both require a number of iterations and function evaluations arbitrarily close to O(ε^{-2}) to drive the norm of the gradient below ε. This shows that the upper bound of O(ε^{-2}) evaluations known for steepest descent is tight, and that Newton's method may be as slow a...
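A toy illustration of this gradient-norm stopping criterion (not the paper's worst-case construction, which is far more delicate): fixed-step steepest descent on an ill-conditioned diagonal quadratic, counting the iterations needed to drive ‖∇f‖ below ε. All names are illustrative.

```python
import numpy as np

def steepest_descent(grad, x0, step, eps, max_iters=200000):
    """Fixed-step steepest descent; returns the number of iterations
    needed to drive ||grad f(x)|| below eps (or max_iters on failure)."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) < eps:
            return k
        x = x - step * g
    return max_iters
```

On f(x) = 0.5 (x₁² + 10⁻³ x₂²) the weakly curved coordinate decays by a factor of only (1 − 10⁻³) per iteration, so tightening ε sharply inflates the iteration count, in line with the pessimistic evaluation bounds discussed above.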

Journal: :Journal of Advances in Computer Research 2012
Ahmad Jafarian Safa Measoomy Nia Raheleh Jafari

Artificial neural networks have advantages such as learning, adaptation, fault tolerance, parallelism and generalization. This paper mainly intends to offer a novel method for finding a solution of a fuzzy equation that supposedly has a real solution. For this purpose, we applied an architecture of fuzzy neural networks such that the corresponding connection weights are real numbers. The sugg...

Journal: :Adv. Comput. Math. 2016
Philipp Grohs Seyedehsomayeh Hosseini

This paper presents a descent direction method for finding extrema of locally Lipschitz functions defined on Riemannian manifolds. To this end we define a set-valued mapping x → ∂εf(x) named ε-subdifferential which is an approximation for the Clarke subdifferential and which generalizes the Goldstein-ε-subdifferential to the Riemannian setting. Using this notion we construct a steepest descent ...
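For orientation, the Euclidean Goldstein ε-subdifferential that this construction generalizes to the Riemannian setting is (in its standard form, stated here only as background) the closed convex hull of Clarke subdifferentials over an ε-ball:

```latex
\partial_{\varepsilon}^{G} f(x) \;=\; \overline{\operatorname{conv}}
\bigcup_{y \in B(x,\varepsilon)} \partial f(y),
```

and a descent direction is obtained from the minimum-norm element $g \in \partial_{\varepsilon}^{G} f(x)$ by stepping along $-g/\|g\|$.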

Journal: :Math. Program. 1993
Jia Hao Wu Michael Florian Patrice Marcotte

We present a framework for descent algorithms that solve the monotone variational inequality problem VIP(V), which consists in finding a solution v* ∈ V which satisfies s(v*)^T (u − v*) ≥ 0, for all u ∈ V. This unified framework includes, as special cases, some well-known iterative methods and equivalent optimization formulations. A descent method is developed for an equivalent general optimization formula...
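The simplest iterative method covered by such frameworks is the basic projection iteration v ← P_V(v − γ s(v)), which converges for strongly monotone, Lipschitz s when γ is small enough. A minimal sketch, assuming V is a box and s is affine (function names are hypothetical):

```python
import numpy as np

def project_box(v, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n."""
    return np.clip(v, lo, hi)

def vi_projection_method(s, proj, v0, gamma=0.1, iters=2000):
    """Basic projection iteration v <- P_V(v - gamma * s(v)) for the
    monotone VI: find v* in V with s(v*)^T (u - v*) >= 0 for all u in V."""
    v = np.asarray(v0, dtype=float)
    for _ in range(iters):
        v = proj(v - gamma * s(v))
    return v
```

Fixed points of the iteration are exactly the VI solutions, since v = P_V(v − γ s(v)) is equivalent to the variational inequality above.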

Let $X$ be a reflexive Banach space, $T:X\to X$ be a nonexpansive mapping with $C=\mathrm{Fix}(T)\neq\emptyset$ and $F:X\to X$ be $\delta$-strongly accretive and $\lambda$-strictly pseudocontractive with $\delta+\lambda>1$. In this paper, we present modified hybrid steepest-descent methods, involving sequential errors and functional errors with functions admitting a center, which generate convergent sequences ...
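For context, the unmodified hybrid steepest-descent iteration on which such methods build (in the form commonly attributed to Yamada; stated here as background, not as the paper's exact scheme) is

```latex
x_{n+1} \;=\; T x_n \;-\; \lambda_{n+1}\,\mu\, F(T x_n), \qquad n \ge 0,
```

where $\mu > 0$ is fixed and the step sizes satisfy $\lambda_n \to 0$ and $\sum_n \lambda_n = \infty$; the modifications described above additionally inject sequential and functional error terms.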

2010
Carlos Castro Enrique Zuazua

We consider the problem of flux identification for 1-d scalar conservation laws formulating it as an optimal control problem. We introduce a new optimization strategy to compute numerical approximations of minimizing fluxes. We first prove the existence of minimizers. We also prove the convergence of discrete minima obtained by means of monotone numerical approximation schemes, by a Γ-convergen...

2017
Ahmet Alacaoglu Quoc Tran-Dinh Olivier Fercoq Volkan Cevher

We propose a new randomized coordinate descent method for a convex optimization template with broad applications. Our analysis relies on a novel combination of four ideas applied to the primal-dual gap function: smoothing, acceleration, homotopy, and coordinate descent with non-uniform sampling. As a result, our method features the first convergence rate guarantees among the coordinate descent ...
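Of the four ideas listed, the last, coordinate descent with non-uniform sampling, can be sketched on its own for a smooth least-squares objective, with coordinates drawn proportionally to their coordinate-wise Lipschitz constants L_i. This is a generic sketch of that sampling scheme, not the authors' accelerated primal-dual method; the names are hypothetical.

```python
import numpy as np

def rcd(A, b, iters=5000, seed=0):
    """Randomized coordinate descent on f(x) = 0.5*||Ax - b||^2 with
    coordinates sampled with probability p_i proportional to
    L_i = ||A[:, i]||^2 (the coordinate-wise Lipschitz constants)."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    L = np.sum(A ** 2, axis=0)
    p = L / L.sum()              # non-uniform sampling distribution
    x = np.zeros(n)
    r = -b.copy()                # running residual A x - b
    for _ in range(iters):
        i = rng.choice(n, p=p)
        g_i = A[:, i] @ r        # partial derivative df/dx_i
        step = g_i / L[i]        # exact minimization along coordinate i
        x[i] -= step
        r -= step * A[:, i]      # keep the residual consistent with x
    return x
```

Maintaining the residual incrementally makes each update O(m) instead of recomputing Ax − b from scratch, which is the usual reason coordinate methods scale well.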

Journal: :Math. Comput. 2011
Carlos Castro Enrique Zuazua

We consider the problem of flux identification for 1-d scalar conservation laws formulating it as an optimal control problem. We introduce a new optimization strategy to compute numerical approximations of minimizing fluxes. We first prove the existence of minimizers. We also prove the convergence of discrete minima obtained by means of monotone numerical approximation schemes, by a Γ-convergen...

2016
Fatimazahra Benssi Abdellah Bnouhachem Ali Ou-yassine Muhammad Aslam Noor

In this paper, we suggest and analyze a new iterative method for solving mixed quasi variational inequalities. The new iterate is obtained by searching the optimal step size along the integrated descent direction built from two descent directions. Global convergence of the proposed method is proved under certain assumptions. Our results can be treated as a refinement of previously known results. An e...
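The idea of integrating two descent directions and then searching for a good step can be illustrated on a smooth objective by scanning convex combinations of the two directions and backtracking along each. This is a loose, hypothetical sketch of the general idea only, not the authors' method for mixed quasi variational inequalities.

```python
import numpy as np

def combined_descent_step(f, x, d1, d2):
    """Hypothetical sketch: try convex combinations d = theta*d1 + (1-theta)*d2
    of two descent directions, backtrack along each, and keep the best point."""
    best_val, best_x = f(x), x
    for theta in np.linspace(0.0, 1.0, 11):
        d = theta * d1 + (1.0 - theta) * d2
        t = 1.0
        while f(x + t * d) >= best_val and t > 1e-8:
            t *= 0.5  # backtracking until we beat the current best
        cand = x + t * d
        if f(cand) < best_val:
            best_val, best_x = f(cand), cand
    return best_x
```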

Chart: number of search results per year
