Search results for: hybrid steepest descent method

Number of results: 1,803,458

1996
Dimitri P. Bertsekas

The LMS method for linear least squares problems differs from the steepest descent method in that it processes data blocks one-by-one, with intermediate adjustment of the parameter vector under optimization. This mode of operation often leads to faster convergence when far from the eventual limit, and to slower (sublinear) convergence when close to the optimal solution. We embed both LMS and st...

Journal: Journal of Physics: Conference Series, 2019
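
As a concrete illustration of the contrast drawn in this abstract, here is a minimal sketch, assuming an ordinary linear least-squares model with illustrative sizes, data, and step sizes (not Bertsekas's implementation): full-batch steepest descent takes one gradient step per pass over all the data, while LMS adjusts the parameter vector after every sample.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5))            # data matrix (illustrative)
x_true = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(200)

def steepest_descent(steps, lr=1e-3):
    """Full-batch gradient step on f(x) = 0.5*||Ax - b||^2."""
    x = np.zeros(5)
    for _ in range(steps):
        x -= lr * A.T @ (A @ x - b)          # one update per pass over all data
    return x

def lms(epochs, lr=1e-2):
    """LMS: process the data one sample at a time (block size 1)."""
    x = np.zeros(5)
    for _ in range(epochs):
        for a_i, b_i in zip(A, b):
            x -= lr * (a_i @ x - b_i) * a_i  # intermediate adjustment per sample
    return x

print("SD  error:", np.linalg.norm(steepest_descent(50) - x_true))
print("LMS error:", np.linalg.norm(lms(50) - x_true))
```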

2017
Peiyuan Wang, Jianjun Zhou, Risheng Wang, Jie Chen

In this paper, we present several explicit and hybrid strong-convergence algorithms for solving the multiple-sets split feasibility problem (MSSFP). First, we modify the existing successive, parallel, and cyclic algorithms with the hybrid steepest descent method; then two new hybrid formulas based on the Mann-type method are presented; two general hybrid algorithms which can cover the former o...
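
For context, the split feasibility problem seeks a point x in a convex set C whose image Ax lies in a convex set Q; the multiple-sets version intersects several such sets. Below is a minimal sketch of the classical CQ-type projection iteration that hybrid algorithms of this kind build on, under simplifying assumptions: a single box C, a single ball Q, Euclidean projections, and an illustrative step size.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))                 # linear map (illustrative)

def proj_C(x):                                  # C: the box [-1, 1]^4
    return np.clip(x, -1.0, 1.0)

def proj_Q(y, radius=0.5):                      # Q: Euclidean ball of radius 0.5
    n = np.linalg.norm(y)
    return y if n <= radius else radius * y / n

gamma = 1.0 / np.linalg.norm(A, 2) ** 2         # step size in (0, 2/||A||^2)
x = rng.standard_normal(4)
for _ in range(500):
    y = A @ x
    # gradient step on 0.5*||Ax - P_Q(Ax)||^2, projected back onto C
    x = proj_C(x - gamma * A.T @ (y - proj_Q(y)))

print("x in C:", np.all(np.abs(x) <= 1 + 1e-9), "| ||Ax|| =", np.linalg.norm(A @ x))
```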

Journal: Automatica, 2009
Mathieu Gerard, Bart De Schutter, Michel Verhaegen

This paper describes a hybrid steepest descent method that decreases any given convex cost function over time while keeping the optimization variables within any given convex set. The method takes advantage of properties of hybrid systems to avoid computing projections or a dual optimum. Convergence to a global optimum is analyzed using Lyapunov stability arguments. A discretized imp...
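
A toy discretized sketch in the spirit of this abstract, not the paper's general construction: for a box constraint, the "hybrid" switching logic reduces to dropping the descent components that point out of the set on an active face, so neither a projection operator nor a dual problem is ever solved. The cost function, box, and step size below are illustrative assumptions.

```python
import numpy as np

# Minimize f(x) = 0.5*||x - target||^2 over the box [0, 1]^3 (illustrative choice).
target = np.array([1.5, -0.3, 0.4])
lo, hi = 0.0, 1.0

def grad(x):
    return x - target

x = np.full(3, 0.5)
dt = 0.1                                    # discretization step (assumed)
for _ in range(200):
    d = -grad(x)
    # hybrid switching: on an active face, drop the outward-pointing component
    d[(x <= lo) & (d < 0)] = 0.0
    d[(x >= hi) & (d > 0)] = 0.0
    x = np.clip(x + dt * d, lo, hi)         # clip only guards discretization overshoot

print(x)                                    # -> approx [1.0, 0.0, 0.4]
```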

2002
Tatsuya Koike, Yoshitsugu Takei

The exact steepest descent method was introduced in [AKT4] by combining the ordinary steepest descent method with the exact WKB analysis. (See, e.g., [AKT2] for the notions and notation of exact WKB analysis used in this report.) It is a straightforward generalization of the ordinary steepest descent method and provides a powerful new tool for the description of Stokes curves as well as f...
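
For readers unfamiliar with the "ordinary" steepest descent method being generalized here: it approximates integrals of the form ∫ e^{-n f(x)} dx by deforming the contour through a saddle point and keeping the leading Gaussian contribution. A minimal real-axis sketch (Laplace's method, with f(x) = cosh x chosen purely for illustration) compares the leading-order term e^{-n f(x_0)} √(2π/(n f''(x_0))) against quadrature:

```python
import numpy as np
from scipy.integrate import quad

f = np.cosh                 # minimum (saddle on the real axis) at x0 = 0; f(0) = f''(0) = 1

for n in (2, 8, 32):
    exact, _ = quad(lambda x: np.exp(-n * f(x)), -np.inf, np.inf)
    approx = np.exp(-n) * np.sqrt(2 * np.pi / n)   # leading steepest-descent term
    print(f"n={n:2d}  quadrature={exact:.6e}  leading term={approx:.6e}")
```

The relative error shrinks like 1/(8n), consistent with the next term of the asymptotic expansion.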

Let $X$ be a reflexive Banach space, $T:X\to X$ be a nonexpansive mapping with $C=\mathrm{Fix}(T)\neq\emptyset$, and $F:X\to X$ be $\delta$-strongly accretive and $\lambda$-strictly pseudocontractive with $\delta+\lambda>1$. In this paper, we present modified hybrid steepest-descent methods, involving sequential errors and functional errors with functions admitting a center, which generate convergent sequences ...
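
In the familiar Hilbert-space specialization, the hybrid steepest-descent iteration such papers modify reads $x_{n+1} = Tx_n - \lambda_{n+1}\mu F(Tx_n)$ with $\lambda_n \to 0$ and $\sum_n \lambda_n = \infty$, and it converges to the solution of the variational inequality for $F$ over $\mathrm{Fix}(T)$. A minimal error-free sketch, assuming $T$ is the projection onto the unit ball (so $\mathrm{Fix}(T)$ is the ball) and $F(x) = Ax - b$ with $A$ symmetric positive definite:

```python
import numpy as np

A = np.array([[2.0, 0.5], [0.5, 1.0]])      # SPD => F strongly monotone and Lipschitz
b = np.array([3.0, 1.0])

def F(x):
    return A @ x - b

def T(x):                                   # projection onto the unit ball (nonexpansive)
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

mu = 0.3                                    # needs 0 < mu < 2*eta/L^2; holds for this A
x = np.zeros(2)
for n in range(1, 20001):
    lam = 1.0 / n                           # lambda_n -> 0, sum lambda_n = infinity
    y = T(x)
    x = y - lam * mu * F(y)

print(x, np.linalg.norm(x))                 # approaches the VI solution on the unit sphere
```

Convergence is slow here by design: the classical step-size conditions force $\lambda_n \to 0$, which is exactly the regime the error-tolerant variants in this abstract address.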

Journal: CoRR, 2008
Hui Huang, Uri M. Ascher

Much recent attention has been devoted to gradient descent algorithms in which the steepest descent step size is replaced by the step size from a previous iteration, or is updated only once every second step, thus forming a faster gradient descent method. For unconstrained convex quadratic optimization these methods can converge much faster than steepest descent. But the context of interest here ...
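
A minimal sketch of the comparison this abstract alludes to, on a convex quadratic f(x) = ½xᵀAx − bᵀx, where the exact steepest-descent step is α_k = rᵀr / rᵀAr with residual r = b − Ax: the "lagged" variant simply reuses the step size computed at the previous iteration. The matrix, sizes, and tolerance are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
Q = rng.standard_normal((50, 50))
A = Q.T @ Q + np.eye(50)                   # SPD test matrix (illustrative)
b = rng.standard_normal(50)

def run(lagged, max_iter=5000, tol=1e-8):
    x = np.zeros(50)
    alpha_prev = None
    for k in range(max_iter):
        r = b - A @ x                      # residual = negative gradient
        if np.linalg.norm(r) < tol:
            return k
        alpha_sd = (r @ r) / (r @ (A @ r)) # exact (Cauchy) steepest-descent step
        # lagged variant: apply the step size from the previous iteration
        alpha = alpha_prev if (lagged and alpha_prev is not None) else alpha_sd
        alpha_prev = alpha_sd
        x = x + alpha * r
    return max_iter

print("steepest descent iterations:", run(lagged=False))
print("lagged step iterations     :", run(lagged=True))
```

On ill-conditioned quadratics the lagged step typically needs far fewer iterations, which is the speedup the abstract refers to.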

[Chart: number of search results per year]