Search results for: variable stepsize implementation

Number of results: 612759

Journal: Journal of Computational and Applied Mathematics 2021

In this paper we study the efficiency of Strong Stability Preserving (SSP) Runge–Kutta methods that can be implemented with a low number of registers using their Shu–Osher representation. SSP methods have been studied extensively in the literature, and stepsize restrictions that ensure numerical monotonicity have been found. However, for some problems, the observed stepsize restrictions are larger than the theoretical ones. Aiming at obtaining additional properties ...
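The excerpt refers to low-storage Shu–Osher implementations of SSP Runge–Kutta methods. As a hedged illustration (not the paper's specific schemes), the classical three-stage, third-order SSP method in Shu–Osher form can be sketched as:

```python
import math

def ssprk3_step(f, u, dt):
    """One step of the three-stage, third-order SSP Runge-Kutta method in
    Shu-Osher form.  Each stage is a convex combination of forward-Euler
    steps, which is what preserves strong stability under the forward-Euler
    stepsize restriction."""
    u1 = u + dt * f(u)
    u2 = 0.75 * u + 0.25 * (u1 + dt * f(u1))
    return u / 3.0 + 2.0 / 3.0 * (u2 + dt * f(u2))

def integrate(f, u0, t_end, n_steps):
    """Integrate u' = f(u) from u0 to t_end with fixed stepsize."""
    dt = t_end / n_steps
    u = u0
    for _ in range(n_steps):
        u = ssprk3_step(f, u, dt)
    return u
```

Because each stage reuses previous stage values in convex combinations, the method can be arranged to run with only two storage registers, which is the implementation concern the abstract alludes to.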

Journal: RAIRO - Operations Research 2022

It is widely accepted that the stepsize is of great significance to gradient methods. An efficient method with approximately optimal stepsizes, mainly based on regularization models, is proposed for unconstrained optimization. More specifically, if the objective function is not close to a quadratic on the line segment between the current and latest iterates, a regularization model is exploited to carefully generate the stepsize. Otherwise, approximat...
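The excerpt is cut off, but in the exactly quadratic case an "optimal stepsize" reduces to the stepsize that minimizes the objective along the negative gradient. A minimal sketch of that special case (an illustration, not the paper's regularization-model method):

```python
import numpy as np

def exact_linesearch_gd(A, b, x0, tol=1e-10, max_iter=1000):
    """Steepest descent on f(x) = 0.5 x^T A x - b^T x using the stepsize
    alpha = g^T g / (g^T A g), which exactly minimizes f along -g when f
    is quadratic.  'Approximately optimal' stepsize methods generalize
    this idea to non-quadratic objectives via local models."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = A @ x - b                     # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (A @ g))   # exact minimizer along -g
        x = x - alpha * g
    return x
```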

Journal: EURASIP J. Adv. Sig. Proc. 2011
Mohammad Shams Esfand Abadi Seyed Ali Asghar AbbasZadeh Arani

This paper extends the recently introduced variable step-size (VSS) approach to the family of adaptive filter algorithms. This method uses prior knowledge of the channel impulse response statistics. Accordingly, the optimal stepsize vector is obtained by minimizing the mean-square deviation (MSD). The presented algorithms are the VSS affine projection algorithm (VSS-APA), the VSS selective partial u...
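The paper's MSD-optimal stepsize vector is not recoverable from the excerpt; as an assumed stand-in, the following sketches a scalar variable step-size NLMS filter with a generic error-driven step update (Kwong–Johnston style), which illustrates the same idea of adapting the step to the estimation error:

```python
import numpy as np

def vss_nlms(x, d, taps=8, mu0=0.5, alpha=0.97, gamma=0.05,
             mu_min=1e-3, mu_max=1.0, eps=1e-8):
    """Variable step-size NLMS sketch: the step grows with the squared
    error and decays as the filter converges.  This is a generic VSS
    rule, not the MSD-optimal vector stepsize of the cited paper.
    x: input signal, d: desired signal."""
    w = np.zeros(taps)
    mu = mu0
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]      # most-recent-first regressor
        e = d[n] - w @ u
        mu = np.clip(alpha * mu + gamma * e * e, mu_min, mu_max)
        w += mu * e * u / (u @ u + eps)      # normalized update
    return w
```

A large error keeps the step near `mu_max` for fast convergence; once the error is small, the step decays toward `mu_min`, reducing steady-state misadjustment.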

Journal: CoRR 2017
Huizhen Yu

We consider off-policy temporal-difference (TD) learning methods for policy evaluation in Markov decision processes with finite spaces and discounted reward criteria, and we present a collection of convergence results for several gradient-based TD algorithms with linear function approximation. The algorithms we analyze include: (i) two basic forms of two-time-scale gradient-based TD algorithms,...
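As a hedged sketch of one such two-time-scale gradient-based TD algorithm, the TDC update with linear function approximation is shown below; the toy chain and parameter values in the test are illustrative assumptions, not from the paper:

```python
import numpy as np

def tdc(transitions, n_features, phi, gamma=0.9, alpha=0.05, beta=0.1):
    """TDC (gradient-corrected TD) with linear function approximation.
    theta holds the value-function weights (slow timescale); w holds the
    auxiliary correction weights (fast timescale).  transitions is an
    iterable of (state, reward, next_state) samples."""
    theta = np.zeros(n_features)
    w = np.zeros(n_features)
    for s, r, s_next in transitions:
        f, f_next = phi(s), phi(s_next)
        delta = r + gamma * theta @ f_next - theta @ f   # TD error
        theta += alpha * (delta * f - gamma * f_next * (f @ w))
        w += beta * (delta - f @ w) * f
    return theta
```

The correction term `gamma * f_next * (f @ w)` is what distinguishes TDC from plain TD(0) and underlies its gradient-based convergence analysis.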

Journal: Computational Optimization and Applications 2022

The Barzilai–Borwein (BB) gradient method is efficient for solving large-scale unconstrained problems to modest accuracy due to its ingenious stepsize, which generally yields nonmonotone behavior. In this paper, we propose a new stepsize to accelerate the BB method by requiring finite termination for minimizing a two-dimensional strongly convex quadratic function. Based on this stepsize, we develop an optimization method that adaptively takes...
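The paper's new stepsize is not given in the excerpt; for context, a minimal sketch of the plain BB method with the standard BB1 stepsize on a quadratic (an illustration, not the accelerated variant proposed in the paper):

```python
import numpy as np

def bb_gradient(A, b, x0, max_iter=200, tol=1e-8):
    """Barzilai-Borwein gradient method for f(x) = 0.5 x^T A x - b^T x,
    using the BB1 stepsize s^T s / s^T y with s = x_k - x_{k-1} and
    y = g_k - g_{k-1}.  The objective typically decreases nonmonotonically."""
    x = x0.astype(float)
    g = A @ x - b
    alpha = 1.0 / np.max(np.linalg.eigvalsh(A))  # safe first stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)                # BB1 stepsize for next step
        x, g = x_new, g_new
    return x
```

For a strongly convex quadratic, `s @ y = s^T A s > 0`, so the BB1 stepsize is always well defined here.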

2009
Dilip Mali

It is well known that DC offset degrades the performance of analog adaptive filters. The effects of DC offset on LMS derivatives such as sign-data LMS, sign-error LMS and sign-sign LMS have been studied extensively, but those on the MLMS, VSSLMS and NLMS algorithms have remained relatively unexplored. The present paper reports the effects of DC offset on the LMS algorithm and its four variations: Sign LMS...
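As a hedged illustration of the mechanism (a generic simulation, not the paper's experiments), the sketch below adds a DC offset to the error signal of a sign-error LMS filter and compares the resulting misalignment with the offset-free case:

```python
import numpy as np

def sign_error_lms(x, d, taps=4, mu=0.005, dc=0.0):
    """Sign-error LMS system identification; 'dc' models a DC offset
    corrupting the analog error signal before its sign is taken."""
    w = np.zeros(taps)
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]   # most-recent-first regressor
        e = d[n] - w @ u + dc             # offset-corrupted error
        w += mu * np.sign(e) * u
    return w
```

Once the true error falls below the offset magnitude, `sign(e + dc)` stays stuck at `sign(dc)`, so the filter keeps taking full-size steps and the misalignment floor rises with `|dc|`.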
