Search results for: lms algorithm

Number of results: 757726  

Journal: Digital Signal Processing 1992
John F. Doherty Richard J. Mammone

Regression models are used in many areas of signal processing, e.g., spectral analysis and speech LPC, where block processing methods have typically been used to estimate the unknown coefficients. Iterative methods for adaptive estimation fall into two categories: the least-mean-square (LMS) algorithm and the recursive-least-squares (RLS) algorithm. The LMS algorithm offers low complexity and s...
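The LMS update described above is simple enough to sketch directly. The following is a minimal illustrative implementation (not the paper's exact formulation); the function name, signature, and default step size are assumptions:

```python
import numpy as np

def lms(x, d, n_taps, mu=0.05):
    """Least-mean-square adaptive filter: a minimal sketch.

    x : input signal, d : desired signal, mu : fixed step size (assumed).
    Returns the final tap estimate w and the a-priori error signal e.
    """
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]  # regressor: newest sample first
        e[n] = d[n] - w @ u                # a-priori estimation error
        w += mu * e[n] * u                 # stochastic-gradient update
    return w, e
```

Each iteration costs O(n_taps), which is the low-complexity advantage over RLS that the abstract contrasts.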

1998
Randolph H. Cabell William Baumann Ricardo Burdisso Richard Silcox Alfred Wicks

A principal component least mean square (PC-LMS) adaptive algorithm is described that has considerable benefits for large control systems used to implement feedforward control of single frequency disturbances. The algorithm is a transform domain version of the multichannel filtered-x LMS algorithm. The transformation corresponds to the principal components (PCs) of the transfer function matrix ...

Journal: IEEE Trans. Signal Processing 1999
Sudhakar Kalluri Gonzalo R. Arce

The normalized least mean square (NLMS) algorithm is an important variant of the classical LMS algorithm for adaptive linear filtering. It possesses many advantages over the LMS algorithm, including having a faster convergence and providing for an automatic time-varying choice of the LMS stepsize parameter that affects the stability, steady-state mean square error (MSE), and convergence speed o...
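The automatic time-varying step size mentioned above comes from dividing the LMS step by the instantaneous input power. A minimal sketch under that standard formulation (names and defaults are illustrative, not taken from the paper):

```python
import numpy as np

def nlms(x, d, n_taps, mu=0.5, eps=1e-8):
    """Normalized LMS: the LMS step is scaled by the inverse of the
    instantaneous input power, yielding a time-varying effective step."""
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # regressor vector
        e[n] = d[n] - w @ u                 # a-priori error
        w += mu * e[n] * u / (u @ u + eps)  # power-normalized update
    return w, e
```

The normalization makes convergence speed largely insensitive to the input signal's power, which is the key advantage over fixed-step LMS.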

Journal: CoRR 2016
Bijit Kumar Das Mrityunjoy Chakraborty

The sparsity-aware zero attractor least mean square (ZA-LMS) algorithm manifests much lower misadjustment in strongly sparse environment than its sparsity-agnostic counterpart, the least mean square (LMS), but is shown to perform worse than the LMS when sparsity of the impulse response decreases. The reweighted variant of the ZA-LMS, namely RZALMS shows robustness against this variation in spar...
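The zero attractor referred to above is an l1-penalty term added to the plain LMS update, pulling every tap toward zero so that inactive taps of a sparse response converge faster. A sketch of the standard ZA-LMS recursion (parameter values are assumptions for illustration):

```python
import numpy as np

def za_lms(x, d, n_taps, mu=0.05, rho=5e-4):
    """Zero-attractor LMS: plain LMS plus an l1-regularization term that
    attracts every coefficient toward zero (rho = attractor strength)."""
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]
        e[n] = d[n] - w @ u
        w += mu * e[n] * u - rho * np.sign(w)  # LMS term + zero attractor
    return w, e
```

The attractor term biases the nonzero taps slightly (by roughly rho/mu under unit input power), which is why ZA-LMS loses to plain LMS when the response stops being sparse, as the abstract notes.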

2014
César Lincoln C. Mattos José Daniel A. Santos Guilherme De A. Barreto

The Adaline network [1] is a classic neural architecture whose learning rule is the famous least mean squares (LMS) algorithm (a.k.a. delta rule or Widrow-Hoff rule). It has been demonstrated that the LMS algorithm is optimal in H∞ sense since it tolerates small (in energy) disturbances, such as measurement noise, parameter drifting and modelling errors [2,3]. Such optimality of the LMS algorit...

2007
Sudhakar Kalluri Gonzalo R. Arce

The Normalized Least Mean Square (NLMS) algorithm is an important variant of the classical LMS algorithm for adaptive linear filtering. It possesses many advantages over the LMS algorithm, including having a faster convergence and providing for an automatic time-varying choice of the LMS step-size parameter which affects the stability, steady-state mean square error (MSE) and convergence speed of...

2004
Panagiotis P. Mavridis

A new adaptive estimation algorithm is presented. It is the result of a combination of the LMS and the fast Newton transversal filters (FNTF) class. The main characteristic of the proposed algorithm is its improved convergence rate as compared to LMS, for cases where it is known that LMS behaves poorly. This improved characteristic is achieved at the expense of a slight increase in the computationa...

1997
Shivaling S. Mahant-Shetti Srinath Hosur Alan Gatherer

This paper describes a new variant of the least-mean-squares (LMS) algorithm, with low computational complexity, for updating an adaptive filter. The reduction in complexity is obtained by using values of the input data and the output error, quantized to the nearest power of two, to compute the gradient. This eliminates the need for multipliers or shifters in the algorithm's update section. The q...
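The quantization idea above can be sketched as follows: both factors of the gradient are rounded to the nearest power of two (in hardware the resulting multiply reduces to a bit-shift). This is an illustrative floating-point model of the technique, not the paper's implementation; all names and step sizes are assumptions:

```python
import numpy as np

def pow2(v, tiny=1e-12):
    """Quantize each entry to the nearest power of two, keeping its sign."""
    v = np.asarray(v, dtype=float)
    mag = np.maximum(np.abs(v), tiny)
    return np.sign(v) * 2.0 ** np.round(np.log2(mag))

def pow2_lms(x, d, n_taps, mu=0.05):
    """LMS with both the input data and the error quantized to powers of
    two, so each gradient product is a shift rather than a multiply."""
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]
        e[n] = d[n] - w @ u
        w += mu * pow2(e[n]) * pow2(u)  # power-of-two quantized gradient
    return w, e
```

Since each quantized factor stays within a factor of sqrt(2) of its true value and keeps its sign, the quantized gradient points in the right direction componentwise and the recursion still converges, only with a modified effective gain.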

Journal: EURASIP J. Adv. Sig. Proc. 2012
Azzedine Zerguine

Since both the least mean-square (LMS) and least mean-fourth (LMF) algorithms suffer individually from the problem of eigenvalue spread, so will the mixed-norm LMS-LMF algorithm. Therefore, to overcome this problem for the mixed-norm LMS-LMF, we are adopting here the same technique of normalization (normalizing with the power of the input) that was successfully used with the LMS and LMF separat...
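The normalization technique the abstract refers to can be sketched by applying the usual NLMS-style power scaling to a mixed LMS-LMF error term. This is a generic illustration, not Zerguine's exact recursion; the mixing parameter delta and all defaults are assumptions:

```python
import numpy as np

def nlms_lmf(x, d, n_taps, mu=0.2, delta=0.5, eps=1e-8):
    """Mixed-norm LMS-LMF update with the step normalized by the input
    power (delta in [0, 1] weights the LMS term against the LMF term)."""
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]
        e[n] = d[n] - w @ u
        g = delta * e[n] + (1 - delta) * e[n] ** 3  # mixed LMS/LMF error
        w += mu * g * u / (u @ u + eps)             # power normalization
    return w, e
```

Dividing by the input power tames the eigenvalue-spread sensitivity that both the LMS term (linear in e) and the LMF term (cubic in e) suffer from individually.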

2007
F. J. A. de Aquino

In this paper, we present the multi-split version of the widely linear LMS algorithm. As in conventional linear filtering, the multi-split transform increases the diagonalization factor of the composed autocorrelation and pseudoautocorrelation matrix of the improper input signal, and a power normalized and time-varying step-size LMS algorithm is used for updating the filter parameters. Simulati...

[Chart: number of search results per year]