Search results for: least mean squares lms algorithm

Number of results: 1627305

2012
Amjad Khan

In many applications of noise cancellation, the changes in signal characteristics can be quite fast. This requires adaptive algorithms that converge rapidly. Least Mean Squares (LMS) and Normalized Least Mean Squares (NLMS) adaptive filters have been used in a wide range of signal processing applications because of their simplicity in computation and implementation. The Recur...
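The LMS and NLMS update rules mentioned in this abstract are standard and can be sketched as follows. This is an illustrative implementation, not the code from any of the listed papers; the function name and parameters are chosen here for clarity.

```python
import numpy as np

def lms_filter(x, d, num_taps=8, mu=0.05, eps=1e-8, normalized=False):
    """Adaptive FIR filter trained with the LMS (or NLMS) update rule.

    x : reference input signal, d : desired signal.
    Returns the error signal e = d - y and the final weights w.
    """
    w = np.zeros(num_taps)                   # filter weights
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1 : n + 1][::-1]  # most recent taps, newest first
        y = w @ u                              # filter output
        e[n] = d[n] - y                        # instantaneous error
        if normalized:
            # NLMS: step size normalized by instantaneous input power,
            # which makes convergence insensitive to input scaling
            w += (mu / (eps + u @ u)) * e[n] * u
        else:
            # plain LMS: fixed step size mu
            w += mu * e[n] * u
    return e, w
```

For system identification with a noiseless desired signal, the NLMS variant drives the error toward zero much faster than plain LMS when the input power varies.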

2014
Shazia Javed, Noor Atinah Ahmad

An efficient and computationally linear algorithm is derived for the total least squares solution of the adaptive filtering problem, when both input and output signals are contaminated by noise. The proposed total least mean squares (TLMS) algorithm is designed by recursively computing an optimal solution of the adaptive TLS problem, minimizing the instantaneous value of a weighted cost function. Convergence a...

1999
H C So

In the presence of input interference, the Wiener solution for impulse response estimation is biased. In this Letter, it is proved that bias removal can be achieved by properly scaling the optimal filter coefficients, and a modified least mean squares (LMS) algorithm is then developed for accurate system identification in noise. Simulation results show that the proposed method outperforms two total leas...

2004
Min-Cheol Hong, Tania Stathaki, Aggelos K. Katsaggelos

In this paper, we propose an iterative mixed norm image restoration algorithm. A functional which combines the least mean squares (LMS) and the least mean fourth (LMF) functionals is proposed. A function of the kurtosis is used to determine the relative importance between the LMS and the LMF functionals. An iterative algorithm is utilized for obtaining a solution and its convergence is analy...

2001
Hyung-Min Park, Sang-Hoon Oh, Soo-Young Lee

Indexing terms: Adaptive signal processing, independent component analysis, adaptive noise cancelling. A method for adaptive noise cancelling based on independent component analysis (ICA) is presented. Although the conventional least-mean-squares (LMS) algorithm removes noise components based on second-order correlations, the proposed algorithm can exploit higher-order statistics. Experimental result...

Journal: Neural Networks: the official journal of the International Neural Network Society, 2003
Bernard Widrow, Max Kamenetsky

The statistical efficiency of a learning algorithm applied to the adaptation of a given set of variable weights is defined as the ratio of the quality of the converged solution to the amount of data used in training the weights. Statistical efficiency is computed by averaging over an ensemble of learning experiences. A high quality solution is very close to optimal, while a low quality solution...

1996
Constantine Kotropoulos, Ioannis Pitas, Maria Gabrani

Three novel adaptive multichannel L-filters based on marginal data ordering are proposed. They rely on well-known algorithms for the unconstrained minimization of the Mean Squared Error (MSE), namely the Least Mean Squares (LMS), the normalized LMS (NLMS), and the LMS-Newton (LMSN) algorithms. Performance comparisons in color image filtering have been made in both the RGB and U∗V∗W∗ color spaces. ...

2013
M. Deepika

Speech has always been one of the most important carriers of information for people, so it is a challenge to maintain its high quality. In many applications of noise cancellation, the changes in signal characteristics can be quite fast. This requires adaptive algorithms that converge rapidly. Least Mean Squares (LMS) and Normalized Least Mean Squares (NLMS) adaptive filte...

2014
César Lincoln C. Mattos, José Daniel A. Santos, Guilherme De A. Barreto

The Adaline network [1] is a classic neural architecture whose learning rule is the famous least mean squares (LMS) algorithm (a.k.a. the delta rule or Widrow-Hoff rule). It has been demonstrated that the LMS algorithm is optimal in the H∞ sense, since it tolerates small (in energy) disturbances such as measurement noise, parameter drifting, and modelling errors [2,3]. Such optimality of the LMS algorit...
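The Widrow-Hoff (delta) rule that trains an Adaline unit, as described in this abstract, is the LMS update applied sample by sample. A minimal sketch (the function name and parameters are illustrative, not from the paper):

```python
import numpy as np

def adaline_train(X, d, mu=0.01, epochs=50):
    """Train a single Adaline unit with the Widrow-Hoff (delta) rule.

    X : (n_samples, n_features) inputs, d : desired outputs.
    The per-sample update w <- w + mu * (d_n - w.x_n) * x_n is
    exactly the LMS algorithm applied to a linear neuron.
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_n, d_n in zip(X, d):
            err = d_n - w @ x_n   # instantaneous (a priori) error
            w += mu * err * x_n   # LMS / delta-rule weight update
    return w
```

With a noiseless linear target, the weights converge to the generating coefficients, illustrating why the rule minimizes the mean squared error.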

2004
Thomas Kailath

We obtain upper and lower bounds for the H∞ norm of the RLS (Recursive Least Squares) algorithm. The H∞ norm may be regarded as the worst-case energy gain from the disturbances to the prediction errors, and is therefore a measure of the robustness of an algorithm to perturbations and model uncertainty. Our results allow one to compare the robustness of RLS compared to the LMS (Least-Mean-Square...
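For contrast with the LMS sketches above, the standard exponentially weighted RLS recursion that this abstract analyzes can be outlined as follows. This is a textbook-style sketch with illustrative parameter names, not the paper's own code.

```python
import numpy as np

def rls_filter(x, d, num_taps=4, lam=0.99, delta=100.0):
    """Exponentially weighted Recursive Least Squares adaptive filter.

    lam   : forgetting factor (0 < lam <= 1)
    delta : scaling of the initial inverse-correlation matrix P
    Returns the a priori error signal e and the final weights w.
    """
    w = np.zeros(num_taps)
    P = delta * np.eye(num_taps)   # estimate of inverse input correlation
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1 : n + 1][::-1]  # input regressor, newest first
        Pu = P @ u
        k = Pu / (lam + u @ Pu)    # gain vector
        e[n] = d[n] - w @ u        # a priori prediction error
        w += k * e[n]              # weight update
        P = (P - np.outer(k, Pu)) / lam  # Riccati update of P
    return e, w
```

Unlike LMS, whose convergence depends on the input's eigenvalue spread, RLS converges in roughly as many samples as there are taps, at the cost of O(num_taps^2) work per sample.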
