On the regularization of forgetting recursive least square
Authors
Abstract
In this paper, the regularization effect of employing the forgetting recursive least square (FRLS) training technique on feedforward neural networks is studied. We derive our result from the corresponding equations for the expected prediction error and the expected training error. By comparing these error equations with those obtained previously for the weight decay method, we find that the FRLS technique has an effect identical to that of the simple weight decay method. This finding suggests that the FRLS technique is another on-line approach to realizing the weight decay effect. Moreover, we show that, under certain conditions, both the model complexity and the expected prediction error of a model trained by the FRLS technique are better than those of one trained by the standard RLS method.
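To make the recursion concrete, the following is a minimal sketch of the FRLS update for a linear-in-the-parameters model y = xᵀw; the paper applies FRLS to feedforward networks, so the linear case here is only meant to show the recursion itself, and the function and variable names are illustrative, not taken from the paper. With forgetting factor lam = 1 the recursion reduces to standard RLS, while lam < 1 exponentially down-weights old data, which is the mechanism the paper relates to simple weight decay.

```python
import numpy as np

def frls_train(X, d, lam=0.99, delta=100.0):
    """Forgetting recursive least square (FRLS) sketch for a linear model
    y = x^T w.  `lam` is the forgetting factor (0 < lam <= 1) and `delta`
    initializes the inverse correlation matrix P = delta * I."""
    n_samples, n_params = X.shape
    w = np.zeros(n_params)           # parameter estimate
    P = delta * np.eye(n_params)     # inverse of the weighted input correlation matrix
    for t in range(n_samples):
        x = X[t]
        e = d[t] - x @ w                      # a priori prediction error
        k = P @ x / (lam + x @ P @ x)         # gain vector
        w = w + k * e                         # parameter update
        P = (P - np.outer(k, x @ P)) / lam    # forgetting update of P
    return w

# Usage sketch: identify a noisy linear map with 5 parameters.
rng = np.random.default_rng(0)
w_true = rng.normal(size=5)
X = rng.normal(size=(500, 5))
d = X @ w_true + 0.1 * rng.normal(size=500)
w_hat = frls_train(X, d, lam=0.98)
```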
Similar articles
Dynamically Regularized Fast Recursive Least Squares
This paper introduces a dynamically regularized fast recursive least squares (DR-FRLS) adaptive filtering algorithm. Numerically stabilized FRLS algorithms exhibit reliable and fast convergence with low complexity even when the excitation signal is highly self-correlated. FRLS still suffers from instability, however, when the condition number of the implicit excitation sample covariance matrix ...
A Novel Forgetting Factor Recursive Least Square Algorithm Applied to the Human Motion Analysis
This paper is concerned with studying the forgetting factor of the recursive least squares (RLS) algorithm. A new dynamic forgetting factor (DFF) for the RLS algorithm is presented. The proposed DFF-RLS is compared to other methods. Better performance in convergence and in tracking a noisy chirp sinusoid is achieved. The control of the forgetting factor in DFF-RLS is based on the gradient of the inverse correlation...
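The excerpt above is truncated, so the authors' DFF rule is not reproduced here. Purely as a hedged illustration of the general idea, the sketch below drives the forgetting factor of a standard RLS recursion with a smoothed error-power heuristic (a large prediction error pushes lambda toward lam_min for faster tracking); the adaptation rule, parameter names, and constants are assumptions for illustration, not the DFF-RLS algorithm.

```python
import numpy as np

def vff_rls(X, d, lam_min=0.90, lam_max=0.999, alpha=10.0, delta=100.0):
    """RLS with an illustrative variable forgetting factor: lambda shrinks
    toward lam_min when the smoothed a priori error power grows."""
    n_samples, n_params = X.shape
    w = np.zeros(n_params)
    P = delta * np.eye(n_params)
    e_pow = 0.0                          # smoothed a priori error power
    lam_hist = np.empty(n_samples)
    for t in range(n_samples):
        x = X[t]
        e = d[t] - x @ w
        e_pow = 0.95 * e_pow + 0.05 * e * e
        lam = lam_min + (lam_max - lam_min) * np.exp(-alpha * e_pow)
        k = P @ x / (lam + x @ P @ x)
        w = w + k * e
        P = (P - np.outer(k, x @ P)) / lam
        lam_hist[t] = lam                # record the trajectory of lambda
    return w, lam_hist
```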
Low Complexity and High Speed in Leading DCD ERLS Algorithm
Adaptive algorithms adjust the system coefficients based on the measured data. This paper presents a dichotomous coordinate descent (DCD) method to reduce the computational complexity and to improve the tracking ability, based on a variable forgetting factor, when the system undergoes frequent changes. Vedic mathematics is used to implement the multiplier and the divider in the VFF equatio...
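For context on the DCD part of the excerpt above, the sketch below shows a generic "leading" dichotomous coordinate descent iteration for the normal equations R w = beta that arise in RLS: each update picks the coordinate with the largest residual and changes it by a power-of-two step, which is what lets hardware implementations replace multiplications with bit shifts. The parameters H, Mb, and Nu follow common DCD descriptions but are assumptions here; the paper's Vedic multiplier/divider and its VFF details are not modeled.

```python
import numpy as np

def leading_dcd(R, beta, H=1.0, Mb=16, Nu=8):
    """Illustrative leading DCD solver for R w = beta (R symmetric positive
    definite).  H bounds the solution amplitude, Mb is the number of bit
    levels, Nu caps the number of coordinate updates."""
    n = len(beta)
    w = np.zeros(n)
    r = beta.copy()            # residual r = beta - R w
    d = H                      # current power-of-two step size
    m = 0                      # bit-level counter
    for _ in range(Nu):
        k = int(np.argmax(np.abs(r)))            # leading coordinate
        # Halve the step until it is small enough to act on r[k].
        while abs(r[k]) <= (d / 2) * R[k, k]:
            m += 1
            d /= 2
            if m > Mb:
                return w, r                      # step size exhausted
        s = np.sign(r[k])
        w[k] += s * d                            # power-of-two coordinate update
        r -= s * d * R[:, k]                     # keep the residual consistent
    return w, r
```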
Variable forgetting factor mechanisms for diffusion recursive least squares algorithm in sensor networks
In this work, we present low-complexity variable forgetting factor (VFF) techniques for diffusion recursive least squares (DRLS) algorithms. In particular, we propose low-complexity VFF-DRLS algorithms for distributed parameter and spectrum estimation in sensor networks. The proposed algorithms can adjust the forgetting factor automatically according to the a posteriori error signal. We ...
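To illustrate only the diffusion structure mentioned above (a forgetting-factor adaptation such as the earlier sketch could be plugged in), below is a minimal adapt-then-combine diffusion RLS step: every node runs a local FRLS update on its own data and then mixes its neighbours' intermediate estimates with fixed combination weights. The combination matrix, the fixed lambda, and the function names are illustrative assumptions, not the proposed VFF-DRLS algorithm.

```python
import numpy as np

def atc_diffusion_rls(X_nodes, d_nodes, A, lam=0.98, delta=100.0):
    """Adapt-then-combine diffusion RLS sketch.
    X_nodes[k] : (T, M) regressors observed at node k
    d_nodes[k] : (T,)  desired signal at node k
    A          : (K, K) combination matrix with columns summing to 1;
                 A[l, k] is the weight node k gives to neighbour l."""
    K = len(X_nodes)
    T, M = X_nodes[0].shape
    w = np.zeros((K, M))                              # combined estimates
    P = np.array([delta * np.eye(M) for _ in range(K)])
    for t in range(T):
        psi = np.empty_like(w)
        # Adapt: local FRLS update at every node using its own data.
        for k in range(K):
            x, dk = X_nodes[k][t], d_nodes[k][t]
            e = dk - x @ w[k]
            g = P[k] @ x / (lam + x @ P[k] @ x)
            psi[k] = w[k] + g * e
            P[k] = (P[k] - np.outer(g, x @ P[k])) / lam
        # Combine: convex mixture of the neighbours' intermediate estimates.
        for k in range(K):
            w[k] = sum(A[l, k] * psi[l] for l in range(K))
    return w
```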
Gradient based variable forgetting factor RLS algorithm
The recursive least squares (RLS) algorithm is well known for its good convergence properties and small mean square error in stationary environments. However, RLS with a constant forgetting factor cannot provide satisfactory performance in time-varying environments. In this seminar, three variable forgetting factor (VFF) adaptation schemes for RLS are presented in order to improve the tracking perf...
Journal: IEEE Transactions on Neural Networks
Volume 10, Issue 6
Pages: -
Publication date: 1999