Search results for: mean squares error
Number of results: 833068
The statistical literature offers several methods for coping with multicollinearity. This paper introduces a new shrinkage estimator, called the modified unbiased ridge (MUR) estimator. This estimator is obtained from unbiased ridge regression (URR) in the same way that ordinary ridge regression (ORR) is obtained from ordinary least squares (OLS). Properties of MUR are derived. Results on its matrix mean squared er...
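For orientation, a minimal sketch (synthetic data, assumed ridge constant k) of how ordinary ridge regression is obtained from OLS by adding k·I to X'X before solving; the MUR/URR construction described above follows the same pattern but is not reproduced here:

    import numpy as np

    # Minimal sketch: ordinary ridge regression (ORR) is obtained from OLS by
    # adding k*I to X'X before solving. Data and k below are purely illustrative.
    rng = np.random.default_rng(0)
    n, p = 50, 3
    X = rng.normal(size=(n, p))
    X[:, 2] = X[:, 1] + 0.01 * rng.normal(size=n)      # near-collinear columns
    beta_true = np.array([1.0, 2.0, -1.0])
    y = X @ beta_true + rng.normal(scale=0.5, size=n)

    k = 0.1                                            # assumed ridge constant
    beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
    beta_orr = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
    print("OLS :", beta_ols)
    print("ORR :", beta_orr)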
Widrow proposed the least mean squares (LMS) algorithm, which has been extensively applied in adaptive signal processing and adaptive control. The LMS algorithm is based on the minimum mean square error criterion. On the basis of the total least mean squares error, or equivalently the minimum Rayleigh quotient, we propose the total least mean squares (TLMS) algorithm. The paper gives the statistical analysis for this ...
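As a point of reference, a minimal sketch of the standard LMS update that the abstract starts from (w <- w + mu*e*u, minimizing mean square error); the TLMS algorithm itself, based on the Rayleigh quotient, is not reproduced here. The filter length, step size mu, and signals are illustrative assumptions:

    import numpy as np

    # Minimal LMS sketch: adaptively identify a 4-tap FIR system.
    rng = np.random.default_rng(1)
    h_true = np.array([0.5, -0.3, 0.2, 0.1])           # unknown system (illustrative)
    x = rng.normal(size=2000)
    d = np.convolve(x, h_true)[: len(x)] + 0.01 * rng.normal(size=len(x))

    w = np.zeros(4)
    mu = 0.01                                          # fixed step size (assumed)
    for n in range(3, len(x)):
        u = x[n - 3 : n + 1][::-1]                     # x[n], x[n-1], x[n-2], x[n-3]
        e = d[n] - w @ u                               # a priori error
        w = w + mu * e * u                             # LMS update
    print(w)                                           # should approach h_true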
In this paper, the effect of misspecification due to omission of relevant variables on the dominance of the r-(k, d) class estimator proposed by Özkale (2012) over the ordinary least squares (OLS) estimator and some other competing estimators, when some of the regressors in the linear regression model are correlated, has been studied with respect to the mean squared error criterion. A simulat...
A nonlinear regression model with correlated, normally distributed errors is investigated. The bias and the mean square error matrix of the approximate least squares estimator of regression parameters are derived and their limit properties are studied.
Ridge regression is often favored in the analysis of ill-conditioned systems. A canonical form identifies regions in the parameter space where Ordinary Least Squares (OLS) is problematic. The objectives are two-fold: to reexamine the view that ill-conditioning necessarily degrades the essentials of OLS, and to reassess the ranges of the ridge parameter k where ridge is efficient in mean squared error (...
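As an illustration of the kind of comparison implied (not the canonical-form analysis itself), a rough Monte Carlo sketch of the total mean squared error of the ridge estimator over a few values of k, where k = 0 corresponds to OLS; the design, coefficients, noise level, and k grid are all assumed:

    import numpy as np

    # Rough sketch: Monte Carlo estimate of total MSE of the ridge estimator as a
    # function of k for one ill-conditioned design; k = 0 corresponds to OLS.
    rng = np.random.default_rng(2)
    n, p = 30, 3
    X = rng.normal(size=(n, p))
    X[:, 2] = X[:, 0] + 0.05 * rng.normal(size=n)      # ill-conditioning
    beta = np.array([1.0, 1.0, 1.0])

    def total_mse(k, reps=2000):
        A = np.linalg.inv(X.T @ X + k * np.eye(p)) @ X.T
        sq_err = 0.0
        for _ in range(reps):
            y = X @ beta + rng.normal(size=n)
            sq_err += np.sum((A @ y - beta) ** 2)
        return sq_err / reps

    for k in (0.0, 0.01, 0.1, 0.5, 1.0):
        print(f"k = {k:<4}  MSE ~ {total_mse(k):.3f}")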
This study is about the development of a robust ridge regression estimator. It is based on the weighted ridge MM-estimator (WRMM) and is believed to have potential for remedying the problems of multicollinearity. The proposed method has been compared with several existing estimators, namely ordinary least squares (OLS), robust regression based on the MM estimator, ridge regression (RIDGE), weighted rid...
Commonly used sums-of-squares-based error or deviation statistics, like the standard deviation, the standard error, the coefficient of variation, and the root-mean-square error, often are misleading indicators of average error or variability. Sums-of-squares-based statistics are functions of at least two dissimilar patterns that occur within data. Both the mean of a set of error or deviation magn...
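A tiny numeric illustration (made-up values) of the point being made: two error sets with the same mean absolute error can have very different RMSE, because the squared-error statistic also responds to the spread among the error magnitudes:

    import numpy as np

    # Two error sets with identical mean absolute error but different variability
    # among the magnitudes; RMSE separates them, MAE does not. Values are made up.
    e_uniform = np.array([1.0, 1.0, 1.0, 1.0])
    e_skewed = np.array([0.1, 0.1, 0.1, 3.7])          # same mean magnitude (1.0)
    for name, e in [("uniform", e_uniform), ("skewed", e_skewed)]:
        mae = np.mean(np.abs(e))
        rmse = np.sqrt(np.mean(e ** 2))
        print(f"{name:8s}  MAE = {mae:.2f}  RMSE = {rmse:.2f}")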
An improved robust variable step-size least mean square (LMS) algorithm is developed in this paper. Unlike many existing approaches, we adjust the variable step-size using a quotient form of filtered versions of the quadratic error. The filtered estimates of the error are based on exponential windows, applying different decaying factors for the estimations in the numerator and denominator. The ...
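A hedged sketch of a variable step-size LMS in this spirit: the step size is scaled by a quotient of two exponentially windowed estimates of the squared error with different decay factors. The decay factors, step-size bounds, and signals below are assumptions, not the values used in the paper:

    import numpy as np

    # Sketch of a quotient-form variable step-size LMS (assumed parameters):
    # two exponential windows with different decay factors track the squared
    # error; their ratio scales the step size within [mu_min, mu_max].
    rng = np.random.default_rng(3)
    h_true = np.array([0.7, -0.2, 0.1])
    x = rng.normal(size=3000)
    d = np.convolve(x, h_true)[: len(x)] + 0.05 * rng.normal(size=len(x))

    w = np.zeros(3)
    num = den = 1e-6                                   # filtered e^2 estimates
    a_num, a_den = 0.90, 0.99                          # fast vs slow windows (assumed)
    mu_min, mu_max = 1e-4, 0.05
    for n in range(2, len(x)):
        u = x[n - 2 : n + 1][::-1]                     # x[n], x[n-1], x[n-2]
        e = d[n] - w @ u
        num = a_num * num + (1 - a_num) * e * e        # fast-decaying estimate
        den = a_den * den + (1 - a_den) * e * e        # slow-decaying estimate
        mu = float(np.clip(mu_max * num / den, mu_min, mu_max))
        w = w + mu * e * u                             # LMS update with variable mu
    print(w)                                           # should approach h_true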
A fundamental relationship exists between the quality of an adaptive solution and the amount of data used in obtaining it. Quality is defined here in terms of “misadjustment,” the ratio of the excess mean square error (mse) in an adaptive solution to the minimum possible mse. The higher the misadjustment, the lower the quality is. The quality of the exact least squares solution is compared wit...
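For concreteness, the definition as stated, with illustrative numbers:

    # Misadjustment as defined above: excess steady-state MSE over the minimum
    # achievable MSE, divided by that minimum. The two MSE values are made up.
    mse_min = 0.010      # minimum possible MSE (Wiener solution), assumed
    mse_steady = 0.012   # measured steady-state MSE of the adaptive filter, assumed
    misadjustment = (mse_steady - mse_min) / mse_min
    print(round(misadjustment, 3))   # 0.2, i.e. 20% excess MSE over the minimum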
Autoregressive (AR) modeling is widely used in signal processing. The coefficients of an AR model can be easily obtained with a least mean square (LMS) prediction error filter. However, it is known that this filter gives a biased solution when the input signal is corrupted by white Gaussian noise. Treichler suggested the γ-LMS algorithm to remedy this problem and proved that the mean weight vect...
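A hedged sketch of the setting described: estimating AR(2) coefficients with an LMS one-step linear predictor, and observing the bias that appears when white noise is added to the observed series (the issue the modified algorithm targets). All parameter values are illustrative:

    import numpy as np

    # Sketch: estimate AR(2) coefficients with an LMS one-step linear predictor.
    # With white noise added to the observed series, the LMS solution is biased
    # toward zero. All parameter values here are illustrative.
    rng = np.random.default_rng(4)
    a = np.array([1.5, -0.8])                          # true AR(2) coefficients
    N = 20000
    s = np.zeros(N)
    for n in range(2, N):
        s[n] = a[0] * s[n - 1] + a[1] * s[n - 2] + rng.normal(scale=0.1)

    def lms_ar2(x, mu=0.05):
        w = np.zeros(2)
        for n in range(2, len(x)):
            u = np.array([x[n - 1], x[n - 2]])
            e = x[n] - w @ u                           # one-step prediction error
            w = w + mu * e * u                         # LMS update
        return w

    print("clean input:", lms_ar2(s))                  # close to [1.5, -0.8]
    print("noisy input:", lms_ar2(s + rng.normal(scale=0.1, size=N)))  # biased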