Search results for: least squares weighted residual method

Number of results: 2,100,551

Journal: :IEEE Trans. Signal Processing 1998
Yasemin Yardimci, A. Enis Çetin, James A. Cadzow

In this correspondence, a nonlinearly weighted least-squares method is developed for robust modeling of sensor array data. Weighting functions for various observation noise scenarios are determined using maximum likelihood estimation theory. Computational complexity of the new method is comparable with the standard least-squares estimation procedures. Simulation examples of direction-of-arrival...
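The weighting idea this abstract describes can be sketched with iteratively reweighted least squares, where the weight function is matched to an assumed noise model. This is an illustrative sketch only (a Laplacian-noise weighting is assumed here), not the paper's exact algorithm:

```python
import numpy as np

def irls(A, y, n_iter=20, eps=1e-6):
    """Nonlinearly weighted least squares via iterative reweighting.

    Assumes heavy-tailed (Laplacian) observation noise, for which the
    maximum-likelihood weight on each residual r_i is 1/|r_i|.
    """
    x = np.linalg.lstsq(A, y, rcond=None)[0]      # ordinary LS start
    for _ in range(n_iter):
        r = y - A @ x                             # current residuals
        w = 1.0 / np.maximum(np.abs(r), eps)      # ML weights, Laplacian model
        x = np.linalg.solve(A.T @ (w[:, None] * A), A.T @ (w * y))
    return x

# Synthetic sensor-style data with gross outliers
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 2))
x_true = np.array([2.0, -1.0])
y = A @ x_true + rng.laplace(scale=0.1, size=100)
y[::10] += 5.0                                    # inject gross outliers
x_hat = irls(A, y)
```

Because the weights downweight large residuals, the estimate is far less sensitive to the injected outliers than an ordinary least-squares fit, while each iteration costs only one weighted normal-equations solve.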

Journal: :Journal of Statistical Computation and Simulation 2016

Journal: :International Journal of Fuzzy Systems 2021

In the classical leave-one-out procedure for outlier detection in regression analysis, we exclude an observation and then construct a model on the remaining data. If the difference between the predicted and observed values is high, we declare this observation an outlier. As a rule, such procedures utilize single-comparison testing. The problem becomes much harder when observations can be associated with a given degree of membership ...
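The classical (crisp) leave-one-out procedure described above can be sketched as follows; the threshold rule and helper name are illustrative assumptions, not from the paper:

```python
import numpy as np

def loo_outliers(X, y, threshold=3.0):
    """Flag observations whose held-out prediction error is large.

    For each i: refit least squares without observation i, predict y_i,
    and flag i when |error| exceeds `threshold` times the error scale.
    (Hypothetical helper; the threshold rule is an assumption.)
    """
    n = len(y)
    errs = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        beta = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]
        errs[i] = y[i] - X[i] @ beta              # held-out prediction error
    return np.abs(errs) > threshold * np.std(errs)

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=50)
y[7] += 4.0                                       # inject one outlier
flags = loo_outliers(X, y)
```

This is the single-comparison version the abstract contrasts with the fuzzy-membership setting, where each observation additionally carries a degree of membership.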

Journal: :Automatica 1995
Zhuquan Zang, Robert R. Bitmead, Michel Gevers

Abstract: Many practical applications of control system design based on input-output measurements permit the repeated application of a system identification procedure operating on closed-loop data together with successive refinements of the designed controller. Here we develop a paradigm for such an iterative design. The key to the procedure is to account for evaluated modelling error in the con...

Journal: :Computational Statistics & Data Analysis 2006
Wim P. Krijnen

Several models in data analysis are estimated by minimizing the objective function defined as the residual sum of squares between the model and the data. A necessary and sufficient condition for the existence of a least squares estimator is that the objective function attains its infimum at a unique point. It is shown that the objective function for Parafac-2 need not attain its infimum, and tha...
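The non-attainment phenomenon the abstract points to can be illustrated with a much simpler model than Parafac-2 (a hedged toy example, not the paper's case): fitting y_i ≈ exp(-b·x_i) to all-zero data drives b toward infinity, so the residual sum of squares approaches its infimum 0 without any finite b achieving it.

```python
import numpy as np

# Toy illustration: an RSS objective whose infimum (0) is not attained.
x = np.array([1.0, 2.0, 3.0])
y = np.zeros(3)                        # data exactly zero

def rss(b):
    """Residual sum of squares for the model y_i ~ exp(-b * x_i)."""
    return np.sum((y - np.exp(-b * x)) ** 2)

# RSS decreases monotonically toward 0 as b grows, but is positive
# for every finite b: no least squares estimator exists.
values = [rss(b) for b in (1.0, 10.0, 100.0)]
```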

2010
Hector Corrada Bravo, Rafael A. Irizarry

Y = β0 + (β1 + β2)X1 + ε, and we may get a good estimate of Y estimating 2 parameters instead of 3. Our estimate will be a bit biased, but we may lower our variance considerably, creating an estimate with smaller expected prediction error than the least squares estimate. We won't be able to interpret the estimated parameter, but our prediction may be good. In subset selection regression we select a ...
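The collapse from 3 parameters to 2 described above can be sketched directly (an illustrative simulation; the data-generating values are assumptions): when X1 and X2 are nearly collinear, regressing on X1 alone recovers the sum β1 + β2, not either coefficient individually.

```python
import numpy as np

# Two nearly collinear predictors: beta1*X1 + beta2*X2 ~ (beta1+beta2)*X1.
rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)            # almost identical to x1
y = 1.0 + 0.5 * x1 + 0.5 * x2 + 0.1 * rng.normal(size=n)

X_full = np.column_stack([np.ones(n), x1, x2])  # 3-parameter model
X_red = np.column_stack([np.ones(n), x1])       # 2-parameter model
beta_full = np.linalg.lstsq(X_full, y, rcond=None)[0]
beta_red = np.linalg.lstsq(X_red, y, rcond=None)[0]
# beta_red[1] estimates beta1 + beta2, i.e. about 1.0; the individual
# coefficients are no longer interpretable, but predictions stay good.
```

The reduced fit trades a little bias for a large variance reduction: with x1 and x2 this correlated, the 3-parameter estimates of β1 and β2 individually are extremely noisy, while their sum is well determined.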
