Asymptotically optimal differenced estimators of error variance in nonparametric regression

Authors

  • WenWu Wang
  • Ping Yu
Abstract

The existing differenced estimators of error variance in nonparametric regression are interpreted as kernel estimators, and some requirements for a "good" estimator of error variance are specified. A new differenced method is then proposed that estimates the errors as the intercepts in a sequence of simple linear regressions and constructs a variance estimator based on the estimated errors. The new estimator satisfies the requirements for a "good" estimator and achieves the asymptotically optimal mean square error. A feasible difference order is also derived, which makes the estimator more applicable. To improve the finite-sample performance, two bias-corrected versions are further proposed. All three estimators are equivalent to certain local polynomial estimators and thus can be interpreted as kernel estimators. To determine which of the three estimators should be used in practice, a rule of thumb is provided through an analysis of the mean square error, which resolves an open problem in error variance estimation: which difference sequence should be used in finite samples. Simulation studies and a real data application corroborate the theoretical results and illustrate the advantages of the new method compared with existing methods. © 2016 Elsevier B.V. All rights reserved.
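To make the construction described above concrete, the following is a rough sketch, not the paper's exact estimator: each error is estimated as the intercept of a local simple linear regression on differences, and the variance is the average of the squared estimated errors. The window size m, the unweighted least-squares fit, and the absence of any bias correction are illustrative assumptions.

```python
import numpy as np

def intercept_error_variance(x, y, m=5):
    """Sketch of an intercept-based differenced variance estimator.

    For each design point i, regress the differences y[i] - y[j] on
    x[i] - x[j] over the (up to) 2*m nearest neighbours j != i.  The
    fitted intercept is taken as an estimate of the error eps_i, and
    the variance is estimated by the mean of the squared intercepts.
    Window size m and the plain least-squares fit are illustrative
    choices, not the paper's optimal construction.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = y.size
    eps_hat = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - m), min(n, i + m + 1)
        j = np.r_[lo:i, i + 1:hi]              # neighbours, excluding i
        dx = x[i] - x[j]
        dy = y[i] - y[j]
        X = np.column_stack([np.ones_like(dx), dx])
        beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
        eps_hat[i] = beta[0]                   # intercept ~ eps_i
    return np.mean(eps_hat ** 2)

# toy example: smooth trend plus noise with sd = 0.5 (true variance 0.25)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 500)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.5, size=x.size)
print(intercept_error_variance(x, y))  # roughly 0.27: above 0.25, since no bias correction
```

With a smooth mean function, y_i - y_j behaves like eps_i - eps_j plus a nearly linear term in x_i - x_j, so the fitted intercept tracks eps_i; the neighbouring errors inflate the intercept's variance, which is why bias-corrected versions matter in finite samples.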


Similar articles

Optimal variance estimation based on lagged second-order difference in nonparametric regression

Differenced estimators of variance bypass the estimation of the regression function and thus are simple to calculate. However, two problems remain: most differenced estimators do not achieve the asymptotically optimal rate for the mean square error, and for finite samples the estimation bias is also important but has not been further considered. In this paper, we estimate the variance as the intercept in a lin...
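For context on the class of estimators discussed above, the simplest member is the lag-k differenced estimator, which uses only differences of neighbouring responses and no fit of the regression function; with k = 1 it is Rice's (1984) first-order estimator. A minimal sketch:

```python
import numpy as np

def lag_k_difference_variance(y, k=1):
    """Classical lag-k differenced variance estimator.

    sigma2_hat = sum_i (y[i+k] - y[i])^2 / (2 * (n - k))

    No estimate of the regression function is needed, only differences of
    responses at design points k positions apart; k = 1 gives Rice's estimator.
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    d = y[k:] - y[:-k]
    return np.sum(d ** 2) / (2.0 * (n - k))

# usage: sigma2_hat = lag_k_difference_variance(y, k=1)
```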


Estimating the error variance in nonparametric regression by a covariate-matched U-statistic

For nonparametric regression models with fixed and random design, two classes of estimators for the error variance have been introduced: second sample moments based on residuals from a nonparametric fit, and difference-based estimators. The former are asymptotically optimal but require estimating the regression function; the latter are simple but have larger asymptotic variance. For nonparametr...


Differenced-Based Double Shrinking in Partial Linear Models

The partial linear model is very flexible, allowing the relation between the covariates and the response to be either parametric or nonparametric. However, estimation of the regression coefficients is challenging, since the nonparametric component must be estimated simultaneously. As a remedy, the differencing approach, which eliminates the nonparametric component before estimating the regression coefficients, can ...
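A minimal sketch of the differencing idea mentioned above, in a Yatchew-style form for the partial linear model y = X beta + g(t) + eps: sort by the scalar nonparametric covariate t, take first differences to remove g, and run ordinary least squares on the differenced data. First-order differences and a single smooth covariate are simplifying assumptions, not the shrinkage estimator studied in that article.

```python
import numpy as np

def plm_difference_fit(y, X, t):
    """Difference-based estimation in y = X @ beta + g(t) + eps.

    Sorting by t and first-differencing removes g(t) up to a small
    smoothing error (g(t_(i)) - g(t_(i-1)) -> 0 for smooth g), so beta
    can be estimated by OLS on the differences; the residual sum of
    squares, scaled by 2, also gives a rough error-variance estimate.
    """
    order = np.argsort(t)
    y_s = np.asarray(y, float)[order]
    X_s = np.asarray(X, float)[order]
    dy = np.diff(y_s)
    dX = np.diff(X_s, axis=0)
    beta, *_ = np.linalg.lstsq(dX, dy, rcond=None)
    resid = dy - dX @ beta
    sigma2_hat = np.sum(resid ** 2) / (2.0 * dy.size)  # Var(eps_i - eps_{i-1}) = 2 sigma^2
    return beta, sigma2_hat
```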


Relative Errors of Difference-Based Variance Estimators in Nonparametric Regression

Difference-based estimators for the error variance are popular since they do not require estimation of the mean function. Unlike most existing difference-based estimators, the estimators proposed by Müller et al. (2003) and Tong and Wang (2005) achieve the asymptotically optimal rate of residual-based estimators. In this article, we study the relative errors of these difference-based estimator...
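A simplified version of the Tong and Wang (2005) idea referenced above: compute the lag-k difference statistics s_k, regress them on d_k = (k/n)^2, which matches the leading bias term under an equally spaced design, and read off the fitted intercept as the variance estimate. The unweighted fit and the default number of lags below are simplifications; the published estimator uses a weighted fit and its own lag choice.

```python
import numpy as np

def tong_wang_style_variance(y, max_lag=None):
    """Unweighted sketch of a Tong-and-Wang (2005)-style estimator.

    s_k = sum_i (y[i+k] - y[i])^2 / (2 * (n - k)) is roughly
    sigma^2 + J * (k/n)^2 for a smooth mean on an equally spaced design,
    so regressing s_k on d_k = (k/n)^2 and taking the intercept removes
    the leading bias.  Lag range and OLS fit are illustrative choices.
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    if max_lag is None:
        max_lag = max(2, int(n ** 0.5))        # illustrative default
    ks = np.arange(1, max_lag + 1)
    s = np.array([np.sum((y[k:] - y[:-k]) ** 2) / (2.0 * (n - k)) for k in ks])
    d = (ks / n) ** 2
    X = np.column_stack([np.ones_like(d), d])
    beta, *_ = np.linalg.lstsq(X, s, rcond=None)
    return beta[0]                              # intercept ~ sigma^2
```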


Optimal Difference-based Variance Estimation in Heteroscedastic Nonparametric Regression

Estimating the residual variance is an important question in nonparametric regression. Among the existing estimators, the optimal difference-based variance estimator proposed in Hall, Kay, and Titterington (1990) is widely used in practice. Their method is restricted to the case of independent and identically distributed errors. In this paper, we propose the optimal difference-b...
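The Hall-Kay-Titterington construction referenced above applies a fixed difference sequence in a sliding window. A minimal homoscedastic sketch follows; the order-2 weights are quoted approximately from the literature rather than derived here, and the heteroscedastic extension proposed in that article is not attempted.

```python
import numpy as np

def difference_sequence_variance(y, d=(0.8090, -0.5000, -0.3090)):
    """Difference-sequence variance estimator: mean of (sum_j d_j * y[i+j])^2.

    The weights satisfy sum(d) = 0 and sum(d**2) = 1, so a locally constant
    mean is removed while the error-variance scale is preserved
    (E[(sum_j d_j eps_{i+j})^2] = sigma^2).  The default weights are
    approximately the optimal order-2 sequence tabulated in
    Hall, Kay, and Titterington (1990).
    """
    y = np.asarray(y, dtype=float)
    d = np.asarray(d, dtype=float)
    m = d.size - 1
    # sliding windows: row i holds (y[i], ..., y[i+m])
    windows = np.lib.stride_tricks.sliding_window_view(y, m + 1)
    return np.mean((windows @ d) ** 2)
```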



Journal title:
  • Computational Statistics & Data Analysis

Volume 105, Issue -

Pages -

Publication date: 2017