A Bootstrap Variant of AIC for State-Space Model Selection

Authors

  • Joseph E. Cavanaugh
  • Robert H. Shumway
Abstract

Following the recent work of Hurvich and Tsai (1989, 1991, 1993) and Hurvich, Shumway, and Tsai (1990), we propose a corrected variant of AIC developed for the purpose of small-sample state-space model selection. Our variant of AIC uses bootstrapping in the state-space framework (Stoffer and Wall 1991) to provide an estimate of the expected Kullback-Leibler discrepancy between the model generating the data and a fitted approximating model. We present simulation results demonstrating that in small-sample settings, our criterion estimates the expected discrepancy with less bias than traditional AIC and certain other competitors. As a result, our AIC variant serves as an effective tool for selecting a model of appropriate dimension. An asymptotic justification for our criterion is given in the Appendix.
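The bootstrap bias-correction idea behind such criteria can be sketched for a toy Gaussian model. The sketch below follows the generic EIC-style construction (bootstrap estimate of the optimism of the maximized log-likelihood), not the state-space innovations bootstrap of Stoffer and Wall that the paper itself employs; all function names and the choice of model are illustrative assumptions.

```python
import numpy as np

def log_lik(theta, y):
    # Gaussian log-likelihood with mean mu and variance sigma2
    mu, sigma2 = theta
    n = len(y)
    return -0.5 * n * np.log(2 * np.pi * sigma2) - 0.5 * np.sum((y - mu) ** 2) / sigma2

def fit(y):
    # Maximum likelihood estimates for the Gaussian model
    return y.mean(), y.var()

def bootstrap_aic(y, n_boot=200, rng=None):
    # EIC-style criterion: -2 log L(theta_hat; y) + 2 * bootstrap bias estimate.
    # The bias term estimates how much a model fit to a resample overstates
    # its own fit relative to fresh data -- the quantity AIC approximates
    # analytically by the number of parameters.
    rng = np.random.default_rng(rng)
    theta_hat = fit(y)
    bias = 0.0
    for _ in range(n_boot):
        y_star = rng.choice(y, size=len(y), replace=True)  # nonparametric resample
        theta_star = fit(y_star)
        # fit to the resample scores better on the resample than on the data
        bias += log_lik(theta_star, y_star) - log_lik(theta_star, y)
    bias /= n_boot
    return -2 * log_lik(theta_hat, y) + 2 * bias
```

For this two-parameter model the bootstrap bias estimate should land near 2, roughly recovering the AIC penalty; the payoff of the bootstrap version comes in small samples and more complex models, where the asymptotic penalty is too optimistic.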


Related articles

Bootstrap Estimate of Kullback-Leibler Information for Model Selection

Estimation of Kullback-Leibler information is a crucial part of deriving a statistical model selection procedure which, like AIC, is based on the likelihood principle. To discriminate between nested models, we have to estimate Kullback-Leibler information up to the order of a constant, while Kullback-Leibler information itself is of the order of the number of observations. A correction term empl...



Asymptotic bootstrap corrections of AIC for linear regression models

The Akaike information criterion, AIC, and its corrected version, AICc, are two methods for selecting normal linear regression models. Both criteria were designed as estimators of the expected Kullback-Leibler information between the model generating the data and the approximating candidate model. In this paper, two new corrected variants of AIC are derived for the purpose of small-sample linear...


Bootstrapping Likelihood for Model Selection with Small Samples

Akaike's Information Criterion (AIC), derived from the asymptotics of the maximum likelihood estimator, is widely used in model selection. However, it has a finite-sample bias which produces overfitting in linear regression. To deal with this problem, Ishiguro et al. proposed a bootstrap-based extension to AIC which they call EIC. In this report we compare the model selection performance of AIC, EIC, a bootstrap smo...


An Akaike Information Criterion for Model Selection in the Presence of Incomplete Data

We derive and investigate a variant of AIC, the Akaike information criterion, for model selection in settings where the observed data is incomplete. Our variant is based on the motivation provided for the PDIO ("predictive divergence for incomplete observation models") criterion of Shimodaira; our variant differs from PDIO in its "goodness-of-fit" term. Unlike AIC and PDIO, which require the computatio...




Publication year: 1997