Search results for: bic criteria
Number of results: 259823. Filter results by year:
It is well known that constrained Hebbian self-organization on multiple linear neural units leads to the same k-dimensional subspace spanned by the first k principal components. Not only has the batch PCA algorithm been widely applied in various fields since the 1930s, but a variety of adaptive algorithms have also been proposed in the past two decades. However, most studies assume a known dimensio...
It is well known that AIC and BIC have different properties in model selection. BIC is consistent in the sense that if the true model is among the candidates, the probability of selecting the true model approaches 1. On the other hand, AIC is minimax-rate optimal for both parametric and nonparametric cases for estimating the regression function. There are several successful results on construct...
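The contrast drawn in this abstract rests on the standard definitions AIC = 2k − 2 ln L and BIC = k ln n − 2 ln L, where BIC's heavier ln n penalty drives its consistency. A minimal sketch of that comparison, assuming a toy polynomial-regression setting (the data, candidate degrees, and `aic_bic` helper are invented for illustration, not taken from any paper listed here):

```python
# Illustrative sketch: AIC = 2k - 2 ln L, BIC = k ln n - 2 ln L for
# nested polynomial regression models fit by ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = np.linspace(-1.0, 1.0, n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.3, size=n)  # true model: degree 1

def aic_bic(degree):
    """Return (AIC, BIC) for a polynomial fit of the given degree."""
    X = np.vander(x, degree + 1)                   # design matrix
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS = Gaussian ML fit
    resid = y - X @ beta
    sigma2 = resid @ resid / n                     # ML variance estimate
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = degree + 2                                 # coefficients + variance
    return 2 * k - 2 * loglik, k * np.log(n) - 2 * loglik

for d in range(4):
    aic, bic = aic_bic(d)
    print(f"degree {d}: AIC={aic:.1f}  BIC={bic:.1f}")
```

With the true model among the candidates, BIC's k ln n penalty tends to select the true degree as n grows, while AIC's fixed penalty of 2 per parameter leaves some probability of overfitting even asymptotically.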
We investigate the asymptotic and finite-sample properties of the most widely used information criteria for cointegration rank determination in ‘partial’ systems, i.e. in cointegrated Vector Autoregressive (VAR) models in which a subset of variables of interest is modeled conditional on another subset of variables. The asymptotic properties of the Akaike Information Criterion (AIC), the Baye...
The purpose of this article is to look at how information criteria, such as AIC and BIC, interact with the g%SD fit criterion derived in Waddell et al. (2007, 2010a). The g%SD criterion measures the fit of data to model based on a normalized weighted root mean square percentage deviation between the observed data and model estimates of the data, with g%SD = 0 being a perfectly fitting model. Ho...
In this paper, we propose a new Empirical Information Criterion (EIC) for model selection which penalizes the likelihood of the data by a function of the number of parameters in the model. It is designed to be used where there are a large number of time series to be forecast. However, a bootstrap version of the EIC can be used where there is a single time series to be forecast. The EIC provides...
SUMMARY Based on the marginal likelihood approach, we develop a model selection criterion, MIC, for regression models with a general variance structure. These include weighted regression models, regression models with ARMA errors, growth curve models, and spatial correlation models. We show that MIC is a consistent criterion. For regression models with either constant or non-constant variance...
We introduce the Partition Negentropy Criterion (PNC) for cluster validation. It is a cluster validity index that rewards the average normality of the clusters, measured by means of the negentropy, and penalizes the overlap, measured by the partition entropy. The PNC is aimed at finding well separated clusters whose shape is approximately Gaussian. We use the new index to validate fuzzy partiti...
The selection of the truncation lag for covariate unit root tests is analyzed using Monte Carlo simulation. It is shown that standard information criteria such as the BIC or the AIC can result in tests with large size distortions. Modified information criteria can be used to construct tests with good size and power. An empirical illustration is provided.