Search results for: bic criteria
Number of results: 259823
The adaptive lasso is a commonly applied penalty for variable selection in regression modeling. Like all penalties, though, its performance depends critically on the choice of tuning parameter. One method for choosing the tuning parameter is via information criteria, such as those based on AIC and BIC. However, these criteria were developed for use with unpenalized maximum likelih...
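The abstract above describes selecting the lasso tuning parameter via BIC. A minimal sketch of that idea, under simplifying assumptions of my own (an orthonormal design, where the lasso solution is a soft-threshold of the OLS coefficients, and the number of nonzero coefficients used as the degrees of freedom):

```python
import numpy as np

def soft_threshold(z, lam):
    """Lasso solution per coordinate under an orthonormal design."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

rng = np.random.default_rng(1)
n, p = 200, 5
X, _ = np.linalg.qr(rng.normal(size=(n, p)))   # orthonormal columns (assumption)
beta = np.array([4.0, 0.0, 0.0, 2.0, 0.0])     # sparse truth, two active variables
y = X @ beta + rng.normal(scale=0.5, size=n)

ols = X.T @ y                                  # OLS coefficients, since X'X = I
best = {"bic": np.inf, "lam": None, "b": None}
for lam in np.linspace(0.01, 3.0, 60):         # grid over the tuning parameter
    b = soft_threshold(ols, lam)
    rss = np.sum((y - X @ b) ** 2)
    k = np.count_nonzero(b)                    # active-set size as df (a common choice)
    bic = n * np.log(rss / n) + k * np.log(n)  # Gaussian BIC up to a constant
    if bic < best["bic"]:
        best = {"bic": bic, "lam": lam, "b": b}
```

With a strong signal-to-noise ratio, the BIC-selected threshold should retain the large coefficients while zeroing most of the noise coordinates; this is a toy illustration of the criterion, not the adaptive-lasso procedure in the paper.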
Several recent papers have suggested that two-locus tests of association that incorporate gene x gene interaction can be more powerful than marginal, single-locus tests across a broad range of multilocus interaction models, even after conservative correction for multiple testing. However, because these two-locus tests are sensitive to marginal associations with either marker, they can be diffic...
In today’s digital world, the most pressing challenge is the protection of information. Due to weak confidentiality-preserving techniques, the world is facing numerous information breaches. To make data indecipherable to unauthorized persons, a technique for finding a cryptographically strong Substitution box (S-box) is presented. An S-box with sound cryptographic properties such as nonlinearity (NL...
One of the most popular criteria for model selection is the Bayesian Information Criterion (BIC). It is based on an asymptotic approximation using Bayes rule when the sample size tends to infinity and the dimension of the model is fixed. Although it works well in classical applications, it performs less satisfactorily for high dimensional problems, i.e. when the number of regressors is very lar...
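The abstract above describes BIC as an asymptotic approximation that penalizes model dimension by log(n). A minimal sketch of how the criterion compares models, assuming Gaussian errors so that BIC reduces to n·ln(RSS/n) + k·ln(n) up to an additive constant (the polynomial-fitting setup here is my own illustration, not from the paper):

```python
import numpy as np

def bic_gaussian(y, y_hat, k):
    """BIC for a Gaussian linear model, up to an additive constant:
    n * ln(RSS / n) + k * ln(n), where k counts fitted parameters."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = 2.0 * x + rng.normal(scale=0.1, size=x.size)   # truly linear data

scores = {}
for degree in (1, 2, 5):                           # candidate polynomial models
    coef = np.polyfit(x, y, degree)
    y_hat = np.polyval(coef, x)
    scores[degree] = bic_gaussian(y, y_hat, k=degree + 1)

best = min(scores, key=scores.get)                 # lowest BIC wins
```

Because the ln(n) penalty grows with sample size, BIC favors the parsimonious degree-1 model here even though higher-degree fits achieve slightly lower residual error; this fixed-dimension behavior is exactly what the abstract notes breaks down when the number of regressors grows with n.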
For high-dimensional data sets with complicated dependency structures, the full likelihood approach often leads to intractable computational complexity. This imposes difficulty on model selection, given that most traditionally used information criteria require evaluation of the full likelihood. We propose a composite likelihood version of the Bayes information criterion (BIC) and establish its ...
Two methods for clustering data and choosing a mixture model are proposed. First, we derive a new classification algorithm based on the classification likelihood. Then, the likelihood conditional on these clusters is written as the product of the likelihoods of each cluster, and AIC- and BIC-type approximations, respectively, are applied. The resulting criteria turn out to be the sum of the AIC or BIC rela...
In this paper we examine the performance of two newly developed procedures that jointly select the number of states and variables in Markov-switching models by means of Monte Carlo simulations. They are Smith, Naik and Tsai (2006) and Psaradakis and Spagnolo (2006), respectively. The former develops Markov switching criterion (MSC) designed specifically for Markov-switching models, while the la...
We study new logistic model selection criteria based on p-values. The rules are proved to be consistent provided suitable assumptions on the design matrix and scaling constants are satisfied and the search is performed over the family of all submodels. As a byproduct, consistency of the Bayesian Information Criterion (BIC) for logistic regression models, proved by Qian and Field in [11], is obtained under...
For the problem of variable selection in generalized linear models, we develop various adaptive Bayesian criteria. Using a hierarchical mixture setup for model uncertainty, combined with an integrated Laplace approximation, we derive Empirical Bayes and Fully Bayes criteria that can be computed easily and quickly. The performance of these criteria is assessed via simulation and compared to othe...