Power-Conditional-Expected Priors: Using g-Priors With Random Imaginary Data for Variable Selection

Authors
Abstract


Similar Articles

Mixtures of g-priors for Bayesian Variable Selection

Zellner’s g-prior remains a popular conventional prior for use in Bayesian variable selection, despite several undesirable consistency issues. In this paper, we study mixtures of g-priors as an alternative to default g-priors that resolve many of the problems with the original formulation, while maintaining the computational tractability that has made the g-prior so popular. We present theoreti...
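
For orientation, a minimal sketch of the setup this abstract refers to, using standard forms from the g-prior literature rather than text from the paper itself: for the Gaussian linear model, Zellner’s g-prior is

  y \mid \beta, \sigma^2 \sim N_n(X\beta, \sigma^2 I_n), \qquad
  \beta \mid \sigma^2, g \sim N_p\!\big(0,\ g\,\sigma^2 (X^\top X)^{-1}\big), \qquad
  \pi(\sigma^2) \propto 1/\sigma^2 .

A mixture of g-priors replaces a fixed g with a hyperprior \pi(g); for example, the Zellner–Siow choice g \sim \mathrm{Inv\text{-}Gamma}(1/2, n/2) induces a multivariate Cauchy prior on \beta, and the hyper-g family takes \pi(g) \propto (1+g)^{-a/2} for a > 2.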


Power-Expected-Posterior Priors for Variable Selection in Gaussian Linear Models

Imaginary training samples are often used in Bayesian statistics to develop prior distributions, with appealing interpretations, for use in model comparison. Expected-posterior priors are defined via imaginary training samples coming from a common underlying predictive distribution m, using an initial baseline prior distribution. These priors can have subjective and also default Bayesian implem...
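
A schematic of the construction described here, in the usual expected-posterior form with a density-powered likelihood; the exact notation, the power parameter \delta, and the choice of the predictive m^* are assumptions about details the truncated abstract does not show:

  \pi_\ell^{PEP}(\beta_\ell \mid \delta) = \int \pi_\ell^N(\beta_\ell \mid y^*, \delta)\, m^*(y^* \mid \delta)\, dy^*, \qquad
  \pi_\ell^N(\beta_\ell \mid y^*, \delta) \propto f_\ell(y^* \mid \beta_\ell)^{1/\delta}\, \pi_\ell^N(\beta_\ell),

where y^* is an imaginary training sample of size n^*, \pi_\ell^N is a baseline prior for model M_\ell, and m^* is a common predictive distribution for y^*; \delta = 1 recovers the ordinary expected-posterior prior, while \delta = n^* downweights the imaginary sample to roughly one observation's worth of information.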


Variable selection for discriminant analysis with Markov random field priors for the analysis of microarray data

MOTIVATION: Discriminant analysis is an effective tool for the classification of experimental units into groups. Here, we consider the typical problem of classifying subjects according to phenotypes via gene expression data and propose a method that incorporates variable selection into the inferential procedure, for the identification of the important biomarkers. To achieve this goal, we build u...
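
The selection prior alluded to here is typically an Ising-type Markov random field over the inclusion indicators \gamma \in \{0,1\}^p; one common parameterization, given only as an illustrative sketch since the exact form used in the paper is not shown in the truncated abstract, is

  p(\gamma \mid d, e) \propto \exp\!\big(d\, \mathbf{1}^\top \gamma + e\, \gamma^\top R\, \gamma\big),

where R is the adjacency matrix of a known gene network or pathway graph, d controls overall sparsity, and e > 0 encourages neighbouring genes to be selected together.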


Random Balance: Ensembles of variable priors classifiers for imbalanced data

In Machine Learning, a data set is imbalanced when the class proportions are highly skewed. Imbalanced data sets arise routinely in many application domains and pose a challenge to traditional classifiers. We propose a new approach to building ensembles of classifiers for two-class imbalanced data sets, called Random Balance. Each member of the Random Balance ensemble is trained with data sampl...
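
A minimal sketch of the resampling step this describes, assuming plain random over- and undersampling in place of the SMOTE-based oversampling used by the published method; the function name and interface below are hypothetical:

import numpy as np

def random_balance_sample(X, y, rng):
    # Resample a two-class data set so that the class proportions are drawn
    # at random while the total number of rows stays fixed.
    classes = np.unique(y)
    assert len(classes) == 2, "Random Balance is defined for two-class problems"
    n = len(y)
    n0 = int(rng.integers(2, n - 1))  # new size of classes[0]; both classes keep >= 2 rows
    targets = {classes[0]: n0, classes[1]: n - n0}
    parts = []
    for c, target in targets.items():
        pool = np.where(y == c)[0]
        # Undersample without replacement; oversample with replacement.
        parts.append(rng.choice(pool, size=target, replace=target > len(pool)))
    idx = np.concatenate(parts)
    return X[idx], y[idx]

Each ensemble member is then trained on one such independently resampled set, e.g. X_b, y_b = random_balance_sample(X_train, y_train, np.random.default_rng(0)).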



Journal

Journal Title: Journal of Computational and Graphical Statistics

Year: 2016

ISSN: 1061-8600, 1537-2715

DOI: 10.1080/10618600.2015.1036996