Default Bayes Factors for Model Selection in Regression.

Authors

  • Jeffrey N Rouder
  • Richard D Morey
Abstract

In this article, we present a Bayes factor solution for inference in multiple regression. Bayes factors are principled measures of the relative evidence from data for various models or positions, including models that embed null hypotheses. In this regard, they may be used to state positive evidence for a lack of an effect, which is not possible in conventional significance testing. One obstacle to the adoption of Bayes factors in psychological science is a lack of guidance and software. Recently, Liang, Paulo, Molina, Clyde, and Berger (2008) developed computationally attractive default Bayes factors for multiple regression designs. We provide a web applet for convenient computation, as well as guidance and context for the use of these priors. We discuss the interpretation and advantages of the advocated Bayes factor evidence measures.
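To make the computation concrete, the sketch below evaluates a default Bayes factor for a full regression model against the intercept-only null by numerically integrating the Liang et al. (2008) expression, with a Zellner-Siow (inverse-gamma(1/2, n/2)) prior on the g parameter, one standard default in this literature. This is a minimal Python illustration rather than the authors' applet: the function name jzs_regression_bf and the simulated data are purely illustrative, the prior scale is fixed at its plain Zellner-Siow value, and a careful implementation would integrate on the log scale for numerical stability.

import numpy as np
from scipy import integrate, special

def jzs_regression_bf(y, X):
    # Bayes factor of the full regression model against the intercept-only
    # null, via the Liang et al. (2008) integral with a Zellner-Siow
    # (inverse-gamma(1/2, n/2)) prior on g.
    n, p = X.shape
    yc = y - y.mean()            # the intercept is common to both models,
    Xc = X - X.mean(axis=0)      # so work with centered variables
    beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
    resid = yc - Xc @ beta
    r2 = 1.0 - (resid @ resid) / (yc @ yc)   # R^2 of the full model

    def integrand(g):
        log_prior = (0.5 * np.log(n / 2.0) - special.gammaln(0.5)
                     - 1.5 * np.log(g) - n / (2.0 * g))
        log_ratio = (0.5 * (n - p - 1) * np.log1p(g)
                     - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2)))
        return np.exp(log_ratio + log_prior)

    bf10, _ = integrate.quad(integrand, 0.0, np.inf)
    return bf10

# Toy data: 50 observations, two predictors, a modest true effect on the first.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
y = 0.4 * X[:, 0] + rng.normal(size=50)
print(jzs_regression_bf(y, X))   # BF > 1 favors the full model over the null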

Similar articles

Default Bayes Factors for Model Selection in Regression

In this paper, we present a Bayes factor solution for inference in multiple regression. Bayes factors are principled measures of the relative evidence from data for various models or positions, including models that embed null hypotheses. In this regard, they may be used to state positive evidence for a lack of an effect, which is not possible in conventional significance testing. One obstacle ...

Comparison of different statistical methods in genomic selection of Holstein cattle

Genomic selection combines statistical methods with genomic data to predict genetic values for complex traits. The accuracy of prediction of genetic values in the selected population has a great effect on the success of this selection method. The accuracy of genomic prediction is highly dependent on the statistical model used to estimate marker effects in the reference population. Various factors such a...

Model weights and the foundations of multimodel inference.

Statistical thinking in wildlife biology and ecology has been profoundly influenced by the introduction of AIC (Akaike's information criterion) as a tool for model selection and as a basis for model averaging. In this paper, we advocate the Bayesian paradigm as a broader framework for multimodel inference, one in which model averaging and model selection are naturally linked, and in which the p...
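As a concrete instance of that linkage, posterior model probabilities obtained from Bayes factors against a common reference model serve both purposes: selection picks the most probable model, and averaging weights each model's contribution by its probability. The sketch below assumes equal prior model probabilities; the Bayes factors and predictions are invented numbers for illustration only.

import numpy as np

# Bayes factors of each candidate model against a common null model
# (illustrative values only).
bf_vs_null = np.array([1.0, 6.3, 0.4, 15.2])
prior_probs = np.full(4, 0.25)            # equal prior model probabilities

# Posterior model probabilities: renormalized prior-weighted Bayes factors.
post_probs = bf_vs_null * prior_probs
post_probs /= post_probs.sum()

# Model selection: pick the model with the highest posterior probability.
best = int(np.argmax(post_probs))

# Model averaging: weight each model's prediction by its posterior probability.
predictions = np.array([0.10, 0.32, 0.05, 0.41])   # illustrative per-model predictions
averaged_prediction = float(post_probs @ predictions)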

Approximate Bayesian Model Selection with the Deviance Statistic

Bayesian model selection poses two main challenges: the specification of parameter priors for all models, and the computation of the resulting Bayes factors between models. There is now a large literature on automatic and objective parameter priors in the linear model. One important class are g-priors, which were recently extended from linear to generalized linear models (GLMs). We show that th...
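As a rough sketch of the idea: under a g-prior on the d extra coefficients, the deviance statistic z is approximately chi-square with d degrees of freedom under the null and (1 + g) times such a chi-square under the alternative, which yields a closed-form test-based Bayes factor. The snippet below assumes that form with a user-chosen g (here the unit-information choice g = n); the paper's own development, including priors on g, goes beyond this minimal version.

import numpy as np

def test_based_bf(z, d, g):
    # Test-based Bayes factor (model vs. null) from the deviance statistic z,
    # where d is the number of extra parameters and g is the g-prior factor.
    # Follows from z ~ chi2_d under the null and z ~ (1 + g) * chi2_d under
    # the alternative with a g-prior on the extra coefficients.
    return (1.0 + g) ** (-d / 2.0) * np.exp(0.5 * z * g / (1.0 + g))

# Example: deviance of 11.2 on 3 extra parameters, unit-information g = n = 100.
print(test_based_bf(z=11.2, d=3, g=100))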

Objective Bayesian Model Selection in Gaussian Graphical Models

This paper presents a default model-selection procedure for Gaussian graphical models that involves two new developments. First, we develop an objective version of the hyper-inverse Wishart prior for restricted covariance matrices, called the HIW g-prior, and show how it corresponds to the implied fractional prior for covariance selection using fractional Bayes factors. Second, we apply a class...
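For readers unfamiliar with the term, O'Hagan's fractional Bayes factor trains a default or improper prior on a fraction b of the likelihood. A minimal sketch in the usual notation, with likelihood $f_i(y \mid \theta_i)$ and prior $\pi_i(\theta_i)$ for model $M_i$:

$$ q_i(b, y) = \int f_i(y \mid \theta_i)^{\,b}\, \pi_i(\theta_i)\, d\theta_i, \qquad B^{F}_{ij}(b) = \frac{q_i(1, y) / q_i(b, y)}{q_j(1, y) / q_j(b, y)}, $$

with the training fraction b usually taken as small as possible, for example a minimal training sample size divided by n. How this construction connects to the HIW g-prior is developed in the paper itself.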

Journal:
  • Multivariate Behavioral Research

Volume 47, Issue 6

Pages: -

Publication date: 2012