The evidence framework applied to sparse kernel logistic regression

Authors

  • Gavin C. Cawley
  • Nicola L. C. Talbot
Abstract

In this paper we present a simple hierarchical Bayesian treatment of the sparse kernel logistic regression (KLR) model based on the evidence framework introduced by MacKay. The principal innovation lies in the re-parameterisation of the model such that the usual spherical Gaussian prior over the parameters in the kernel-induced feature space also corresponds to a spherical Gaussian prior over the transformed parameters, permitting the straightforward derivation of an efficient update formula for the regularisation parameter. The Bayesian framework also allows good values of the kernel parameters to be selected through maximisation of the marginal likelihood, or evidence, for the model. Results obtained on a variety of benchmark datasets indicate that the Bayesian kernel logistic regression model is competitive with kernel logistic regression models in which the hyper-parameters are selected via cross-validation, and with the support vector machine and relevance vector machine.
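
The efficient update formula for the regularisation parameter mentioned above is of the MacKay type, in which the effective number of well-determined parameters is re-estimated after each fit. The sketch below illustrates that style of update for a regularised KLR model; the RBF kernel, the IRLS inner loop, and the names rbf_kernel, fit_klr_evidence, X and y are illustrative assumptions rather than the authors' implementation.

    # A minimal sketch (assumptions noted above) of MacKay-style evidence
    # updates for the regularisation parameter of kernel logistic regression.
    import numpy as np

    def rbf_kernel(X, Z, width=1.0):
        """Gaussian RBF kernel matrix between the rows of X and Z."""
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * width ** 2))

    def fit_klr_evidence(X, y, width=1.0, alpha=1.0, outer=10, inner=25):
        """Penalised KLR trained by Newton/IRLS, with the regularisation
        parameter alpha re-estimated from the evidence after each fit.
        y holds {0, 1} labels; K is assumed strictly positive definite."""
        K = rbf_kernel(X, X, width)
        n = len(y)
        a = np.zeros(n)                                 # dual coefficients: w = Phi' a
        for _ in range(outer):
            # ---- IRLS / Newton for fixed alpha ----
            for _ in range(inner):
                p = 1.0 / (1.0 + np.exp(-(K @ a)))      # predicted probabilities
                W = p * (1.0 - p) + 1e-10               # IRLS weights
                # Newton system (a common factor of K cancelled, K assumed PD):
                # (diag(W) K + alpha I) delta = (y - p) - alpha a
                H = W[:, None] * K + alpha * np.eye(n)
                a = a + np.linalg.solve(H, (y - p) - alpha * a)
            # ---- MacKay evidence update for alpha ----
            p = 1.0 / (1.0 + np.exp(-(K @ a)))
            W = p * (1.0 - p) + 1e-10
            sW = np.sqrt(W)
            # eigenvalues of the data-term Hessian in the feature space
            lam = np.clip(np.linalg.eigvalsh((sW[:, None] * K) * sW[None, :]), 0.0, None)
            gamma = np.sum(lam / (lam + alpha))         # well-determined parameters
            alpha = gamma / float(a @ K @ a + 1e-10)    # ||w||^2 = a' K a
        return a, alpha

With X an (n, d) array of inputs and y a vector of binary labels, a call such as fit_klr_evidence(X, y, width=1.0) would return the dual coefficients together with the final regularisation parameter.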

Related articles

Sparse Bayesian kernel logistic regression

In this paper we present a simple hierarchical Bayesian treatment of the sparse kernel logistic regression (KLR) model based on MacKay’s evidence approximation. The model is re-parameterised such that an isotropic Gaussian prior over parameters in the kernel-induced feature space is replaced by an isotropic Gaussian prior over the transformed parameters, facilitating a Bayesian analysis using stan...

A Gradient-based Forward Greedy Algorithm for Sparse Gaussian Process Regression

In this chapter, we present a gradient-based forward greedy method for sparse approximation of the Bayesian Gaussian Process Regression (GPR) model. Unlike previous work, which is mostly based on various basis-vector selection strategies, we propose to construct, rather than select, a new basis vector at each iterative step. This idea was motivated by the well-known gradient boosting approach...
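
The idea of constructing, rather than selecting, a basis vector can be illustrated with a small sketch in which each new RBF centre is refined by gradient steps to fit the current residual, in the spirit of gradient boosting. The squared-error criterion, the fixed kernel width, and the names rbf, construct_basis and greedy_sparse_fit are assumptions for illustration, not the chapter's GPR algorithm.

    # A minimal sketch (assumptions noted above): construct each new basis
    # centre by gradient refinement against the current residual.
    import numpy as np

    def rbf(X, z, width=1.0):
        """Gaussian basis function centred at z, evaluated at the rows of X."""
        return np.exp(-((X - z) ** 2).sum(-1) / (2.0 * width ** 2))

    def construct_basis(X, residual, width=1.0, steps=100, lr=0.1):
        """Gradient-refine one basis centre z so that its response fits the residual."""
        z = X[np.argmax(np.abs(residual))].copy()   # warm start at the best training input
        for _ in range(steps):
            phi = rbf(X, z, width)
            beta = (phi @ residual) / (phi @ phi + 1e-12)   # optimal weight for this basis
            err = beta * phi - residual
            # d phi_i / d z = phi_i * (x_i - z) / width^2; beta is treated as fixed
            # within each gradient step (a simplification for this sketch)
            grad_z = (2.0 * beta * err * phi) @ (X - z) / width ** 2
            z -= lr * grad_z
        return z

    def greedy_sparse_fit(X, y, n_basis=10, width=1.0):
        """Forward greedy additive model: add one constructed basis at a time."""
        centres, weights = [], []
        residual = y.astype(float).copy()
        for _ in range(n_basis):
            z = construct_basis(X, residual, width)
            phi = rbf(X, z, width)
            beta = (phi @ residual) / (phi @ phi + 1e-12)
            centres.append(z)
            weights.append(beta)
            residual -= beta * phi
        return np.array(centres), np.array(weights)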

Kernel Logistic Regression Algorithm for Large-Scale Data Classification

Kernel Logistic Regression (KLR) is a powerful classification technique that has been applied successfully in many classification problems. However, it is seldom used in large-scale data classification problems, mainly because it is computationally expensive. In this paper, we present a new KLR algorithm based on Truncated Regularized Iteratively Reweighted Least Squares (TR-IRLS)...
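
The TR-IRLS idea of keeping each IRLS step cheap by solving the Newton system only approximately can be sketched as follows, assuming a kernelised logistic objective with penalty (alpha/2) a' K a and labels in {0, 1}; the names cg and tr_irls_klr and the parameter choices are illustrative, not the algorithm from the paper.

    # A minimal, simplified sketch: each Newton/IRLS system is solved only
    # approximately, by a fixed small number of conjugate-gradient iterations.
    import numpy as np

    def cg(A, b, iters=30, tol=1e-6):
        """Conjugate gradient for the symmetric PSD system A x = b, truncated after `iters` steps."""
        x = np.zeros_like(b)
        r = b - A @ x
        p = r.copy()
        rs = r @ r
        for _ in range(iters):
            Ap = A @ p
            step = rs / (p @ Ap + 1e-30)
            x += step * p
            r -= step * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x

    def tr_irls_klr(K, y, alpha=1.0, outer=10, cg_iters=30):
        """Regularised KLR (penalty (alpha/2) a' K a) fitted by IRLS with truncated CG inner solves."""
        n = len(y)
        a = np.zeros(n)
        for _ in range(outer):
            p = 1.0 / (1.0 + np.exp(-(K @ a)))
            W = p * (1.0 - p) + 1e-10
            # symmetric PSD Newton system: (K diag(W) K + alpha K) delta = K (y - p) - alpha K a
            A = K @ (W[:, None] * K) + alpha * K
            b = K @ (y - p) - alpha * (K @ a)
            a += cg(A, b, iters=cg_iters)
        return a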

Adaptive spherical Gaussian kernel in sparse Bayesian learning framework for nonlinear regression

Kernel-based machine learning techniques have been widely used to tackle problems of function approximation and regression estimation. The relevance vector machine (RVM) offers state-of-the-art performance in sparse regression. As a popular and competent kernel function in machine learning, the conventional Gaussian kernel uses a single, shared kernel width for all basis functions, which implicitly makes a basic...
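
The contrast drawn above, a single shared kernel width versus a width adapted to each basis function, can be made concrete with a small sketch; the design_matrix name and array shapes are illustrative assumptions, not the paper's formulation.

    # A minimal illustrative sketch: Gaussian basis functions where each centre
    # carries its own width, in contrast to a single shared width.
    import numpy as np

    def design_matrix(X, centres, widths):
        """Phi[i, j] = exp(-||x_i - c_j||^2 / (2 * widths[j]^2)), one width per basis."""
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)   # (n, m) squared distances
        return np.exp(-d2 / (2.0 * widths[None, :] ** 2))

    def shared_width_design_matrix(X, centres, width):
        """Conventional case: the same width for every basis function."""
        return design_matrix(X, centres, np.full(len(centres), width))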


Journal title:
  • Neurocomputing

Volume 64, Issue -

Pages -

Publication date: 2005