The evidence framework applied to sparse kernel logistic regression
Authors
Abstract
In this paper we present a simple hierarchical Bayesian treatment of the sparse kernel logistic regression (KLR) model based on the evidence framework introduced by MacKay. The principal innovation lies in the re-parameterisation of the model such that the usual spherical Gaussian prior over the parameters in the kernel-induced feature space also corresponds to a spherical Gaussian prior over the transformed parameters, permitting the straightforward derivation of an efficient update formula for the regularisation parameter. The Bayesian framework also allows the selection of good values for kernel parameters through maximisation of the marginal likelihood, or evidence, for the model. Results obtained on a variety of benchmark datasets indicate that the Bayesian kernel logistic regression model is competitive with kernel logistic regression models whose hyper-parameters are selected via cross-validation, and with the support vector machine and relevance vector machine.
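As a rough illustration of the evidence-framework idea described in the abstract, the Python sketch below fits a kernel logistic regression model by IRLS and then re-estimates the regularisation parameter with MacKay's update alpha <- gamma / w'w, where gamma is the effective number of parameters. The RBF kernel, the simple ridge-style penalty on the dual coefficients, and all function names are assumptions made for illustration only; they stand in for, and simplify, the paper's re-parameterised spherical Gaussian prior, and the sketch omits the marginal-likelihood selection of kernel parameters.

import numpy as np

def rbf_kernel(X, Z, width=1.0):
    # Gaussian RBF kernel matrix (an assumed choice; the paper may use other kernels).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def evidence_klr(K, y, alpha=1.0, n_outer=10, n_irls=20):
    """Sketch: kernel logistic regression with a MacKay-style evidence update.

    K is the (n, n) kernel matrix and y holds {0, 1} labels.  A plain ridge
    penalty (alpha / 2) * w'w on the dual coefficients is used here as a
    simplification of the paper's re-parameterised prior.
    """
    n = K.shape[0]
    w = np.zeros(n)
    for _ in range(n_outer):
        # Inner loop: IRLS / Newton steps for the MAP weights at the current alpha.
        for _ in range(n_irls):
            p = 1.0 / (1.0 + np.exp(-K @ w))                 # predicted probabilities
            r = p * (1.0 - p)                                # IRLS weights
            H = K.T @ (r[:, None] * K) + alpha * np.eye(n)   # Hessian of penalised loss
            g = K.T @ (p - y) + alpha * w                    # gradient
            w -= np.linalg.solve(H, g)                       # Newton update
        # Outer loop: MacKay evidence update for the regularisation parameter,
        # gamma = (number of parameters) - alpha * trace(inverse Hessian).
        gamma = n - alpha * np.trace(np.linalg.inv(H))
        alpha = gamma / (w @ w)
    return w, alpha

# Toy usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
w, alpha = evidence_klr(rbf_kernel(X, X), y)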
Related articles
Sparse Bayesian kernel logistic regression
In this paper we present a simple hierarchical Bayesian treatment of the sparse kernel logistic regression (KLR) model based on MacKay's evidence approximation. The model is re-parameterised such that an isotropic Gaussian prior over parameters in the kernel-induced feature space is replaced by an isotropic Gaussian prior over the transformed parameters, facilitating a Bayesian analysis using stan...
A Gradient-based Forward Greedy Algorithm for Sparse Gaussian Process Regression
In this chapter, we present a gradient-based forward greedy method for sparse approximation of the Bayesian Gaussian Process Regression (GPR) model. Different from previous work, which is mostly based on various basis vector selection strategies, we propose to construct, rather than select, a new basis vector at each iterative step. This idea was motivated by the well-known gradient boosting approach...
Kernel Logistic Regression Algorithm for Large-Scale Data Classification
Kernel Logistic Regression (KLR) is a powerful classification technique that has been applied successfully in many classification problems. However, it is rarely used in large-scale data classification, mainly because it is computationally expensive. In this paper, we present a new KLR algorithm based on Truncated Regularized Iteratively Reweighted Least Squares (TR-IRLS)...
Adaptive spherical Gaussian kernel in sparse Bayesian learning framework for nonlinear regression
Kernel-based machine learning techniques have been widely used to tackle problems of function approximation and regression estimation. The relevance vector machine (RVM) offers state-of-the-art performance in sparse regression. Although it is a popular and competent kernel function in machine learning, the conventional Gaussian kernel uses a single kernel width shared by all basis functions, which implicitly makes a basic...
Journal: Neurocomputing
Volume: 64, Issue: -
Pages: -
Publication date: 2005