Kernel Partial Least Squares Regression in Reproducing Kernel Hilbert Space
Authors
Abstract
A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is extended by the kernel partial least squares (PLS) regression model. Similar to principal components regression (PCR), PLS is a method based on the projection of input (explanatory) variables to latent variables (components). However, in contrast to PCR, PLS creates the components by modeling the relationship between input and output variables while maintaining most of the information in the input variables. PLS is useful in situations where the number of explanatory variables exceeds the number of observations and/or a high level of multicollinearity among those variables is assumed. Motivated by this fact, we provide a kernel PLS algorithm for the construction of nonlinear regression models in possibly high-dimensional feature spaces. We give a theoretical description of the kernel PLS algorithm and experimentally compare it with the existing kernel PCR and kernel ridge regression techniques. We demonstrate that, on the data sets employed, kernel PLS achieves the same results as kernel PCR but uses considerably fewer, qualitatively different components.
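To make the described procedure concrete, below is a minimal NumPy sketch of a NIPALS-style kernel PLS fit together with the standard dual-coefficient prediction rule Yhat = K_test U (T'KU)^{-1} T'Y. The function names, the choice of a Gaussian kernel, and all hyperparameters are our own illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def center_train_kernel(K):
    """Double-center a training Gram matrix (centering in feature space)."""
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    return J @ K @ J

def center_test_kernel(K_test, K_train):
    """Center a test-vs-train kernel consistently with the training centering."""
    n, nt = K_train.shape[0], K_test.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    return (K_test - np.ones((nt, n)) @ K_train / n) @ J

def kernel_pls_fit(K, Y, n_components, max_iter=100, tol=1e-10):
    """NIPALS-style kernel PLS on a centered Gram matrix K (n x n).

    Returns the score matrices T (feature-space scores) and U (output
    scores) needed to form the dual regression coefficients.
    """
    n = K.shape[0]
    Kd, Yd = K.copy(), np.asarray(Y, dtype=float).copy()
    T = np.zeros((n, n_components))
    U = np.zeros((n, n_components))
    for i in range(n_components):
        u = Yd[:, 0].copy()              # start from a response column
        for _ in range(max_iter):
            t = Kd @ u                   # feature-space score
            t /= np.linalg.norm(t)
            c = Yd.T @ t                 # output weights
            u_new = Yd @ c               # output score
            u_new /= np.linalg.norm(u_new)
            if np.linalg.norm(u_new - u) < tol:
                u = u_new
                break
            u = u_new
        T[:, i], U[:, i] = t, u
        # Deflate K and Y by the extracted score direction t.
        P = np.eye(n) - np.outer(t, t)
        Kd = P @ Kd @ P
        Yd = Yd - np.outer(t, t @ Yd)
    return T, U

def kernel_pls_predict(K_train, K_test_c, Y, T, U):
    """Dual-coefficient prediction: Yhat = K_test_c U (T'KU)^{-1} T'Y."""
    B = U @ np.linalg.solve(T.T @ K_train @ U, T.T @ np.asarray(Y, dtype=float))
    return K_test_c @ B

# Toy usage: learn a nonlinear function from few observations.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sinc(X) + 0.05 * rng.standard_normal((60, 1))
Xt = np.linspace(-3, 3, 200)[:, None]

K = center_train_kernel(rbf_kernel(X, X, gamma=0.5))
Kt = center_test_kernel(rbf_kernel(Xt, X, gamma=0.5), rbf_kernel(X, X, gamma=0.5))
y_c = y - y.mean(axis=0)                 # center the response as well
T, U = kernel_pls_fit(K, y_c, n_components=5)
y_pred = kernel_pls_predict(K, Kt, y_c, T, U) + y.mean(axis=0)
```

Note that the number of components plays the role of the regularization parameter here; in kernel ridge regression, by contrast, a penalty weight is tuned instead.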
Similar resources
Subspace Regression in Reproducing Kernel Hilbert Space
We focus on three methods for finding a suitable subspace for regression in a reproducing kernel Hilbert space: kernel principal component analysis, kernel partial least squares, and kernel canonical correlation analysis, and we demonstrate how this fits within a more general context of subspace regression. For the kernel partial least squares case, a least squares support vector machine style der...
Kernel PLS variants for regression
We focus on covariance criteria for finding a suitable subspace for regression in a reproducing kernel Hilbert space: kernel principal component analysis, kernel partial least squares and kernel canonical correlation analysis, and we demonstrate how this fits within a more general context of subspace regression. For the kernel partial least squares case, some variants are considered and the meth...
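For context, the criteria that distinguish these three projections can be stated compactly. The display below gives the standard formulations; the notation (centered data matrices X and Y, weight vectors w and c) is our own and is not taken from the truncated abstract:

```latex
% PCA direction: maximize the variance of the input projection
\mathbf{w}_{\mathrm{PCA}} = \arg\max_{\lVert\mathbf{w}\rVert = 1} \operatorname{Var}(X\mathbf{w})

% PLS directions: maximize the squared covariance between projections
(\mathbf{w},\mathbf{c})_{\mathrm{PLS}} = \arg\max_{\lVert\mathbf{w}\rVert = \lVert\mathbf{c}\rVert = 1} \operatorname{Cov}(X\mathbf{w}, Y\mathbf{c})^{2}

% CCA directions: maximize the correlation between projections
(\mathbf{w},\mathbf{c})_{\mathrm{CCA}} = \arg\max_{\mathbf{w},\mathbf{c}} \operatorname{Corr}(X\mathbf{w}, Y\mathbf{c})
```

In the kernel variants, the same criteria are applied to feature-space images of the inputs, which is what ties all three methods to regression in a reproducing kernel Hilbert space.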
Reproducing Kernel Hilbert Space Method for Solving Generalized Burgers Equation
In this paper, we present a new method based on Reproducing Kernel Space (RKS) theory, and an iterative algorithm for solving the Generalized Burgers Equation (GBE) is given. The analytical solution is represented as a series in the RKS, and the approximate solution u(x,t) is constructed by truncating the series. The convergence of u(x,t) to the analytical solution is also proved.
Kernel Partial Least Squares is Universally Consistent
We prove the statistical consistency of kernel Partial Least Squares Regression applied to a bounded regression learning problem on a reproducing kernel Hilbert space. Partial Least Squares stands apart from well-known classical approaches such as Ridge Regression or Principal Components Regression, as it is not defined as the solution of a global cost minimization procedure over a fixed model nor ...
Kernel PLS Smoothing for Nonparametric Regression Curve Fitting: an Application to Event Related Potentials
We present a novel smoothing approach to nonparametric regression curve fitting. This is based on kernel partial least squares (PLS) regression in reproducing kernel Hilbert space. We are interested in applying the methodology to smoothing experimental data, such as brain event-related potentials, where some level of knowledge about areas of different degrees of smoothness, local inhomogeneities...
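As a toy illustration of this smoothing use, the snippet below denoises a synthetic evoked-potential-like curve. It assumes the rbf_kernel, center_train_kernel, kernel_pls_fit, and kernel_pls_predict sketches given after the main abstract above; the signal and all hyperparameters are invented for demonstration:

```python
import numpy as np

# Noisy "evoked-potential-like" curve: two smooth bumps plus noise.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 200)[:, None]
clean = np.exp(-(x - 0.3) ** 2 / 0.005) - 0.5 * np.exp(-(x - 0.6) ** 2 / 0.02)
y = clean + 0.15 * rng.standard_normal(clean.shape)

# Regress the noisy samples on x with a small number of kernel PLS
# components: fewer components give heavier smoothing.
K = center_train_kernel(rbf_kernel(x, x, gamma=200.0))
y_c = y - y.mean(axis=0)
T, U = kernel_pls_fit(K, y_c, n_components=4)
y_smooth = kernel_pls_predict(K, K, y_c, T, U) + y.mean(axis=0)
```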
Journal: Journal of Machine Learning Research
Volume: 2, Issue: -
Pages: -
Published: 2001