L2 Boosting in Kernel Regression

Authors

  • B. U. Park
  • Y. K. Lee
Abstract

In this paper, we investigate the theoretical and empirical properties of L2 boosting with kernel regression estimates as weak learners. We show that each step of L2 boosting reduces the bias of the estimate by two orders of magnitude, while it does not deteriorate the order of the variance. We illustrate the theoretical findings with some simulated examples. We also demonstrate that L2 boosting is superior to the use of higher-order kernels, a well-known method of reducing the bias of the kernel estimate.

Key words: Kernel regression, Nadaraya-Watson smoother, bias reduction, boosting, twicing
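The abstract describes L2 boosting with a kernel smoother as the weak learner: each step fits the current residuals with the smoother and adds the fit back, so one step corresponds to Tukey's classical "twicing". A minimal sketch of this scheme (not the authors' code; the Gaussian kernel and the bandwidth choice are illustrative assumptions):

```python
import numpy as np

def nw_smoother(x_train, y, x_eval, h):
    # Nadaraya-Watson estimate with a Gaussian kernel of bandwidth h.
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return w @ y / w.sum(axis=1)

def l2_boost(x, y, h, steps):
    # L2 boosting: repeatedly smooth the current residuals
    # and add the residual fit back to the running estimate.
    fit = np.zeros_like(y, dtype=float)
    for _ in range(steps):
        fit = fit + nw_smoother(x, y - fit, x, h)
    return fit
```

With `steps=1` this reduces to the plain Nadaraya-Watson fit plus one residual correction (twicing); the paper's point is that each such step lowers the order of the bias without worsening the order of the variance.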


Related resources


Support Vector Machines, Kernel Logistic Regression and Boosting

The support vector machine is known for its excellent performance in binary classification, i.e., the response y ∈ {−1, 1}, but its appropriate extension to the multi-class case is still an ongoing research issue. Another weakness of the SVM is that it only estimates sign[p(x) − 1/2], while the probability p(x) is often of interest itself, where p(x) = P(Y = 1|X = x) is the conditional probab...


Sparse Regression Modelling Using an Incremental Weighted Optimization Method Based on Boosting with Correlation Criterion

A novel technique is presented to construct sparse Gaussian regression models. Unlike most kernel regression modelling methods, which restrict kernel means to the training input data and use a fixed common variance for all the regressors, the proposed technique can tune the mean vector and diagonal covariance matrix of each individual Gaussian regressor to best fit the training data based o...


Boosting Based Multiple Kernel Learning and Transfer Regression for Electricity Load Forecasting

Accurate electricity load forecasting is of crucial importance for power system operation and smart grid energy management. Different factors, such as weather conditions, lagged values, and day types, may affect electricity load consumption. We propose to use multiple kernel learning (MKL) for electricity load forecasting, as it provides more flexibility than traditional kernel methods. Comput...


Kernel Feature Selection to Improve Generalization Performance of Boosting Classifiers

In this paper, kernel feature selection is proposed to improve the generalization performance of boosting classifiers. Kernel feature selection attains feature selection and model selection at the same time using a simple selection algorithm. The algorithm automatically selects a subset of kernel features for each classifier and combines them according to the LogitBoost algorithm. The system em...



Journal title:

Volume   Issue 

Pages  -

Publication date: 2008