Search results for: bagging

Number of results: 2077

2011
Guohua Liang, Xingquan Zhu, Chengqi Zhang

Bagging is a simple yet effective design that combines multiple base learners to form an ensemble for prediction. Despite its popular usage in many real-world applications, existing research is mainly concerned with studying unstable learners as the key to ensuring the performance gain of a bagging predictor, while many key factors remain unclear. For example, it is not clear when a bagging p...
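
As a concrete illustration of the design this abstract describes, here is a minimal bagging sketch in Python: bootstrap-resample the training set, fit one base learner per sample, and aggregate by majority vote. The dataset, base learner, and ensemble size are assumptions for illustration, not taken from the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
rng = np.random.default_rng(0)

learners = []
for _ in range(25):                        # ensemble size: assumed
    idx = rng.integers(0, len(X), len(X))  # bootstrap: draw with replacement
    learners.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

# Aggregate the base learners by majority vote (labels are 0/1 here).
votes = np.array([m.predict(X) for m in learners])
y_pred = (votes.mean(axis=0) >= 0.5).astype(int)
```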

Journal: Neural Computation 1997
Michiaki Taniguchi, Volker Tresp

We compare the performance of averaged regularized estimators. We show that the improvement achievable by averaging depends critically on the degree of regularization used in training the individual estimators. We compare four different averaging approaches: simple averaging, bagging, variance-based weighting, and variance-based bagging. In any of the averaging...
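
A rough sketch of two of the four schemes named above, simple averaging and variance-based weighting, assuming ridge regression as the regularized estimator and a held-out split for estimating each model's error variance; all data and parameters are illustrative.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=400, noise=10.0, random_state=1)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=1)
rng = np.random.default_rng(1)

models, variances = [], []
for _ in range(10):
    idx = rng.integers(0, len(X_tr), len(X_tr))      # bootstrap (bagging-style)
    m = Ridge(alpha=1.0).fit(X_tr[idx], y_tr[idx])   # regularized estimator
    models.append(m)
    variances.append(np.var(y_val - m.predict(X_val)))  # estimated error variance

preds = np.array([m.predict(X_val) for m in models])
simple_avg = preds.mean(axis=0)                      # simple averaging
w = 1.0 / np.array(variances)                        # inverse-variance weights
var_weighted = (w[:, None] * preds).sum(axis=0) / w.sum()  # variance-based weighting
```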

2012
Sotiris B. Kotsiantis

Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of regression models using the same learning algorithm as the base learner. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in t...
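
The noise-robustness claim can be probed with a toy experiment; the sketch below contrasts bagged and boosted regression trees after injecting outlier noise into a fraction of the targets. The dataset, noise model, and hyperparameters are assumptions, not from the paper.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor, BaggingRegressor
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=1000, random_state=2)
rng = np.random.default_rng(2)
# Corrupt roughly 20% of the targets with large additive noise.
noisy = rng.random(len(y)) < 0.2
y_noisy = y + noisy * rng.normal(0.0, y.std(), len(y))

base = DecisionTreeRegressor(max_depth=4)
for name, reg in [("bagging", BaggingRegressor(base, n_estimators=50)),
                  ("boosting", AdaBoostRegressor(base, n_estimators=50))]:
    print(name, cross_val_score(reg, X, y_noisy, cv=5, scoring="r2").mean())
```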

Journal: Computational Statistics & Data Analysis 2007

Journal: The Annals of Statistics 2002

Journal: Pattern Recognition Letters 2007
Gonzalo Martínez-Muñoz, Alberto Suárez

Boosting is used to determine the order in which classifiers are aggregated in a bagging ensemble. Early stopping in the aggregation of the classifiers in the ordered bagging ensemble allows the identification of subensembles that require less memory for storage, have a faster classification speed and can perform better than the original bagging ensemble. Furthermore, ensemble pruning does not ...
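
A simplified sketch of ordered aggregation with early stopping: the paper orders the bagged classifiers with a boosting-based criterion, while the stand-in below orders them greedily by validation accuracy of the growing subensemble; all data, sizes, and names are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=800, random_state=3)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=3)
rng = np.random.default_rng(3)

pool = []                                        # plain bagging pool
for _ in range(50):
    idx = rng.integers(0, len(X_tr), len(X_tr))
    pool.append(DecisionTreeClassifier().fit(X_tr[idx], y_tr[idx]))

preds = np.array([m.predict(X_val) for m in pool])  # cached 0/1 predictions
votes = np.zeros(len(X_val))
remaining, ordered, scores = list(range(len(pool))), [], []
while remaining:                                 # greedy ordering step
    best = max(remaining,
               key=lambda i: ((((votes + preds[i]) / (len(ordered) + 1))
                               >= 0.5) == y_val).mean())
    votes += preds[best]
    remaining.remove(best)
    ordered.append(best)
    scores.append((((votes / len(ordered)) >= 0.5) == y_val).mean())

k = int(np.argmax(scores)) + 1                   # early-stopping point
subensemble = [pool[i] for i in ordered[:k]]     # smaller, often no less accurate
```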

2004
Tae-Hwy Lee, Yang Yang

Bootstrap aggregating, or bagging, introduced by Breiman (1996a), has proved effective at improving unstable forecasts. Theoretical and empirical work on classification, regression trees, and variable selection in linear and non-linear regression has shown that bagging can generate substantial prediction gains. However, most of the existing literature on bagging has been limited to t...

2007
Christophe Croux, Kristel Joossens, Aurélie Lemmens

Bagging has been found to be successful in increasing the predictive performance of unstable classifiers. Bagging draws bootstrap samples from the training sample, applies the classifier to each bootstrap sample, and then averages over all obtained classification rules. The idea of trimmed bagging is to exclude the bootstrapped classification rules that yield the highest error rates, as estimat...
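
The trimming idea translates directly into code. In this sketch, each bootstrap classifier's error rate is estimated on a held-out split (the paper uses a different estimate), and the worst quarter is excluded before voting; the trim fraction and all other parameters are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=800, random_state=5)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=5)
rng = np.random.default_rng(5)

pool, errors = [], []
for _ in range(40):
    idx = rng.integers(0, len(X_tr), len(X_tr))        # bootstrap sample
    m = DecisionTreeClassifier().fit(X_tr[idx], y_tr[idx])
    pool.append(m)
    errors.append((m.predict(X_val) != y_val).mean())  # estimated error rate

keep = np.argsort(errors)[: int(0.75 * len(pool))]     # trim the worst 25%
votes = np.mean([pool[i].predict(X_val) for i in keep], axis=0)
y_pred = (votes >= 0.5).astype(int)                    # vote of trimmed ensemble
```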

Journal: Statistical Analysis and Data Mining 2008
Shohei Hido, Hisashi Kashima

Imbalanced class problems appear in many real applications of classification learning. We propose a novel sampling method to improve bagging for data sets with skewed class distributions. In our new sampling method, “Roughly Balanced Bagging” (RB Bagging), the numbers of samples in the largest and smallest classes differ, but they are effectively balanced when averaged over all subsets, wh...
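
A sketch of the sampling scheme the abstract names: each subset keeps the minority count fixed while the majority count is drawn from a negative binomial whose mean equals the minority count, so subsets differ in size but balance on average. The negative-binomial choice follows the published RB Bagging method; the data and all parameters below are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, weights=[0.9], random_state=6)
rng = np.random.default_rng(6)
pos, neg = np.flatnonzero(y == 1), np.flatnonzero(y == 0)  # minority, majority
n_min = len(pos)

ensemble = []
for _ in range(30):
    # NegBin(n_min, 0.5) has mean n_min, so subsets balance only on average.
    n_maj = max(rng.negative_binomial(n_min, 0.5), 1)
    idx = np.concatenate([rng.choice(pos, n_min, replace=True),
                          rng.choice(neg, n_maj, replace=True)])
    ensemble.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

# Aggregate by averaging the predicted positive-class probabilities.
proba = np.mean([m.predict_proba(X)[:, 1] for m in ensemble], axis=0)
```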

2004
Yang Liu, Elizabeth Shriberg, Andreas Stolcke, Mary P. Harper

We investigate machine learning techniques for coping with highly skewed class distributions in two spontaneous speech processing tasks. Both tasks, sentence boundary and disfluency detection, provide important structural information for downstream language processing modules. We examine the effect of data set size, task, sampling method (no sampling, downsampling, oversampling, and ensemble sa...
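
Of the sampling methods listed, downsampling and oversampling are easy to make concrete; the sketch below builds both resampled index sets for a skewed binary problem. Shapes and the class ratio are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(1000, 5))
y = (rng.random(1000) < 0.1).astype(int)  # ~10% positives: skewed classes

pos, neg = np.flatnonzero(y == 1), np.flatnonzero(y == 0)

# Downsampling: keep all minority examples, subsample the majority to match.
down = np.concatenate([pos, rng.choice(neg, size=len(pos), replace=False)])

# Oversampling: keep all majority examples, resample the minority to match.
over = np.concatenate([neg, rng.choice(pos, size=len(neg), replace=True)])

X_down, y_down = X[down], y[down]
X_over, y_over = X[over], y[over]
```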

Chart: number of search results per year