Search results for: bagging model

Number of results: 2,105,681

Journal: Statistical Analysis and Data Mining, 2008
Shohei Hido, Hisashi Kashima

Imbalanced class problems appear in many real applications of classification learning. We propose a novel sampling method to improve bagging for data sets with skewed class distributions. In our new sampling method, “Roughly Balanced Bagging” (RB Bagging), the numbers of samples in the largest and smallest classes differ, but they are effectively balanced when averaged over all subsets, wh...
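
As a rough illustration of the sampling scheme described above, the sketch below bootstraps the minority class at its full size and draws the majority-class sample size from a negative binomial distribution, one common reading of how RB Bagging balances the classes only on average. The binary 0/1 labeling, function names, and base learner are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def rb_bagging_fit(X, y, n_estimators=50, rng=None):
    """X, y are numpy arrays; class 1 is assumed to be the minority."""
    rng = np.random.default_rng(rng)
    minority = np.flatnonzero(y == 1)
    majority = np.flatnonzero(y == 0)
    n_min = len(minority)
    models = []
    for _ in range(n_estimators):
        # Minority class: ordinary bootstrap of size n_min.
        min_idx = rng.choice(minority, size=n_min, replace=True)
        # Majority class: size drawn from a negative binomial with p = 0.5,
        # so it matches n_min only in expectation ("roughly balanced").
        n_maj = rng.negative_binomial(n_min, 0.5)
        maj_idx = rng.choice(majority, size=max(n_maj, 1), replace=True)
        idx = np.concatenate([min_idx, maj_idx])
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def rb_bagging_predict(models, X):
    # Average the per-tree class-1 probabilities and threshold at 0.5.
    probs = np.mean([m.predict_proba(X)[:, 1] for m in models], axis=0)
    return (probs >= 0.5).astype(int)
```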

Journal: Entropy, 2018
Hossein Foroozand, Valentina Radic, Steven V. Weijs

Recently, the Entropy Ensemble Filter (EEF) method was proposed to mitigate the computational cost of the Bootstrap AGGregatING (bagging) method. This method uses the most informative training data sets in the model ensemble rather than all ensemble members created by conventional bagging. In this study, we evaluate, for the first time, the application of the EEF method in Neural Network (N...
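
The abstract gives only the idea of the EEF method, so the following is a hedged sketch under the assumption that bootstrap training sets are ranked by an empirical Shannon entropy of their target values and only the top fraction is retained; the binning and keep_fraction values are placeholders, not the published configuration.

```python
import numpy as np

def shannon_entropy(values, bins=20):
    # Histogram-based estimate of Shannon entropy (in bits).
    counts, _ = np.histogram(values, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def entropy_ensemble_filter(bootstrap_targets, keep_fraction=0.5):
    """bootstrap_targets: list of 1-D arrays, one per bootstrap sample.
    Returns the indices of the retained (most informative) samples."""
    scores = [shannon_entropy(t) for t in bootstrap_targets]
    n_keep = max(1, int(len(scores) * keep_fraction))
    order = np.argsort(scores)[::-1]        # highest entropy first
    return sorted(order[:n_keep].tolist())
```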

2014
Vikas Singh, Madhavi Ajay Pradhan

Looking back a few years, we find that ensemble classification models have generated a great deal of research and publication in the data mining community on how to combine models, or model predictions, so as to reduce the resulting error. When we ensemble the predictions of more than one classifier, more accurate and robust models are generated. It is conventional wisdom that bagging, boosting wit...
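
As context for the claim that ensembling predictions yields more accurate and robust models, here is a minimal scikit-learn comparison of a single decision tree against a bagged ensemble of the same tree; the synthetic dataset and all parameter values are arbitrary placeholders.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Toy data stands in for a real classification task.
X, y = make_classification(n_samples=500, random_state=0)

single = DecisionTreeClassifier(random_state=0)
bagged = BaggingClassifier(single, n_estimators=50, random_state=0)

# Cross-validated accuracy of one tree vs. the bagged ensemble.
print("single tree :", cross_val_score(single, X, y, cv=5).mean())
print("bagged trees:", cross_val_score(bagged, X, y, cv=5).mean())
```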

2012
Meishan Zhang, Wanxiang Che, Ting Liu

We describe our method of traditional Phrase Structure Grammar (PSG) parsing in CIPS-Bakeoff2012 Task3. First, bagging is proposed to enhance the baseline performance of PSG parsing. Then we suggest exploiting another TreeBank (CTB7.0) to improve the performance further. Experimental results on the development data set demonstrate that bagging can boost the baseline F1 score from 81.33% to 84.4...

Journal: Data Knowl. Eng., 2008
Zoran Bosnic, Igor Kononenko

The paper compares different approaches to estimate the reliability of individual predictions in regression. We compare the sensitivity-based reliability estimates developed in our previous work with four approaches found in the literature: variance of bagged models, local cross-validation, density estimation, and local modeling. By combining pairs of individual estimates, we compose a combined...
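
One of the literature approaches listed above, the variance of bagged models, is straightforward to sketch: the spread of the individual bootstrap models' predictions serves as a per-prediction reliability signal. The setup below is an illustrative assumption, not the authors' experimental protocol.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=300, noise=10.0, random_state=0)
bag = BaggingRegressor(DecisionTreeRegressor(), n_estimators=50,
                       random_state=0).fit(X, y)

# Per-example spread of the individual bootstrap models' predictions:
# a larger variance suggests a less reliable individual prediction.
per_model = np.stack([est.predict(X) for est in bag.estimators_])
reliability = per_model.var(axis=0)
print(reliability[:5])
```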

2000
Daniel Grossman, Tammy Williams

Two ensemble learning methods, Bagging and Boosting, have been applied to decision trees to improve classification accuracy over that of a single decision tree learner. We introduce Bagging and propose a variant of it, Improved Bagging, which in general outperforms the original bagging algorithm. We experiment on 22 datasets from the UCI repository, with emphasis on the ensemble’s accuracy ...
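
For reference, here is a from-scratch sketch of the basic bagging loop the abstract describes: bootstrap the training set, fit one tree per sample, and majority-vote at prediction time. The Improved Bagging variant is not reconstructed, since the abstract does not specify it; numpy inputs and integer class labels are assumed.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_bagging(X, y, n_estimators=25, rng=None):
    rng = np.random.default_rng(rng)
    n = len(X)
    trees = []
    for _ in range(n_estimators):
        # Bootstrap sample: n draws with replacement.
        idx = rng.integers(0, n, size=n)
        trees.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return trees

def predict_bagging(trees, X):
    # Majority vote per example; assumes non-negative integer labels.
    votes = np.stack([t.predict(X) for t in trees])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```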

2005
Yuk Lai Suen, Prem Melville, Raymond J. Mooney

Gradient Boosting and bagging applied to regressors can reduce the error due to bias and variance respectively. Alternatively, Stochastic Gradient Boosting (SGB) and Iterated Bagging (IB) attempt to simultaneously reduce the contribution of both bias and variance to error. We provide an extensive empirical analysis of these methods, along with two alternate bias-variance reduction approaches — ...
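
Stochastic Gradient Boosting as characterized above, boosting in which each stage fits on a random subsample of the training data, can be approximated in scikit-learn by setting subsample below 1.0 on GradientBoostingRegressor; the data and hyperparameter values here are illustrative.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=400, noise=5.0, random_state=0)

# subsample=0.5: each boosting stage sees a random half of the data.
sgb = GradientBoostingRegressor(n_estimators=200, subsample=0.5,
                                learning_rate=0.05, random_state=0)
print("SGB R^2:", cross_val_score(sgb, X, y, cv=5).mean())
```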

Journal: Computational Statistics & Data Analysis, 2007
Christophe Croux, Kristel Joossens, Aurélie Lemmens

Bagging has been found to be successful in increasing the predictive performance of unstable classifiers. Bagging draws bootstrap samples from the training sample, applies the classifier to each bootstrap sample, and then averages over all obtained classification rules. The idea of trimmed bagging is to exclude the bootstrapped classification rules that yield the highest error rates, as estimat...
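
A minimal sketch of trimmed bagging as summarized in this abstract: fit bootstrap classifiers, estimate each rule's error, and discard the highest-error rules before combining. The trim fraction, the use of out-of-bag examples for the error estimate, and the base learner are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def trimmed_bagging(X, y, n_estimators=50, trim=0.25, rng=None):
    rng = np.random.default_rng(rng)
    n = len(X)
    fitted = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)          # bootstrap sample
        oob = np.setdiff1d(np.arange(n), idx)     # out-of-bag examples
        clf = DecisionTreeClassifier().fit(X[idx], y[idx])
        # Error estimated on the held-out (out-of-bag) examples.
        err = np.mean(clf.predict(X[oob]) != y[oob]) if len(oob) else 1.0
        fitted.append((err, clf))
    # Keep the (1 - trim) fraction of rules with the lowest error.
    fitted.sort(key=lambda pair: pair[0])
    keep = fitted[: int(np.ceil(n_estimators * (1 - trim)))]
    return [clf for _, clf in keep]
```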

Journal: Statistics and Its Interface, 2016

[Chart: number of search results per year]