Search results for: bagging committee

Number of results: 4107

2000
Andreas Buja, Werner Stuetzle

Bagging is a device intended for reducing the prediction error of learning algorithms. In its simplest form, bagging draws bootstrap samples from the training sample, applies the learning algorithm to each bootstrap sample, and then averages the resulting prediction rules. We study the von Mises expansion of a bagged statistical functional and show that it is related to the Stein-Efron ANOVA ex...
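
As a rough illustration of the scheme this abstract describes (bootstrap, fit, average), here is a minimal Python sketch; the function and parameter names are assumptions for illustration, the inputs are assumed to be NumPy arrays, and the base learner is any object with fit/predict methods, not the authors' code:

    import numpy as np

    def bagged_predict(make_learner, X_train, y_train, X_test, n_bags=25, seed=0):
        # Draw bootstrap samples, fit one model per sample, average the predictions.
        rng = np.random.default_rng(seed)
        n = len(X_train)
        preds = []
        for _ in range(n_bags):
            idx = rng.integers(0, n, size=n)        # bootstrap sample (with replacement)
            model = make_learner()                  # fresh instance of the learning algorithm
            model.fit(X_train[idx], y_train[idx])   # apply it to the bootstrap sample
            preds.append(model.predict(X_test))
        return np.mean(preds, axis=0)               # average the resulting prediction rules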

1996
J. Ross Quinlan

Breiman's bagging and Freund and Schapire's boosting are recent methods for improving the predictive power of classifier learning systems. Both form a set of classifiers that are combined by voting, bagging by generating replicated bootstrap samples of the data, and boosting by adjusting the weights of training instances. This paper reports results of applying both techniques to a system that le...
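
To make the voting step concrete, here is a small, hedged Python sketch of plain (bagging-style) and weighted (boosting-style) majority voting; the names and the example data are illustrative only and do not come from Quinlan's paper:

    from collections import Counter

    def majority_vote(labels, weights=None):
        # labels: one predicted class per ensemble member;
        # weights: optional per-member vote weights (uniform votes for bagging).
        if weights is None:
            weights = [1.0] * len(labels)
        tally = Counter()
        for label, w in zip(labels, weights):
            tally[label] += w
        return tally.most_common(1)[0][0]

    # e.g. three classifiers vote 'spam', 'spam', 'ham' -> the ensemble predicts 'spam'
    print(majority_vote(['spam', 'spam', 'ham']))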

2011
Quan Sun, Bernhard Pfahringer

Ensemble selection has recently appeared as a popular ensemble learning method, not only because its implementation is fairly straightforward, but also due to its excellent predictive performance on practical problems. The method has been highlighted in winning solutions of many data mining competitions, such as the Netflix competition, the KDD Cup 2009 and 2010, the UCSD FICO contest 2010, and...
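
As a hedged sketch of the greedy, forward model selection usually meant by ensemble selection (the abstract itself does not spell the procedure out), the following Python fragment repeatedly adds the library model, with replacement, that most improves a validation metric of the averaged prediction; all names and the dict-of-arrays interface are assumptions:

    import numpy as np

    def ensemble_select(library_preds, y_val, metric, n_rounds=20):
        # library_preds: dict model_name -> NumPy array of validation predictions
        # metric: callable(y_true, y_pred) returning a score to maximize
        chosen, running_sum = [], None
        for _ in range(n_rounds):
            best_name, best_score = None, -np.inf
            for name, preds in library_preds.items():
                cand = preds if running_sum is None else (running_sum + preds) / (len(chosen) + 1)
                score = metric(y_val, cand)
                if score > best_score:
                    best_name, best_score = name, score
            chosen.append(best_name)  # selection with replacement: a model may be picked again
            running_sum = library_preds[best_name] if running_sum is None \
                else running_sum + library_preds[best_name]
        return chosen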

1998
Bruce A. Draper, Kyungim Baek

Previous research has shown that aggregated predictors improve the performance of non-parametric function approximation techniques. This paper presents the results of applying aggregated predictors to a computer vision problem, and shows that the method of bagging significantly improves performance. In fact, the results are better than those previously reported on other domains. This paper expla...

2003
J. R. Quinlan

Breiman's bagging and Freund and Schapire's boosting are recent methods for improving the predictive power of classifier learning systems. Both form a set of classifiers that are combined by voting, bagging by generating replicated bootstrap samples of the data, and boosting by adjusting the weights of training instances. This paper reports results of applying both techniques to a system that learn...

2016
Olcay Taner Yildiz, Ozan Irsoy, Ethem Alpaydin

The decision tree is one of the earliest predictive models in machine learning. In the soft decision tree, based on the hierarchical mixture of experts model, internal binary nodes take soft decisions and choose both children with probabilities given by a sigmoid gating function. Hence for an input, all the paths to all the leaves are traversed and all those leaves contribute to the final decis...
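
A tiny, hedged Python illustration of the soft-decision mechanism described here: each internal node gates the input to both children with sigmoid probabilities, so every leaf contributes to the final prediction. The dictionary-based tree structure and names are assumptions for illustration, not the authors' implementation:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def soft_tree_predict(node, x):
        if node["leaf"]:                        # leaf: return its response value
            return node["value"]
        g = sigmoid(node["w"] @ x + node["b"])  # gating probability for the left child
        return g * soft_tree_predict(node["left"], x) + \
               (1 - g) * soft_tree_predict(node["right"], x)

    # Example: one internal node gating between two constant leaves
    tree = {"leaf": False, "w": np.array([1.0, -1.0]), "b": 0.0,
            "left": {"leaf": True, "value": 1.0},
            "right": {"leaf": True, "value": 0.0}}
    print(soft_tree_predict(tree, np.array([2.0, 0.5])))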

2006
Anneleen Van Assche, Hendrik Blockeel

Bagging is a well-known and widely used ensemble method. It operates by repeatedly bootstrapping the data set and invoking a base classifier on these different bootstraps. By learning several models (and combining them), it tends to increase predictive accuracy, while sacrificing efficiency. Because of this, it becomes slow for large-scale data sets. In this paper we propose a method that simulate...

2006
Katerina Taškova, Panče Panov, Andrej Kobler, Sašo Džeroski, Daniela Stojanova

This work focuses on comparing different data mining techniques and their performance in building predictive models of forest stand properties from satellite images. We used the WEKA data mining environment to implement our numeric prediction experiments, applying linear regression, model (regression) trees, and bagging. The best results (with regard to correlation) we obtaine...

2010
Lei Zhang, Guiquan Liu, Xuechen Zhang, Song Jiang, Enhong Chen

Storage device performance prediction is a key element of self-managed storage systems and application planning tasks, such as data assignment and configuration. Based on the bagging ensemble method, we propose an algorithm named selective bagging classification and regression tree (SBCART) to model storage device performance. In addition, we consider the caching effect as a feature in workload character...

Chart of the number of search results per year
