Search results for: bagging committee

Number of results: 4107

Journal: Journal of Statistical Planning and Inference 2007

Frédéric RATLE Devis TUIA

This paper investigates the use of ensembles of predictors to improve the performance of spatial prediction methods. Support vector regression (SVR), a popular method from the field of statistical machine learning, is used. Several instances of SVR are combined using different data sampling schemes (bagging and boosting). Bagging shows good performance, and proves to be more computation...
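A minimal sketch of this scheme, assuming scikit-learn (>= 1.2 for the estimator keyword); make_regression stands in for the paper's spatial data, and the SVR hyperparameters are illustrative:

from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

# Synthetic stand-in for a spatial prediction task (2 coordinate features).
X, y = make_regression(n_samples=500, n_features=2, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each base SVR is fit on a bootstrap sample; predictions are averaged.
ensemble = BaggingRegressor(estimator=SVR(kernel="rbf", C=10.0),
                            n_estimators=25, random_state=0)
ensemble.fit(X_train, y_train)
print("R^2 on held-out data:", ensemble.score(X_test, y_test))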

2016
Andreas Buja Werner Stuetzle

Bagging is a device intended for reducing the prediction error of learning algorithms. In its simplest form, bagging draws bootstrap samples from the training sample, applies the learning algorithm to each bootstrap sample, and then averages the resulting prediction rules. We extend the definition of bagging from statistics to statistical functionals and study the von Mises expansion of bagged ...
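The simplest form described here fits in a few lines; a from-scratch sketch in Python, assuming scikit-learn only for the base learner and example data:

import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

def bagged_predict(X_train, y_train, X_test, n_bags=50, seed=0):
    """Draw bootstrap samples, fit the learner on each, average predictions."""
    rng = np.random.default_rng(seed)
    n = len(X_train)
    preds = []
    for _ in range(n_bags):
        idx = rng.integers(0, n, size=n)  # bootstrap sample, with replacement
        model = DecisionTreeRegressor().fit(X_train[idx], y_train[idx])
        preds.append(model.predict(X_test))
    return np.mean(preds, axis=0)         # average the prediction rules

X, y = make_regression(n_samples=300, n_features=5, noise=5.0, random_state=1)
y_hat = bagged_predict(X[:250], y[:250], X[250:])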

2015
Gianni Franchi Jesús Angulo

The stochastic watershed is a probabilistic segmentation approach which estimates the probability density of contours of the image from a given gradient. In complex images, the stochastic watershed can enhance insignificant contours. To partially address this drawback, we introduce here a fully unsupervised multi-scale approach including bagging. Re-sampling and bagging is a classical stochasti...
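For reference, a hedged sketch of the classical stochastic watershed that this work builds on (not the multi-scale bagging extension itself), assuming scikit-image; the seed count and number of realizations are illustrative:

import numpy as np
from skimage import data, filters
from skimage.segmentation import find_boundaries, watershed

image = data.coins()                      # stand-in test image
gradient = filters.sobel(image)

rng = np.random.default_rng(0)
n_realizations, n_seeds = 50, 40
prob = np.zeros(image.shape)
for _ in range(n_realizations):
    # Segment from random point seeds, then accumulate the contours.
    markers = np.zeros(image.shape, dtype=int)
    rows = rng.integers(0, image.shape[0], n_seeds)
    cols = rng.integers(0, image.shape[1], n_seeds)
    markers[rows, cols] = np.arange(1, n_seeds + 1)
    labels = watershed(gradient, markers)
    prob += find_boundaries(labels)
prob /= n_realizations                    # empirical contour density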

Journal: Pattern Recognition 2010
Gonzalo Martínez-Muñoz Alberto Suárez

The performance of m-out-of-n bagging with and without replacement is analyzed in terms of the sampling ratio (m/n). Standard bagging uses resampling with replacement to generate bootstrap samples of the same size as the original training set (m_wr = n). Without-replacement methods typically use half-samples (m_wor = n/2). These choices of sampling size are arbitrary and need not be optimal in terms of...
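In scikit-learn terms, the two regimes correspond to the max_samples ratio (m/n) and the bootstrap flag; a sketch (the ratios are illustrative, not the paper's recommendations):

from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, random_state=0)

for ratio in (0.2, 0.5, 0.8, 1.0):
    for with_replacement in (True, False):
        clf = BaggingClassifier(estimator=DecisionTreeClassifier(),
                                n_estimators=100, max_samples=ratio,
                                bootstrap=with_replacement, random_state=0)
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"m/n={ratio:.1f} replacement={with_replacement}: {acc:.3f}")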

2006
Zhuo Zheng

Boosting and bagging are two techniques for improving the performance of learning algorithms. Both techniques have been used successfully in machine learning to improve the performance of classification algorithms such as decision trees and neural networks. In this paper, we focus on the use of feedforward backpropagation neural networks for time series classification problems. We apply boosting ...
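Since sklearn's MLPClassifier accepts no per-sample weights, boosting feedforward networks can be sketched by resampling (an AdaBoost.M1-style loop); the dataset and hyperparameters below are illustrative stand-ins for the paper's time-series features:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
n = len(X)
rng = np.random.default_rng(0)
w = np.full(n, 1.0 / n)                    # uniform example weights
models, alphas = [], []

for _ in range(10):                        # boosting rounds
    idx = rng.choice(n, size=n, replace=True, p=w)   # weight-driven resample
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                        random_state=0).fit(X[idx], y[idx])
    miss = clf.predict(X) != y
    eps = float(np.dot(w, miss))
    if eps == 0 or eps >= 0.5:             # stop if perfect or too weak
        break
    alpha = 0.5 * np.log((1 - eps) / eps)
    models.append(clf)
    alphas.append(alpha)
    w *= np.exp(alpha * np.where(miss, 1.0, -1.0))   # up-weight mistakes
    w /= w.sum()

# Weighted committee vote (binary labels 0/1).
votes = sum(a * np.where(m.predict(X) == 1, 1.0, -1.0)
            for m, a in zip(models, alphas))
pred = np.where(votes > 0, 1, 0)
print("training accuracy:", (pred == y).mean())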

2014
Maryam Sabzevari Gonzalo Martínez-Muñoz Alberto Suárez

Bagging is a simple classification algorithm that is robust in the presence of class-label noise. This algorithm builds an ensemble of classifiers from bootstrap samples drawn with replacement, each of the same size as the original training set. However, several studies have shown that this choice of sampling size is arbitrary in terms of the generalization performance of the ensemble. In this study we discuss how ...
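A sketch of such a comparison under stated assumptions: flip a fraction of training labels at random, then vary bagging's sampling size via max_samples (the 20% noise level and the ratios are illustrative):

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
flip = rng.random(len(y_tr)) < 0.20       # corrupt 20% of training labels
y_noisy = np.where(flip, 1 - y_tr, y_tr)

for ratio in (0.1, 0.3, 0.5, 1.0):
    clf = BaggingClassifier(estimator=DecisionTreeClassifier(),
                            n_estimators=100, max_samples=ratio,
                            random_state=0).fit(X_tr, y_noisy)
    print(f"m/n={ratio:.1f}: clean test accuracy={clf.score(X_te, y_te):.3f}")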

1996
Leo Breiman

Recent work has shown that combining multiple versions of unstable classifiers, such as trees or neural nets, results in reduced test-set error. To study this, the concepts of the bias and variance of a classifier are defined. Unstable classifiers can have universally low bias; their problem is high variance. Combining multiple versions is a variance-reducing device. One of the most effective is bagg...
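The variance-reduction effect is easy to observe; a small sketch, assuming scikit-learn, comparing one unstable tree with a bagged committee of trees on synthetic data:

from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, flip_y=0.05, random_state=0)

single = DecisionTreeClassifier(random_state=0)
bagged = BaggingClassifier(estimator=DecisionTreeClassifier(),
                           n_estimators=100, random_state=0)
print("single tree :", cross_val_score(single, X, y, cv=10).mean())
print("bagged trees:", cross_val_score(bagged, X, y, cv=10).mean())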

2001
Terry Windeatt Gholamreza Ardeshir

Many researchers have shown that ensemble methods such as Boosting and Bagging improve the accuracy of classification. Boosting and Bagging perform well with unstable learning algorithms such as neural networks or decision trees. Pruning decision tree classifiers is intended to make trees simpler and more comprehensible, and to avoid over-fitting. However, it is known that pruning individual classif...
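One way to probe this question, assuming scikit-learn: control the pruning strength of the base trees via cost-complexity pruning (ccp_alpha; the values below are illustrative) and compare the resulting bagged ensembles:

from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, random_state=0)

for alpha in (0.0, 0.005, 0.02):          # 0.0 = unpruned base trees
    clf = BaggingClassifier(estimator=DecisionTreeClassifier(ccp_alpha=alpha),
                            n_estimators=50, random_state=0)
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"ccp_alpha={alpha}: accuracy={acc:.3f}")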

[Chart: number of search results per publication year]