Search results for: bagging

Number of results: 2077

2003
Luis Daza

A combination of classification rules (classifiers) is known as an Ensemble, and in general it is more accurate than the individual classifiers used to build it. Two popular methods to construct an Ensemble are Bagging (Bootstrap aggregating) introduced by Breiman, [4] and Boosting (Freund and Schapire, [11]). Both methods rely on resampling techniques to obtain different training sets for each...
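The bootstrap-aggregating procedure this abstract summarizes can be sketched in a few lines. This is a toy illustration, not the implementation from any of the cited papers: the threshold-stump base learner and the one-dimensional data are invented for the example. Each base model is trained on a bootstrap resample (drawn with replacement) of the training set, and predictions are combined by majority vote.

```python
import random
from collections import Counter

def train_stump(data):
    # Fit a 1-D threshold classifier minimizing training error.
    # data: list of (x, label) pairs with labels in {0, 1}.
    best = None
    for thr, _ in data:
        for sign in (1, -1):
            err = sum(1 for xi, yi in data
                      if (1 if sign * (xi - thr) >= 0 else 0) != yi)
            if best is None or err < best[0]:
                best = (err, thr, sign)
    _, thr, sign = best
    return lambda x: 1 if sign * (x - thr) >= 0 else 0

def bagging(data, n_models=25, seed=0):
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        # Bootstrap resample: same size as data, drawn with replacement.
        sample = [rng.choice(data) for _ in data]
        models.append(train_stump(sample))
    def ensemble(x):
        # Aggregate by majority vote over the base classifiers.
        votes = Counter(m(x) for m in models)
        return votes.most_common(1)[0][0]
    return ensemble

data = [(0.1, 0), (0.3, 0), (0.4, 0), (0.6, 1), (0.8, 1), (0.9, 1)]
clf = bagging(data)
print(clf(0.2), clf(0.85))
```

Individual stumps trained on a resample can misclassify (a resample may omit the points near the class boundary), but the majority vote over many resamples is more stable, which is the variance-reduction effect Breiman describes.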

Journal: :Expert Syst. Appl. 2014
Joaquín Abellán Carlos Javier Mantas

Previous studies on ensembles of classifiers for bankruptcy prediction and credit scoring have applied different ensemble schemes to complex classifiers, with the best results obtained using the Random Subspace method. The Bagging scheme was one of the ensemble methods used in the comparison. However, it was not correctly applied. It is very important...

2009
Mohamad Adnan Al-Alaoui

The relation of the Al-Alaoui pattern recognition algorithm to the boosting and bagging approaches to pattern recognition is delineated. It is shown that the Al-Alaoui algorithm shares with bagging and boosting the concepts of replicating and weighting instances of the training set. Additionally, it is shown that the Al-Alaoui algorithm provides a Mean Square Error (MSE) asymptotic Bayesian appr...

2006
Tian-Yu Liu Guo-Zheng Li Yue Liu Gengfeng Wu Wei Wang

Earthquakes do great harm to people, and estimating the future earthquake situation in mainland China is still an open issue. There have been previous attempts to solve this problem using artificial neural networks. In this paper, a novel algorithm named MIFEB is proposed to improve the estimation accuracy by combining bagging of neural networks with mutual information based feature s...

2016
Yousef M. T. El Gimati

Stratified sampling is often used in opinion polls to reduce standard errors, and it is known as a variance reduction technique in sampling theory. The most common approach to resampling is based on bootstrapping the dataset with replacement. A main purpose of this work is to investigate extensions of the resampling methods in classification problems; specifically, we use decision trees, fr...
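A stratified bootstrap, as hinted at by this abstract, resamples within each class so that every bootstrap replicate preserves the original class proportions. The sketch below is a generic illustration of that idea, not the specific procedure from the paper; the data and helper name are invented:

```python
import random
from collections import defaultdict

def stratified_bootstrap(data, rng):
    # Resample with replacement within each class,
    # so per-class counts match the original dataset exactly.
    by_label = defaultdict(list)
    for x, y in data:
        by_label[y].append((x, y))
    sample = []
    for group in by_label.values():
        sample.extend(rng.choice(group) for _ in group)
    return sample

rng = random.Random(1)
data = [(i, 0) for i in range(8)] + [(i, 1) for i in range(2)]
s = stratified_bootstrap(data, rng)
print(sum(1 for _, y in s if y == 1))  # minority-class count preserved: always 2
```

A plain bootstrap of this imbalanced dataset would sometimes draw zero minority-class examples; stratifying rules that out, which is the variance-reduction motivation carried over from sampling theory.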

2003
Lawrence O. Hall Kevin W. Bowyer Robert E. Banfield Divya Bhadoria W. Philip Kegelmeyer Steven Eschrich

We experimentally evaluate bagging and seven other randomization-based approaches to creating an ensemble of decision-tree classifiers. Unlike methods related to boosting, all of the eight approaches create each classifier in an ensemble independently of the other classifiers in the ensemble. Bagging uses randomization to create multiple training sets. Other approaches, such as those of Dietter...

Journal: :Neurocomputing 2012
Jarek Krajewski Sebastian Schnieder David Sommer Anton Batliner Björn W. Schuller

Comparing different novel feature sets and classifiers for speech processing based fatigue detection is the primary aim of this study. Thus, we conducted a within-subject partial sleep deprivation design (20.00–04.00 h, N = 77 participants) and recorded 372 speech samples of sustained vowel phonation. The self-report on the Karolinska Sleepiness Scale (KSS) and an observer report on the KSS, th...

2011
Jarek Krajewski Sebastian Schnieder David Sommer Anton Batliner Björn Schuller

Comparing different novel feature sets and classifiers for speech processing based fatigue detection is the primary aim of this study. Thus, we conducted a within-subject partial sleep deprivation design (20.00–04.00 h, N = 77 participants) and recorded 372 speech samples of sustained vowel phonation. The self-report on the Karolinska Sleepiness Scale (KSS), and an observer report on the KSS,...

2004
Vicent Estruch César Ferri José Hernández-Orallo M. José Ramírez-Quintana

Ensemble methods improve accuracy by combining the predictions of a set of different hypotheses. A well-known method for generating hypothesis ensembles is Bagging. One of the main drawbacks of ensemble methods in general, and Bagging in particular, is the huge amount of computational resources required to learn, store, and apply the set of models. Another problem is that even using the bootstr...

2002
Philip Derbeko Ran El-Yaniv Ron Meir

We propose and study a new technique for aggregating an ensemble of bootstrapped classifiers. In this method we seek a linear combination of the base-classifiers such that the weights are optimized to reduce variance. Minimum variance combinations are computed using quadratic programming. This optimization technique is borrowed from Mathematical Finance where it is called Markowitz Mean-Varianc...
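The minimum-variance weighting this abstract describes has a simple closed form when the only constraint is that the weights sum to one (the full Markowitz-style quadratic program additionally imposes non-negativity, which generally requires a QP solver). The sketch below illustrates only the unconstrained-sign case on synthetic data; the error matrix and its scaling are invented for the example:

```python
import numpy as np

# Toy per-classifier "errors": rows are bootstrap replicates,
# columns are base classifiers, with different error scales per column.
rng = np.random.default_rng(0)
errors = rng.normal(size=(200, 3)) @ np.diag([1.0, 0.5, 2.0])

cov = np.cov(errors, rowvar=False)   # covariance of classifier errors
ones = np.ones(cov.shape[0])
w = np.linalg.solve(cov, ones)       # w ∝ cov^{-1} · 1 (minimum-variance direction)
w /= w.sum()                         # normalize so the weights sum to 1

print(np.round(w, 3))
```

By construction, the variance of the weighted combination, w·cov·w, is no larger than that of the uniform average, which is the sense in which the optimized linear combination improves on plain bagging's equal weights.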
