Search results for: bagging committee

Number of results: 4107

2006
Vishakh

Machine Learning tools are increasingly being applied to analyze data from microarray experiments. These include ensemble methods where weighted votes of constructed base classifiers are used to classify data. We compare the performance of AdaBoost, bagging and BagBoost on gene expression data from the yeast cell cycle. AdaBoost was found to be more effective for the data than bagging. BagBoost...
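
A minimal sketch of this kind of comparison, using scikit-learn on synthetic data; the synthetic matrix stands in for the yeast cell-cycle expression data, all estimator settings are illustrative rather than the paper's, and BagBoost is omitted since it has no stock scikit-learn implementation:

```python
# Compare AdaBoost and bagging by cross-validated accuracy on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=400, n_features=50, n_informative=10,
                           random_state=0)

models = {
    "AdaBoost": AdaBoostClassifier(n_estimators=100, random_state=0),
    "Bagging": BaggingClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```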

2008
Dimitris N. Politis

The problem of large-scale simultaneous hypothesis testing is revisited. Bagging and subagging procedures are put forth to improve the discovery power of the tests. The procedures are applied to both simulated and real data. It is shown that bagging and subagging significantly improve power at the cost of a small increase in false discovery rate with the proposed ‘maximum...
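
One common reading of subagging, sketched below under assumptions: the test statistic is recomputed on subsamples drawn without replacement and the results are averaged. The statistic, subsample fraction, and data are hypothetical choices; the paper's exact procedures, including the truncated 'maximum...' variant, are not reproduced here.

```python
# Hedged sketch of subagging a test statistic.
import numpy as np

rng = np.random.default_rng(0)

def t_statistic(x):
    """One-sample t-statistic for testing a zero mean."""
    return x.mean() / (x.std(ddof=1) / np.sqrt(len(x)))

def subagged_statistic(x, n_subsamples=200, frac=0.5):
    m = max(2, int(frac * len(x)))  # subsample size (illustrative choice)
    stats = [t_statistic(rng.choice(x, size=m, replace=False))
             for _ in range(n_subsamples)]
    return float(np.mean(stats))

x = rng.normal(loc=0.3, scale=1.0, size=50)  # data for one simulated test
print("plain t-statistic:   ", t_statistic(x))
print("subagged t-statistic:", subagged_statistic(x))
```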

2017
Eric Nalisnick Padhraic Smyth

We use amortized inference in conjunction with implicit models to approximate the bootstrap distribution over model parameters. We call this the amortized bootstrap, as statistical strength is shared across dataset replicates through a metamodel. At test time, we can then perform amortized bagging by drawing multiple samples from the implicit model. We find amortized bagging outperforms bagging...

1996
David H. Wolpert William G. Macready

In bagging [Bre94a] one uses bootstrap replicates of the training set [Efr79, ET93] to improve a learning algorithm's performance, often by tens of percent. This paper presents several ways that stacking [Wol92b, Bre92] can be used in concert with the bootstrap procedure to achieve a further improvement on the performance of bagging for some regression problems. In particular, in some of the work ...
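
A hedged sketch of the general idea of combining stacking with the bootstrap: bootstrap-trained regressors are combined by a learned linear rule instead of the uniform average plain bagging uses. The member count, base learner, and data are illustrative choices, not the paper's setup.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=200)

# Level 0: regressors fit on bootstrap replicates of the training set.
n = len(X)
members = []
for _ in range(10):
    idx = rng.integers(0, n, size=n)  # bootstrap replicate (with replacement)
    members.append(
        DecisionTreeRegressor(max_depth=4, random_state=0).fit(X[idx], y[idx]))

preds = np.column_stack([m.predict(X) for m in members])

bagged = preds.mean(axis=1)                 # plain bagging: uniform average
stacker = LinearRegression().fit(preds, y)  # stacking: learned combination
stacked = stacker.predict(preds)

# NOTE: proper stacking fits the combiner on out-of-bag or cross-validated
# predictions to avoid overfitting; in-sample fitting keeps the sketch short.
print("bagging  MSE:", np.mean((bagged - y) ** 2))
print("stacking MSE:", np.mean((stacked - y) ** 2))
```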

2015
David J. Dittman Taghi M. Khoshgoftaar Amri Napolitano

Ensemble learning (process of combining multiple models into a single decision) is an effective tool for improving the classification performance of inductive models. While ideal for domains like bioinformatics with many challenging datasets, many ensemble methods, such as Bagging and Boosting, do not take into account the high-dimensionality (large number of features per instance) that is comm...

2000
Andreas Buja Werner Stuetzle

Bagging is a device intended for reducing the prediction error of learning algorithms. In its simplest form, bagging draws bootstrap samples from the training sample, applies the learning algorithm to each bootstrap sample, and then averages the resulting prediction rules. Heuristically, the averaging process should reduce the variance component of the prediction error. This is supported by emp...
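
The mechanism described here is simple enough to state directly. A minimal sketch, assuming an unstable base learner (an unpruned decision tree) and synthetic regression data:

```python
# Bagging in its simplest form, as described above: draw bootstrap samples,
# apply the learner to each, and average the resulting prediction rules.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=200)

def bagged_predict(X_train, y_train, X_test, n_bootstrap=50):
    n = len(X_train)
    all_preds = []
    for _ in range(n_bootstrap):
        idx = rng.integers(0, n, size=n)  # bootstrap sample of the training set
        tree = DecisionTreeRegressor().fit(X_train[idx], y_train[idx])
        all_preds.append(tree.predict(X_test))
    return np.mean(all_preds, axis=0)     # average the prediction rules

X_test = np.linspace(-3, 3, 100).reshape(-1, 1)
y_hat = bagged_predict(X, y, X_test)      # variance-reduced predictions
```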

2002
Tomaso Poggio Ryan Rifkin Sayan Mukherjee Alex Rakhlin

Intuitively, we expect that averaging — or bagging — different regressors with low correlation should smooth their behavior and be somewhat similar to regularization. In this note we make this intuition precise. Using an almost classical definition of stability, we prove that a certain form of averaging provides generalization bounds with a rate of convergence of the same order as Tikhonov regu...
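
The intuition can be anchored by a textbook variance identity (this is standard algebra, not the stability bound the paper proves): for K identically distributed regressors with variance sigma^2 and pairwise correlation rho,

```latex
\[
  \operatorname{Var}\!\left(\frac{1}{K}\sum_{k=1}^{K} f_k\right)
  = \frac{\sigma^2}{K} + \frac{K-1}{K}\,\rho\,\sigma^2 .
\]
```

As K grows the first term vanishes and only the correlation term rho sigma^2 remains, which is why low correlation between the regressors is the operative assumption in the averaging intuition.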

2006
Ian Davidson Wei Fan

Bayesian model averaging, also known as the Bayes optimal classifier (BOC), is an ensemble technique used extensively in the statistics literature. However, compared to other ensemble techniques such as bagging and boosting, BOC is less well known and rarely used in data mining. This is partly because model averaging is perceived as inefficient and because bagging and boosting consistently out...
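
A toy sketch of the model-averaging prediction rule, assuming posterior model weights proportional to prior times marginal likelihood; all numbers below are invented for illustration:

```python
# Bayes optimal classifier / model averaging: weight each model's
# class-probability estimate by its posterior P(m|D) ∝ P(D|m) P(m).
import numpy as np

priors       = np.array([1/3, 1/3, 1/3])     # P(m)
likelihoods  = np.array([0.02, 0.05, 0.01])  # P(D | m), toy values
p_y1_given_m = np.array([0.9, 0.6, 0.2])     # P(y=1 | x, m) at a test point x

posteriors = priors * likelihoods
posteriors /= posteriors.sum()               # P(m | D); normalizer cancels

p_y1 = float(np.sum(posteriors * p_y1_given_m))  # averaged predictive probability
print("P(m|D) =", np.round(posteriors, 3), "-> P(y=1|x,D) =", round(p_y1, 3))
```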

Journal: EURASIP J. Audio, Speech and Music Processing, 2011
Christos Dimitrakakis Samy Bengio

We address the question of whether and how boosting and bagging can be used for speech recognition. In order to do this, we compare two different boosting schemes, one at the phoneme level, and one at the utterance level, with a phoneme level bagging scheme. We control for many parameters and other choices, such as the state inference scheme used. In an unbiased experiment, we clearly show that...
