Search results for: boosting and bagging strategies

Number of results: 16,865,484

1997
Harris Drucker

In the regression context, boosting and bagging are techniques to build a committee of regressors that may be superior to a single regressor. We use regression trees as fundamental building blocks in bagging committee machines and boosting committee machines. Performance is analyzed on three non-linear functions and the Boston housing database. In all cases, boosting is at least equivalent, and...
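A minimal sketch of the two committee types described above, assuming scikit-learn regression trees as the base learner and a synthetic non-linear target (not the paper's exact setup; the `estimator` keyword is named `base_estimator` in scikit-learn releases before 1.2):

```python
# Bagging vs. boosting committees of regression trees (illustrative sketch).
from sklearn.datasets import make_friedman1
from sklearn.ensemble import AdaBoostRegressor, BaggingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Synthetic non-linear regression problem standing in for the paper's benchmarks.
X, y = make_friedman1(n_samples=2000, noise=1.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Both committees use the same fundamental building block: a regression tree.
tree = DecisionTreeRegressor(max_depth=6)
bagging = BaggingRegressor(estimator=tree, n_estimators=50, random_state=0)
boosting = AdaBoostRegressor(estimator=tree, n_estimators=50, random_state=0)

for name, committee in [("bagging", bagging), ("boosting", boosting)]:
    committee.fit(X_train, y_train)
    print(name, "MAE:", mean_absolute_error(y_test, committee.predict(X_test)))
```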

1998
Zijian Zheng

Classifier committee learning approaches have demonstrated great success in increasing the prediction accuracy of classifier learning, which is a key technique for data mining. These approaches generate several classifiers to form a committee by repeated application of a single base learning algorithm. The committee members vote to decide the final classification. It has been shown that Boosting and ...
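An illustrative sketch of such a committee, assuming bootstrap resampling to obtain different classifiers from one base learner and a simple majority vote (not the specific algorithms compared in the paper):

```python
# Build a committee by repeatedly applying one base learner, then let the
# members vote on the final classification (illustrative sketch).
import numpy as np
from sklearn.base import clone
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
committee = []
for _ in range(25):
    idx = rng.integers(0, len(X_tr), size=len(X_tr))      # bootstrap resample
    member = clone(DecisionTreeClassifier(random_state=0))
    committee.append(member.fit(X_tr[idx], y_tr[idx]))

votes = np.stack([m.predict(X_te) for m in committee])    # one row per member
majority = (votes.mean(axis=0) >= 0.5).astype(int)        # majority vote (binary labels)
print("committee accuracy:", (majority == y_te).mean())
```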

2013
Jungeun Kwon Keunho Choi Yongmoo Suh

Several rating agencies, such as Standard & Poor's (S&P), Moody's, and Fitch Ratings, evaluate firms' credit ratings. Since the agencies charge substantial fees and their ratings do not always reflect a firm's default risk in a timely manner, it can be helpful for stakeholders if credit ratings can be predicted before the agencies publish them. However, it is not easy to make an accurate predicti...

Journal: EURASIP J. Audio, Speech and Music Processing, 2011
Christos Dimitrakakis Samy Bengio

We address the question of whether and how boosting and bagging can be used for speech recognition. In order to do this, we compare two different boosting schemes, one at the phoneme level, and one at the utterance level, with a phoneme level bagging scheme. We control for many parameters and other choices, such as the state inference scheme used. In an unbiased experiment, we clearly show that...

2006
Ian Davidson Wei Fan

Bayesian model averaging, also known as the Bayes optimal classifier (BOC), is an ensemble technique used extensively in the statistics literature. However, compared to other ensemble techniques such as bagging and boosting, BOC is less known and rarely used in data mining. This is partly because model averaging is perceived as inefficient and because bagging and boosting consistently out...

2004
Yang Liu Elizabeth Shriberg Andreas Stolcke Mary P. Harper

We investigate machine learning techniques for coping with highly skewed class distributions in two spontaneous speech processing tasks. Both tasks, sentence boundary and disfluency detection, provide important structural information for downstream language processing modules. We examine the effect of data set size, task, sampling method (no sampling, downsampling, oversampling, and ensemble sa...
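A small sketch of two of the sampling methods named above (downsampling the majority class and oversampling the minority class) on synthetic skewed data; the class ratio and variable names are assumptions for illustration:

```python
# Rebalance a skewed binary problem by downsampling or oversampling (sketch).
import numpy as np
from sklearn.utils import resample

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (rng.random(1000) < 0.05).astype(int)          # ~5% positives: highly skewed

X_maj, y_maj = X[y == 0], y[y == 0]
X_min, y_min = X[y == 1], y[y == 1]

# Downsampling: shrink the majority class to the minority-class size.
X_down, y_down = resample(X_maj, y_maj, replace=False,
                          n_samples=len(y_min), random_state=0)
X_ds, y_ds = np.vstack([X_down, X_min]), np.hstack([y_down, y_min])

# Oversampling: replicate the minority class up to the majority-class size.
X_up, y_up = resample(X_min, y_min, replace=True,
                      n_samples=len(y_maj), random_state=0)
X_os, y_os = np.vstack([X_maj, X_up]), np.hstack([y_maj, y_up])

print("downsampled counts:", np.bincount(y_ds), "oversampled counts:", np.bincount(y_os))
```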

2012
Sotiris Kotsiantis Dimitris Kanellopoulos

Bagging, boosting and random subspace methods are well-known resampling ensemble methods that generate and combine a diversity of learners using the same learning algorithm for the base regressor. In this work, we built an ensemble of bagging, boosting and random subspace ensembles, with 8 sub-regressors in each one, and then used an averaging methodology for the final prediction. We ...
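A minimal sketch of averaging three such ensembles with 8 sub-regressors each, assuming scikit-learn decision trees as the sub-regressor and approximating the random subspace method via feature subsampling (not necessarily the paper's configuration):

```python
# Average the predictions of bagging, boosting and random-subspace ensembles,
# each built from 8 tree sub-regressors (illustrative sketch).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor, BaggingRegressor
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=1000, n_features=20, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeRegressor(max_depth=6)
ensembles = [
    BaggingRegressor(estimator=tree, n_estimators=8, random_state=0),    # bagging
    AdaBoostRegressor(estimator=tree, n_estimators=8, random_state=0),   # boosting
    BaggingRegressor(estimator=tree, n_estimators=8, bootstrap=False,    # random subspace:
                     max_features=0.5, random_state=0),                  # sample features, not rows
]

# Final prediction: a plain average over the three ensembles' outputs.
preds = np.mean([e.fit(X_tr, y_tr).predict(X_te) for e in ensembles], axis=0)
print("averaged-ensemble MAE:", np.mean(np.abs(preds - y_te)))
```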

2015
David J. Dittman Taghi M. Khoshgoftaar Amri Napolitano

Ensemble learning (the process of combining multiple models into a single decision) is an effective tool for improving the classification performance of inductive models. While ensemble learning is ideal for domains like bioinformatics, with their many challenging datasets, many ensemble methods, such as Bagging and Boosting, do not take into account the high dimensionality (large number of features per instance) that is comm...
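One common way to reconcile an ensemble like Bagging with high-dimensional data is to rank and filter features before building the committee; the sketch below pairs univariate feature selection with a bagged tree ensemble and is illustrative only, not the paper's method:

```python
# Feature selection followed by a bagged ensemble on high-dimensional data (sketch).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Many features, few informative: a stand-in for a bioinformatics-style dataset.
X, y = make_classification(n_samples=200, n_features=2000, n_informative=20,
                           random_state=0)

clf = make_pipeline(
    SelectKBest(score_func=f_classif, k=50),                 # keep the top-50 ranked features
    BaggingClassifier(estimator=DecisionTreeClassifier(),
                      n_estimators=25, random_state=0),
)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```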

1997
Ian H. Witten

Ensembles of decision trees often exhibit greater predictive accuracy than single trees alone. Bagging and boosting are two standard ways of generating and combining multiple trees. Boosting has been empirically determined to be the more effective of the two, and it has recently been proposed that this may be because it produces more diverse trees than bagging. This paper reports empirical finding...
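One simple way to probe the diversity claim is to compare the average pairwise disagreement between member trees of a bagged ensemble and a boosted ensemble; the sketch below is an illustrative measure, not the paper's experimental protocol:

```python
# Compare member-tree diversity (mean pairwise disagreement) for bagging vs. boosting.
from itertools import combinations
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def mean_disagreement(ensemble):
    """Fraction of test points on which two member trees disagree, averaged over pairs."""
    preds = [member.predict(X_te) for member in ensemble.estimators_]
    return np.mean([(preds[i] != preds[j]).mean()
                    for i, j in combinations(range(len(preds)), 2)])

tree = DecisionTreeClassifier(max_depth=3)
for name, ens in [("bagging", BaggingClassifier(estimator=tree, n_estimators=20, random_state=0)),
                  ("boosting", AdaBoostClassifier(estimator=tree, n_estimators=20, random_state=0))]:
    ens.fit(X_tr, y_tr)
    print(name, "mean pairwise disagreement:", round(mean_disagreement(ens), 3))
```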

1998
Zijian Zheng

Boosting and Bagging, as two representative approaches to learning classifier committees, have demonstrated great success, especially for decision tree learning. They repeatedly build different classifiers using a base learning algorithm by changing the distribution of the training set. Sasc, as a different type of committee learning method, can also significantly reduce the error rate of decision t...
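A compact sketch of how the two approaches change the training distribution: bagging draws a uniform bootstrap sample for each member, while boosting re-weights the examples a member misclassifies (an AdaBoost-style update is assumed here purely for illustration):

```python
# Bagging resamples the training set uniformly; boosting re-weights it (sketch).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
n = len(y)
rng = np.random.default_rng(0)

# Bagging: each committee member sees a uniform bootstrap sample.
bootstrap_idx = rng.integers(0, n, size=n)
print("distinct examples in one bootstrap sample:", np.unique(bootstrap_idx).size, "of", n)

# Boosting: weights start uniform and grow on the examples the current
# classifier gets wrong, so the next classifier focuses on them.
w = np.full(n, 1.0 / n)
stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
miss = stump.predict(X) != y
err = np.sum(w[miss])                           # weighted training error
alpha = 0.5 * np.log((1.0 - err) / err)         # the member's vote weight
w *= np.exp(np.where(miss, alpha, -alpha))      # up-weight mistakes, down-weight hits
w /= w.sum()
print("largest boosted weight:", w.max(), "vs. uniform weight:", 1.0 / n)
```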
