Search results for: boosting ensemble learning

Number of results: 645,106

Journal: Journal of Machine Learning Research, 2004
Nitesh V. Chawla, Lawrence O. Hall, Kevin W. Bowyer, W. Philip Kegelmeyer

Bagging and boosting are two popular ensemble methods that typically achieve better accuracy than a single classifier. These techniques have limitations on massive datasets, as the size of the dataset can be a bottleneck. Voting many classifiers built on small subsets of data (“pasting small votes”) is a promising approach for learning from massive datasets, one that can utilize the power of bo...
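The "pasting small votes" idea described in this abstract — training many classifiers on small random subsets of a large dataset and combining them by majority vote — can be illustrated with a minimal sketch. This is not the authors' algorithm; it assumes a toy 1-D dataset and simple threshold classifiers as the base learners.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "massive" dataset: class 1 when x > 0.5.
X = rng.random(10_000)
y = (X > 0.5).astype(int)

def train_stump(Xs, ys):
    """Fit a threshold classifier on a small subset (one 'small vote')."""
    thresholds = np.linspace(0, 1, 21)
    errs = [np.mean((Xs > t).astype(int) != ys) for t in thresholds]
    return thresholds[int(np.argmin(errs))]

# Pasting small votes: each base classifier sees only a tiny random subset,
# so no single learner ever has to touch the whole dataset.
stumps = []
for _ in range(25):
    idx = rng.choice(len(X), size=100, replace=False)  # small subset
    stumps.append(train_stump(X[idx], y[idx]))

def predict(x):
    votes = np.array([(x > t).astype(int) for t in stumps])
    return (votes.mean(axis=0) > 0.5).astype(int)  # simple majority vote

acc = np.mean(predict(X) == y)
```

Because each base learner trains on only 100 of the 10,000 points, memory per learner stays constant, which is the property that makes this style of ensemble attractive for massive datasets.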

2006
Zhi-Gang Fan, Bao-Liang Lu

In this paper, we propose a novel learning method for face detection using discriminative feature selection. The main deficiency of the boosting algorithm for face detection is its long training time. Through statistical learning theory, our discriminative feature selection method can make the training process for face detection much faster than the boosting algorithm without degrading the gene...

2010
Jakkrit TeCho, Cholwich Nattee, Thanaruk Theeramunkong

Boosting-based ensemble learning can improve classification accuracy by combining multiple classification models, each constructed to correct errors made in preceding steps. This paper presents an application of boosting-based ensemble learning with penalty-setting profiles to automatic unknown-word recognition in Thai. Treating a sequential task as a non-sequential problem requ...

2009
Lior Rokach

The idea of ensemble methodology is to build a predictive model by integrating multiple models. It is well-known that ensemble methods can be used for improving prediction performance. In this chapter we provide an overview of ensemble methods in classification tasks. We present all important types of ensemble methods including boosting and bagging. Combining methods and modeling issues such as...

2011
Parinaz Sobhani, Hamid Beigy

Classification of data streams has become an important area of data mining, as the number of applications facing these challenges increases. In this paper, we propose a new ensemble learning method for data stream classification in the presence of concept drift. Our method is capable of detecting changes and adapting to new concepts that appear in the stream. Keywords: data stream classification; concept d...

2009
Dong-Sheng Cao, Qing-Song Xu, Yi-Zeng Liang, Liang-Xiao Zhang, Hong-Dong Li

The idea of boosting is deeply rooted in everyday practice, and it shapes general ways of thinking about chemical problems and building chemical models. In mathematics, boosting is an iterative reweighting procedure that sequentially applies a base learner to reweighted versions of the training data, whose current weights are modified based on how accurat...
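The iterative reweighting procedure this abstract describes is the core of AdaBoost: after each round, the weights of misclassified points are increased so the next base learner focuses on them. Below is a rough sketch of that loop (not the authors' method), assuming a toy 1-D dataset and decision stumps as the base learner.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D data: +1 inside the interval (0.3, 0.7), -1 outside.
# No single threshold separates this, so the ensemble must do the work.
X = rng.random(2000)
y = np.where((X > 0.3) & (X < 0.7), 1, -1)

thresholds = np.linspace(0, 1, 41)

def best_stump(w):
    """Weighted-error-minimizing stump: predicts s * sign(x - t)."""
    best = None
    for t in thresholds:
        for s in (1, -1):
            pred = s * np.where(X > t, 1, -1)
            err = np.sum(w[pred != y])
            if best is None or err < best[0]:
                best = (err, t, s)
    return best

# Boosting as iterative reweighting: upweight the points the
# current round misclassified, then fit the next stump to them.
w = np.full(len(X), 1 / len(X))
stumps = []
for _ in range(30):
    err, t, s = best_stump(w)
    err = max(err, 1e-12)
    alpha = 0.5 * np.log((1 - err) / err)   # learner weight
    pred = s * np.where(X > t, 1, -1)
    w = w * np.exp(-alpha * y * pred)       # mistakes get exp(+alpha)
    w /= w.sum()
    stumps.append((alpha, t, s))

def predict(x):
    score = sum(a * s * np.where(x > t, 1, -1) for a, t, s in stumps)
    return np.sign(score)

acc = np.mean(predict(X) == y)
```

The weight update `w * exp(-alpha * y * pred)` is exactly the "modified based on how accurately the previous learner predicted" step the abstract refers to: correctly classified points shrink in weight, misclassified points grow.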

2004
Yang Liu, Elizabeth Shriberg, Andreas Stolcke, Mary P. Harper

We investigate machine learning techniques for coping with highly skewed class distributions in two spontaneous speech processing tasks. Both tasks, sentence boundary and disfluency detection, provide important structural information for downstream language processing modules. We examine the effect of data set size, task, sampling method (no sampling, downsampling, oversampling, and ensemble sa...

Journal: Journal of Advances in Computer Research
Mohammad Mohammadi, Hamid Parvin, Eshagh Faraji, Sajad Parvin (Department of Computer Engineering, Nourabad Mamasani Branch, Islamic Azad University, Nourabad Mamasani, Iran)

The article suggests an algorithm for a regular classifier ensemble methodology. The proposed methodology is based on possibilistic aggregation to classify samples. The proposed method optimizes an objective function that combines an environment-recognition term, a multi-criteria aggregation term, and a learning term. The optimization aims at learning backgrounds as solid clusters in subspaces of the high-dim...

2004
S. B. Kotsiantis, P. E. Pintelas

Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diverse set of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, i...

2005
Andreas Heß, Rinat Khoussainov, Nicholas Kushmerick

We propose a novel ensemble learning algorithm called Triskel, which has two interesting features. First, Triskel learns an ensemble of classifiers that are biased to have high precision (as opposed to, for example, boosting, where the ensemble members are biased to ignore portions of the instance space). Second, Triskel uses weighted voting like most ensemble methods, but the weights are assig...

Chart: number of search results per year
