Search results for: bagging committee

Number of results: 4107

Journal: :Knowl.-Based Syst. 2012
Gang Wang Jian Ma Lihua Huang Kaiquan Xu

Decision tree (DT) is one of the most popular classification algorithms in data mining and machine learning. However, DT-based credit scoring models often perform worse than other techniques, mainly for two reasons: DT is easily affected by (1) noisy data and (2) redundant attributes, both common in credit scoring. In this study, we ...

Journal: :Neurocomputing 2012
Zongxia Xie Yong Xu Qinghua Hu Pengfei Zhu

Bagging is a simple and effective technique for generating an ensemble of classifiers. However, the original Bagging ensemble contains many redundant base classifiers. We design a pruning approach to Bagging to improve its generalization power. The proposed technique introduces a margin-distribution-based classification loss as the optimization objective and minimizes the loss on trainin...
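
The abstract is truncated, so the exact loss and optimization procedure are not reproduced here. The sketch below only illustrates the general idea, assuming plain bagging over decision trees and a simple greedy pruning rule driven by a hinge-style loss on the voting margin; all names and criteria are illustrative.

# Illustrative sketch: prune a bagged ensemble by greedily keeping members
# that lower a margin-based training loss. Not the authors' exact method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, random_state=0)
sign = np.where(y == 1, 1, -1)                       # labels in {-1, +1}

# Plain bagging: each tree is fit on a bootstrap sample of the training set.
trees = []
for _ in range(50):
    idx = rng.integers(0, len(y), len(y))
    trees.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

def margin_loss(votes, n_members):
    margin = sign * votes / max(1, n_members)        # normalized voting margin
    return np.mean(np.maximum(0.0, 1.0 - margin))    # hinge-style surrogate loss

# Greedy pruning: keep a tree only if it reduces the training margin loss.
votes, kept = np.zeros(len(y)), []
for tree in trees:
    pred = np.where(tree.predict(X) == 1, 1, -1)
    if not kept or margin_loss(votes + pred, len(kept) + 1) < margin_loss(votes, len(kept)):
        votes += pred
        kept.append(tree)
print(f"kept {len(kept)} of {len(trees)} base classifiers")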

Journal: :Journal of biomolecular structure & dynamics 2006
Liuhuan Dong Yuan Yuan Yudong Cai

Classification and prediction of protein domain structural class is an important topic in molecular biology. We introduce Bagging (bootstrap aggregating), one of the bootstrap methods, for classifying and predicting protein structural classes. Through a bootstrap aggregating procedure, Bagging can improve a weak classifier, for instance the random tree method, to a significant s...
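
The paper's protein features and dataset are not available from this snippet, so the following is a minimal sketch of the underlying technique only: bootstrap aggregating over tree classifiers via scikit-learn, with synthetic multi-class data standing in for the structural-class data.

# Minimal sketch of bagging tree classifiers; synthetic data is a stand-in
# for the protein structural-class features used in the paper.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_classes=4, n_informative=8,
                           random_state=0)
single = DecisionTreeClassifier(random_state=0)
bagged = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                           random_state=0)
print("single tree :", cross_val_score(single, X, y, cv=5).mean())
print("bagged trees:", cross_val_score(bagged, X, y, cv=5).mean())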

1998
Zijian Zheng

Classifier committee learning approaches have demonstrated great success in increasing the prediction accuracy of classifier learning, which is a key technique for data mining. It has been shown that Boosting and Bagging, as two representative methods of this type, can significantly decrease the error rate of decision tree learning. Boosting is generally more accurate than Bagging, but the former ...
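
The comparison itself is easy to reproduce in spirit. The snippet below is an illustrative side-by-side run of Boosting and Bagging over decision trees on synthetic data with a little label noise; it is not the paper's experimental setup.

# Illustrative comparison of Boosting and Bagging committees of trees.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, flip_y=0.05, random_state=1)
for name, clf in [
    ("boosting   ", AdaBoostClassifier(n_estimators=100, random_state=1)),
    ("bagging    ", BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                                      random_state=1)),
    ("single tree", DecisionTreeClassifier(random_state=1)),
]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())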

2013
Yimin Tan Xiaojin Zhu

We propose density-ratio bagging (dragging), a semi-supervised extension of the bootstrap aggregation (bagging) method. Additional unlabeled training data are used to calculate a weight for each labeled training point via a density-ratio estimator. The weights are then used to construct a weighted labeled empirical distribution, from which bags of bootstrap samples are drawn. Asymptotically, dragging...
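
A minimal sketch of the weighted-bootstrap idea follows. The density-ratio estimator here is a crude stand-in (a logistic classifier separating unlabeled from labeled points, with its odds rescaled by the sample sizes), not the authors' estimator, and the data are synthetic.

# Hypothetical sketch of dragging-style weighted bootstrap aggregation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X_lab = rng.normal(0.0, 1.0, size=(200, 5))
y_lab = (X_lab[:, 0] > 0).astype(int)
X_unl = rng.normal(0.3, 1.0, size=(2000, 5))          # unlabeled pool

# Stand-in density-ratio weights w(x) ~ p_unlabeled(x) / p_labeled(x).
Z = np.vstack([X_lab, X_unl])
d = np.r_[np.zeros(len(X_lab)), np.ones(len(X_unl))]
p = LogisticRegression(max_iter=1000).fit(Z, d).predict_proba(X_lab)[:, 1]
w = (p / (1 - p)) * (len(X_lab) / len(X_unl))
w /= w.sum()                                          # weighted empirical distribution

# Draw bags from the weighted distribution and aggregate tree votes.
trees = []
for _ in range(50):
    idx = rng.choice(len(X_lab), size=len(X_lab), p=w)
    trees.append(DecisionTreeClassifier().fit(X_lab[idx], y_lab[idx]))
votes = np.mean([t.predict(X_lab) for t in trees], axis=0)
print("training accuracy of the weighted ensemble:", np.mean((votes > 0.5) == y_lab))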

2004
Prem Melville Nishit Shah Lilyana Mihalkova Raymond J. Mooney

One of the potential advantages of multiple classifier systems is an increased robustness to noise and other imperfections in data. Previous experiments on classification noise have shown that bagging is fairly robust but that boosting is quite sensitive. Decorate is a recently introduced ensemble method that constructs diverse committees using artificial data. It has been shown to generally ou...

2006
Anneleen Van Assche Hendrik Blockeel

Bagging is an ensemble method that relies on random resampling of a data set to construct models for the ensemble. When only statistics about the data are available, but no individual examples, the straightforward resampling procedure cannot be implemented. The question is then whether bagging can somehow be simulated. In this paper we propose a method that, instead of computing certain heurist...

2000
Alexey Tsymbal Seppo Puuronen

One approach to classification tasks is to use machine learning techniques to derive classifiers from learning instances. The cooperation of several base classifiers as a decision committee has succeeded in reducing classification error. The main current decision committee learning approaches, boosting and bagging, use resampling of the training set and can be used with different machine le...

2014
Periklis A. Papakonstantinou Jia Xu Zhu Cao

Bagging (Breiman 1996) and its variants are among the most popular methods for aggregating classifiers and regressors. Originally, its analysis assumed that the bootstraps are built from an unlimited, independent source of samples; we therefore call this form of bagging ideal-bagging. In the real world, however, base predictors are trained on data subsampled from a limited number of training samp...
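
The distinction is easy to state on a toy statistic. The sketch below contrasts "ideal" bagging, where every bag is drawn fresh from the data-generating source, with real bagging, where bags are bootstrap resamples of one limited training sample; it is only an illustration, not the paper's analysis.

# Toy contrast between ideal-bagging and bagging from a limited sample.
import numpy as np

rng = np.random.default_rng(0)
stat = lambda x: np.exp(x.mean())      # a nonlinear statistic of the sample
n, bags = 50, 500

sample = rng.normal(size=n)            # the single limited training sample

ideal = np.mean([stat(rng.normal(size=n)) for _ in range(bags)])          # unlimited source
real  = np.mean([stat(rng.choice(sample, size=n)) for _ in range(bags)])  # bootstraps of `sample`
print("ideal-bagging estimate:", ideal)
print("real bagging estimate :", real)
print("target exp(E[X]) = 1.0")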

2003
Song Xi Chen Peter Hall

Bagging an estimator approximately doubles its bias through the impact of bagging on quadratic terms in expansions of the estimator. This difficulty can be alleviated by bagging a suitably bias-corrected estimator, however. In these and other circumstances, what is the overall impact of bagging and/or bias correction, and how can it be characterised? We answer these questions in the case of gen...
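
A heuristic LaTeX sketch of why the quadratic term roughly doubles, written for the special case of a smooth statistic of a sample mean (an illustration in the spirit of the abstract, not the paper's general result):

\[
\hat\theta = g(\bar X) \approx g(\mu) + g'(\mu)(\bar X - \mu) + \tfrac12 g''(\mu)(\bar X - \mu)^2,
\qquad
\mathbb{E}[\hat\theta] - \theta \approx \frac{g''(\mu)\,\sigma^2}{2n}.
\]
The bagged estimator averages $g(\bar X^*)$ over bootstrap resamples. Since $\mathbb{E}^*[(\bar X^* - \bar X)^2] \approx \sigma^2/n$, the bootstrap average picks up a second quadratic contribution of the same order:
\[
\mathbb{E}[\hat\theta_{\mathrm{bag}}] - \theta \approx \frac{g''(\mu)\,\sigma^2}{2n} + \frac{g''(\mu)\,\sigma^2}{2n} = \frac{g''(\mu)\,\sigma^2}{n},
\]
i.e. roughly twice the bias of $\hat\theta$, which is the difficulty that bagging a bias-corrected estimator is meant to alleviate.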
