Search results for: bagging model
Number of results: 2,105,681
Bagging and boosting are two general techniques for building predictors based on small samples from a dataset. We show that boosting can be parallelized, and then present performance results for parallelized bagging and boosting using OC1 decision trees and two standard datasets. The main results are that sample sizes limit achievable accuracy, regardless of computational time spent; that paral...
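Because the bootstrap replicates in bagging are mutually independent, the base learners can be trained concurrently. The sketch below illustrates that idea with scikit-learn CART trees, joblib workers, and a small bootstrap size on a standard toy dataset; the OC1 trees, datasets, and parallel framework of the study are not reproduced, so every name and parameter here is an illustrative assumption.

```python
# Illustrative sketch of parallelized bagging: bootstrap replicates are
# independent, so each base tree can be trained on a separate worker.
# scikit-learn CART trees stand in for OC1 (assumption).
import numpy as np
from joblib import Parallel, delayed
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

def fit_one_tree(X, y, seed, sample_size):
    """Draw one bootstrap sample of `sample_size` points and fit a tree."""
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(X), size=sample_size)  # sample with replacement
    return DecisionTreeClassifier(random_state=seed).fit(X[idx], y[idx])

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Train 50 trees in parallel; the small sample_size caps achievable accuracy,
# echoing the abstract's point about small samples.
trees = Parallel(n_jobs=-1)(
    delayed(fit_one_tree)(X_tr, y_tr, seed, sample_size=100) for seed in range(50)
)

# Majority vote over the ensemble.
votes = np.stack([t.predict(X_te) for t in trees])
y_hat = (votes.mean(axis=0) >= 0.5).astype(int)
print("bagged accuracy:", (y_hat == y_te).mean())
```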
Internal egg hatching in Caenorhabditis elegans, "worm bagging," is induced by exposure to bacteria. This study demonstrates that the determination of worm bagging frequency allows for advanced insight into the degree of bacterial pathogenicity and is highly predictive of the survival of worm populations. Therefore, worm bagging frequency can be regarded as a reliable population-wide stress rep...
Ensemble models—built by methods such as bagging, boosting, and Bayesian model averaging—appear dauntingly complex, yet tend to strongly outperform their component models on new data. Doesn’t this violate “Occam’s razor”—the widespread belief that “the simpler of competing alternatives is preferred”? We argue no: if complexity is measured by function rather than form—for example, according to g...
The motivation for this study was to learn to predict forest fires in Slovenia using different data mining techniques. We used predictive models based on data from a GIS (geographical information system), the weather prediction model Aladin, and MODIS satellite data. We examined three different datasets: one only for the Kras region, one for the whole Primorska region, and one for continental Sloveni...
Bagging is a simple and effective technique for generating an ensemble of classifiers. We find, however, that the original Bagging ensemble contains many redundant base classifiers. We therefore design a pruning approach to bagging to improve its generalization power. The proposed technique introduces the margin-distribution-based classification loss as the optimization objective and minimizes the loss on trainin...
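The abstract is cut off before the exact objective is stated; the following is only a plausible sketch of margin-based ensemble pruning, using greedy forward selection under an assumed hinge loss on the voting margin rather than the authors' margin-distribution loss.

```python
# Hypothetical sketch of pruning a bagged ensemble with a margin-based loss.
# Greedy forward selection; the hinge-style loss is an assumed stand-in, not
# the objective proposed in the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
y_pm = 2 * y - 1  # map labels to {-1, +1} so margins are signed

bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                        random_state=0).fit(X, y)
preds = np.array([2 * t.predict(X) - 1 for t in bag.estimators_])  # (T, n)

def margin_loss(subset):
    """Average hinge loss of the voting margin y * f(x) for a subset of trees."""
    f = preds[subset].mean(axis=0)
    return np.maximum(0.0, 1.0 - y_pm * f).mean()

selected, remaining = [], list(range(len(bag.estimators_)))
for _ in range(10):  # keep only 10 of the 50 base classifiers
    best = min(remaining, key=lambda t: margin_loss(selected + [t]))
    selected.append(best)
    remaining.remove(best)

print("kept trees:", selected)
print("pruned-ensemble training loss:", margin_loss(selected))
```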
Classification and prediction of protein domain structural class is one of the important topics in molecular biology. We introduce Bagging (bootstrap aggregating), one of the bootstrap methods, for classifying and predicting protein structural classes. Through a bootstrap aggregating procedure, Bagging can improve a weak classifier, for instance the random tree method, to a significant s...
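As a rough illustration of the underlying technique (not the paper's random-tree base learner or protein structural-class data), the sketch below bags an unstable decision tree on synthetic data and compares cross-validated accuracy with and without aggregation.

```python
# Minimal sketch of Bagging (bootstrap aggregating) stabilising an unstable
# base learner. An unpruned CART tree and synthetic data stand in for the
# random-tree classifier and the protein dataset of the paper (assumptions).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_informative=10, random_state=0)

base = DecisionTreeClassifier(random_state=0)            # high-variance base learner
bagged = BaggingClassifier(base, n_estimators=100, random_state=0)

print("single tree :", cross_val_score(base, X, y, cv=5).mean().round(3))
print("bagged trees:", cross_val_score(bagged, X, y, cv=5).mean().round(3))
```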
Classifier committee learning approaches have demonstrated great success in increasing the prediction accuracy of classifier learning, which is a key technique for data mining. It has been shown that Boosting and Bagging, as two representative methods of this type, can significantly decrease the error rate of decision tree learning. Boosting is generally more accurate than Bagging, but the former ...
We propose density-ratio bagging (dragging), a semi-supervised extension of the bootstrap aggregation (bagging) method. Additional unlabeled training data are used to calculate the weight on each labeled training point by a density-ratio estimator. The weight is then used to construct a weighted labeled empirical distribution, from which bags of bootstrap samples are drawn. Asymptotically, dragging...
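A rough sketch of the weighted-bootstrap idea follows. The density ratio is estimated here with a probabilistic classifier that discriminates labeled from unlabeled points, which is a common estimator but an assumption relative to the abstract; the datasets are synthetic placeholders.

```python
# Rough sketch of density-ratio bagging ("dragging"): unlabeled data reweight
# the labeled empirical distribution before bootstrap bags are drawn.
# The logistic-regression density-ratio estimator and all dataset details are
# illustrative assumptions, not the estimator specified in the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X_lab, y_lab = make_classification(n_samples=200, random_state=1)
X_unl, _ = make_classification(n_samples=2000, random_state=2)  # unlabeled pool

# Density-ratio estimate r(x) ~ p_unlabeled(x) / p_labeled(x) via a
# probabilistic classifier that discriminates the two samples.
Z = np.vstack([X_lab, X_unl])
s = np.r_[np.zeros(len(X_lab)), np.ones(len(X_unl))]        # 1 = unlabeled
clf = LogisticRegression(max_iter=1000).fit(Z, s)
p = clf.predict_proba(X_lab)[:, 1]
ratio = (p / (1 - p)) * (len(X_lab) / len(X_unl))            # correct for sample sizes
weights = ratio / ratio.sum()                                # weighted empirical dist.

# Draw bootstrap bags from the weighted labeled distribution and fit trees.
ensemble = []
for seed in range(25):
    idx = rng.choice(len(X_lab), size=len(X_lab), replace=True, p=weights)
    ensemble.append(DecisionTreeClassifier(random_state=seed)
                    .fit(X_lab[idx], y_lab[idx]))

votes = np.stack([t.predict(X_lab) for t in ensemble])
print("training accuracy of the weighted-bag ensemble:",
      ((votes.mean(axis=0) >= 0.5).astype(int) == y_lab).mean())
```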
One of the potential advantages of multiple classifier systems is an increased robustness to noise and other imperfections in data. Previous experiments on classification noise have shown that bagging is fairly robust but that boosting is quite sensitive. Decorate is a recently introduced ensemble method that constructs diverse committees using artificial data. It has been shown to generally ou...
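A small sketch of this kind of classification-noise experiment is given below: a fraction of the training labels is flipped and the degradation of bagging and boosting is compared. Decorate is not available in scikit-learn and is omitted; the dataset, noise rates, and ensemble sizes are illustrative assumptions.

```python
# Small sketch of a classification-noise experiment: flip a fraction of the
# training labels and compare how bagging and boosting accuracies degrade.
# Dataset, noise rates, and ensemble sizes are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
for noise in (0.0, 0.1, 0.2):                    # fraction of flipped labels
    y_noisy = y_tr.copy()
    flip = rng.random(len(y_noisy)) < noise
    y_noisy[flip] = 1 - y_noisy[flip]

    bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                            random_state=0).fit(X_tr, y_noisy)
    boost = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_tr, y_noisy)
    print(f"noise={noise:.1f}  bagging={bag.score(X_te, y_te):.3f}  "
          f"boosting={boost.score(X_te, y_te):.3f}")
```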
Bagging is an ensemble method that relies on random resampling of a data set to construct models for the ensemble. When only statistics about the data are available, but no individual examples, the straightforward resampling procedure cannot be implemented. The question is then whether bagging can somehow be simulated. In this paper we propose a method that, instead of computing certain heurist...