Search results for: boosting ensemble learning
Number of results: 645106
The input/output hidden Markov model (IOHMM) has proven effective for sequential data processing via supervised learning. However, several difficulties, e.g. model selection, unexpected local optima, and high computational complexity, hinder an IOHMM from yielding satisfactory performance in sequence classification. Unlike previous efforts, this paper presents an ensembl...
A combination of classification rules (classifiers) is known as an ensemble, and in general it is more accurate than the individual classifiers used to build it. Two popular methods to construct an ensemble are Bagging (bootstrap aggregating), introduced by Breiman [4], and Boosting (Freund and Schapire [11]). Both methods rely on resampling techniques to obtain different training sets for each...
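As a concrete illustration of the resampling idea, the minimal sketch below builds a bagged ensemble of decision trees by drawing bootstrap samples of the training set and combining the trees by majority vote. The dataset, base learner, and ensemble size are placeholders for illustration only, not the exact procedure of Breiman [4] or Freund and Schapire [11].

```python
# Minimal bagging sketch: each base classifier is trained on a bootstrap
# resample of the training set; the ensemble predicts by majority vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

n_models = 25
models = []
for _ in range(n_models):
    # bootstrap resample: draw n indices with replacement
    idx = rng.integers(0, len(X), size=len(X))
    models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

# majority vote over the individual predictions (binary labels 0/1)
votes = np.stack([m.predict(X) for m in models])          # shape: (n_models, n_samples)
ensemble_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print("training accuracy of the vote:", (ensemble_pred == y).mean())
```

Boosting differs in that the resampling (or reweighting) of the training set is adaptive, concentrating on examples the previous models misclassified, rather than uniform as in the bootstrap above.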
Learning a function of many arguments is viewed from the perspective of high-dimensional numerical quadrature. It is shown that many of the popular ensemble learning procedures can be cast in this framework. In particular randomized methods, including bagging and random forests, are seen to correspond to random Monte Carlo integration methods each based on particular importance sampling strate...
Classifier fusion strategies have shown great potential to enhance the performance of pattern recognition systems. Researchers in classifier combination agree that the major factor in producing better accuracy is diversity in the classifier team. Resampling-based approaches like bagging, boosting and random subspace generate multiple models by training a single learn...
The knowledge discovery process encounters difficulties in analyzing large amounts of data. Theoretical problems related to high-dimensional spaces then appear and degrade the predictive capacity of algorithms. In this paper, we propose a new methodology to obtain a better representation and prediction of huge datasets. For that purpose, an ensemble approach is used to overcome probl...
Credit scoring is an important financial activity. Both statistical techniques and Artificial Intelligence (AI) techniques have been explored for this topic. But different techniques have different advantages and disadvantages on different datasets. Recent studies draw no consistent conclusion that one technique is superior to the other, while they suggest combining multiple classifiers,...
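A minimal sketch of such a combination is shown below: a statistical model (logistic regression) and an AI-style learner (a random forest) are fused by soft voting over their predicted probabilities. The dataset and the particular model choices are assumptions for illustration, not those of the study above.

```python
# Combining a statistical model with a machine-learning model via soft voting.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)  # placeholder binary scoring task

combined = VotingClassifier(
    estimators=[
        ("logit", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
        ("forest", RandomForestClassifier(n_estimators=100, random_state=0)),
    ],
    voting="soft",  # average predicted class probabilities across models
)
print("combined CV accuracy:", cross_val_score(combined, X, y, cv=5).mean())
```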
In this paper, we propose to use a classifier ensemble (CE) as a method to enhance the robustness of machine learning (ML) kernels in the presence of hardware errors. Different ensemble methods (Bagging and AdaBoost) are explored with a decision tree (C4.5) and an artificial neural network (ANN) as base classifiers. Simulation results show that ANN is inherently tolerant to hardware errors with up to 10% h...
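The two ensemble methods named in this abstract can be sketched with scikit-learn as follows. This is only an assumed setup: scikit-learn provides CART rather than C4.5, shallow trees stand in for the ANN base learner (scikit-learn's AdaBoost needs a base estimator that supports sample weights), and the paper's hardware-error injection is not modeled.

```python
# Sketch: building Bagging and AdaBoost ensembles over tree base classifiers.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)  # placeholder dataset

# Bagging: full-depth trees on bootstrap samples, combined by voting.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)
# AdaBoost: depth-1 "stumps" trained sequentially on reweighted data.
boosting = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1), n_estimators=50, random_state=0)

for name, model in [("bagging", bagging), ("adaboost", boosting)]:
    print(name, cross_val_score(model, X, y, cv=5).mean())
```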
In the real world there are many examples where the synergetic cooperation of multiple entities performs better than a single one. The same fundamental idea can be found in ensemble learning methods, which have the ability to improve classification accuracy. Each classifier has a specific view of the problem domain and can produce a different classification for the same observed sample. Therefore many met...
Previously, we have proposed to use complementary complexity measures discovered by boosting-like ensemble learning for the enhancement of quantitative indicators dealing with necessarily short physiological time series. We have confirmed the robustness of such multi-complexity measures for heart rate variability analysis, with emphasis on the detection of emerging and intermittent cardiac abnormali...
In this paper, the application of multiple Elman neural networks to time series data regression problems is studied. An ensemble of Elman networks is formed by boosting to enhance the performance of the individual networks. A modified version of the AdaBoost algorithm is employed to integrate the predictions from multiple networks. Two benchmark time series data sets, i.e., the Sunspot and Box-...
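A rough sense of boosting neural-network regressors for one-step-ahead time-series prediction can be given with scikit-learn's AdaBoost.R2 regressor. Elman (recurrent) networks and the paper's modified AdaBoost are not available there, so a feed-forward MLP over a sliding window and a synthetic series are assumed as stand-ins; the Sunspot and Box-Jenkins benchmarks are not reproduced here.

```python
# Sketch: AdaBoost.R2 over MLP regressors for one-step-ahead prediction.
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(500)
series = np.sin(0.15 * t) + 0.1 * rng.standard_normal(len(t))  # synthetic series

# Build (sliding window -> next value) regression pairs.
window = 8
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]

base = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
ensemble = AdaBoostRegressor(base, n_estimators=10, random_state=0)
ensemble.fit(X[:400], y[:400])          # train on the first 400 windows
print("test R^2:", ensemble.score(X[400:], y[400:]))
```

AdaBoost.R2 fits each successive regressor on a weighted bootstrap of the training windows, so the base network itself does not need to support sample weights.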