Search results for: weak classifiers
Number of results: 165,027
Abstract: In this paper we propose a Bayesian classification rule in a context where the training sample is incomplete. The input X is observable, but the output Y is not, and we use only the results of weak classifiers in place of Y. Our method is motivated by epidemiological and bioinformatics problems. We prove the consistency of our method and illustrate its efficiency using simulation...
Finding disease markers (classifiers) from gene expression data by machine learning algorithms carries a high risk of overfitting the data, due to the abundance of attributes (simultaneously measured gene expression values) and the shortage of available examples (observations). To avoid this pitfall and achieve predictor robustness, state-of-the-art approaches construct complex classifiers ...
In this paper we propose a novel multiclass classifier called the probabilistic linear machine (PLM) which overcomes the low-entropy problem of exponential-based classifiers. Although PLMs are linear classifiers, we use a careful design of the parameters matched with weak requirements over the features to output a true probability distribution over labels given an input instance. We cast the di...
Much current research investigates combining classifiers to increase classification accuracy. We show, by means of an enumerative example, how combining classifiers can lead to much greater or lesser accuracy than each individual classifier. Measures of diversity among the classifiers taken from the literature are shown to exhibit only a weak relationship with majority vote accuracy. ...
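The effect described in this abstract can be reproduced with a toy construction (this illustration is not taken from the paper itself): three classifiers, each 60% accurate on five test cases, whose combination by majority vote is either much better or much worse than any individual, depending on whether their errors are spread out or coincide.

```python
# Toy illustration: majority vote over three classifiers, each individually
# 60% accurate, can reach 80% accuracy (diverse errors) or drop to 40%
# accuracy (correlated errors). Labels are in {-1, +1}.
def majority_vote_accuracy(outputs, truth):
    correct = 0
    for preds, t in zip(zip(*outputs), truth):
        vote = 1 if sum(preds) > 0 else -1   # majority of {-1, +1} votes
        correct += (vote == t)
    return correct / len(truth)

truth = [1, 1, 1, 1, 1]                       # five test cases, all label +1

# Each classifier makes exactly 2 errors (60% accurate), errors spread out:
diverse = [[-1, -1,  1,  1,  1],              # errors on cases 1, 2
           [ 1,  1, -1, -1,  1],              # errors on cases 3, 4
           [ 1,  1,  1, -1, -1]]              # errors on cases 4, 5

# Same individual accuracy, but every error is shared by two classifiers:
correlated = [[-1, -1,  1,  1,  1],           # errors on cases 1, 2
              [ 1, -1, -1,  1,  1],           # errors on cases 2, 3
              [-1,  1, -1,  1,  1]]           # errors on cases 1, 3
```

Here `majority_vote_accuracy(diverse, truth)` gives 0.8 while `majority_vote_accuracy(correlated, truth)` gives 0.4, bracketing the 0.6 accuracy of every individual classifier, in line with the abstract's claim.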
Boosting combines weak classifiers to form highly accurate predictors. Although the case of binary classification is well understood, in the multiclass setting the "correct" requirements on the weak classifier and the notion of the most efficient boosting algorithm are missing. In this paper, we create a broad and general framework, within which we make precise and identify the optimal requir...
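The well-understood binary case this abstract refers to is the classic AdaBoost scheme, which can be sketched in a few lines; the code below is a self-contained illustration using one-feature threshold "stumps" as weak classifiers, not the framework proposed in the paper.

```python
# Minimal binary AdaBoost sketch: weak classifiers are decision stumps
# (a single feature compared against a threshold); labels are in {-1, +1}.
import math

def train_adaboost(X, y, n_rounds=10):
    n = len(X)
    w = [1.0 / n] * n                  # uniform example weights to start
    ensemble = []                      # (alpha, feature, threshold, polarity)
    for _ in range(n_rounds):
        best = None
        # pick the stump with the lowest weighted error
        for f in range(len(X[0])):
            for t in sorted({x[f] for x in X}):
                for pol in (1, -1):
                    preds = [pol if x[f] > t else -pol for x in X]
                    err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                    if best is None or err < best[0]:
                        best = (err, f, t, pol, preds)
        err, f, t, pol, preds = best
        err = max(err, 1e-10)          # guard against division by zero
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, f, t, pol))
        # upweight misclassified examples, then renormalize
        w = [wi * math.exp(-alpha * yi * p) for wi, yi, p in zip(w, y, preds)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * (pol if x[f] > t else -pol) for a, f, t, pol in ensemble)
    return 1 if score >= 0 else -1
```

Each round fits a stump to the reweighted data and gives it a weight `alpha` that grows as its weighted error shrinks; the final prediction is the sign of the weighted vote. Generalizing this recipe to more than two classes, as the abstract notes, is where the "correct" weak-learner requirement becomes subtle.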
A practical and useful notion of weak dependence between many classifiers constructed with the same training data is introduced. It is shown that when (a) this weak dependence is rather low, and (b) the expected margins are large, exponential bounds on the true error rates can be achieved. Empirical results with randomized trees, and trees constructed via boosting and adaptive bagging, show tha...
This paper presents an approach with ensemble classifiers using unsupervised data selection for speaker recognition. Ensemble learning is a type of machine learning that combines several weak learners to achieve better performance than a single learner. Based on its acoustic characteristics, the speech utterance is divided into several subsets using unsupervised data select...