Search results for: boosting ensemble learning

Number of results: 645106

2002
Vicent Estruch, César Ferri, José Hernández-Orallo, M. José Ramírez-Quintana

Decision tree learning is a machine learning technique that allows us to generate accurate and comprehensible models. Accuracy can be improved by ensemble methods which combine the predictions of a set of different trees. However, a large amount of resources is necessary to generate the ensemble. In this paper, we introduce a new ensemble method that minimises the usage of resources by sharing ...
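As a generic illustration of the idea this abstract builds on (and only that; it is not the resource-sharing method the paper introduces), the following sketch trains several decision trees on bootstrap samples and combines them by majority vote. The scikit-learn calls and the synthetic data are assumptions made for the example.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Train each tree on a bootstrap sample of the training data.
rng = np.random.default_rng(0)
trees = []
for _ in range(25):
    idx = rng.integers(0, len(X_tr), size=len(X_tr))
    trees.append(DecisionTreeClassifier(max_depth=5).fit(X_tr[idx], y_tr[idx]))

# Combine the trees by majority vote over their predictions.
votes = np.stack([t.predict(X_te) for t in trees])
y_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print("ensemble accuracy:", (y_pred == y_te).mean())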

2008
Gopala Rao

A neural network ensemble is a learning paradigm in which a finite collection of neural networks is trained for the same task. The generalization ability of neural networks can be significantly improved by ensembling, i.e., training many neural networks and then combining their predictions. ANN ensemble techniques have become very popular amongst neural network practitioners in a variety of ANN application doma...
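A minimal sketch of the "train many networks, then combine their predictions" recipe described above, assuming scikit-learn MLPs on synthetic data rather than any particular architecture from the paper:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

# Train several small networks that differ only in their random initialisation.
nets = [MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=s).fit(X_tr, y_tr)
        for s in range(5)]

# Combine the ensemble members by averaging their predicted class probabilities.
proba = np.mean([n.predict_proba(X_te) for n in nets], axis=0)
print("ensemble accuracy:", (proba.argmax(axis=1) == y_te).mean())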

1997
Steve Waterhouse, Gary Cook

In this paper we investigate a number of ensemble methods for improving the performance of phoneme classification for use in a speech recognition system. We discuss boosting and mixtures of experts, both in isolation and in combination. We present results on an isolated word database. The results show that principled ensemble methods such as boosting and mixtures provide superior performance to ...
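For readers who want to try boosting itself, a minimal sketch with scikit-learn's AdaBoost over decision stumps follows; the synthetic data merely stands in for the phoneme features used in the paper, and the mixtures-of-experts component is not reproduced here.

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=800, n_features=30, random_state=2)
# The default base learner of AdaBoostClassifier is a depth-1 decision tree (a stump).
boosted = AdaBoostClassifier(n_estimators=100, random_state=2)
print("boosted-stump CV accuracy:", cross_val_score(boosted, X, y, cv=5).mean())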

2006
Masato OKADA

We propose mutual learning with a latent teacher within the framework of on-line learning, and analyze its dynamical behavior using the methods of statistical mechanics. The proposed model consists of two learning steps: two students first learn independently from a teacher, and then the students learn from each other through mutual learning. The teacher is not used during the mutual learning, so...
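A toy numerical sketch of the two-step procedure, with linear students and a linear teacher; this is only an illustrative simulation under assumed settings (dimension, learning rate, number of updates), not the statistical-mechanics analysis carried out in the paper.

import numpy as np

rng = np.random.default_rng(0)
d, lr = 50, 0.05
teacher = rng.normal(size=d)
students = [rng.normal(size=d), rng.normal(size=d)]

# Step 1: each student independently learns from the teacher with on-line gradient steps.
for _ in range(200):
    x = rng.normal(size=d) / np.sqrt(d)
    for s in students:
        s += lr * (teacher @ x - s @ x) * x

# Step 2: mutual learning -- the teacher is no longer used; each student moves
# toward the other's output on fresh inputs.
for _ in range(2000):
    x = rng.normal(size=d) / np.sqrt(d)
    y0, y1 = students[0] @ x, students[1] @ x
    students[0] += lr * (y1 - y0) * x
    students[1] += lr * (y0 - y1) * x

for i, s in enumerate(students):
    overlap = s @ teacher / (np.linalg.norm(s) * np.linalg.norm(teacher))
    print(f"student {i} overlap with teacher: {overlap:.3f}")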

2007
Ivor W. Tsang, James T. Kwok

The training of support vector machines (SVM) involves a quadratic programming problem, which is often optimized by a complicated numerical solver. In this paper, we propose a much simpler approach based on multiplicative updates. This idea was first explored in [Cristianini et al., 1999], but its convergence is sensitive to a learning rate that has to be fixed manually. Moreover, the update ru...
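For reference, the quadratic program in question is the standard soft-margin SVM dual (written here in common notation; the multiplicative-update scheme of the paper is not reproduced):

\[
\max_{\alpha}\; \sum_{i=1}^{n} \alpha_i
  - \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j\, y_i y_j\, K(x_i, x_j)
\quad \text{s.t.} \quad \sum_{i=1}^{n} \alpha_i y_i = 0, \qquad 0 \le \alpha_i \le C .
\]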

Journal: Sri Lanka Journal of Social Sciences and Humanities, 2023

It is critical for physicians to correctly classify patients during a pandemic and determine who deserves minimal health assistance. Machine learning methods have been shown to reliably forecast the severity of COVID-19 disease. Previous research has often tested different machine learning algorithms and evaluated their performance under different methods; it may be necessary to try several combinations to discover the optimal predictio...
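A generic sketch of the "try several combinations and compare" step described above, assuming scikit-learn models and synthetic data in place of the clinical features used in the study:

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=25, random_state=3)
candidates = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=3),
    "gradient boosting": GradientBoostingClassifier(random_state=3),
}
# Compare the candidates by 5-fold cross-validated accuracy.
for name, model in candidates.items():
    print(name, cross_val_score(model, X, y, cv=5).mean())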

2016
Boyu Wang, Joelle Pineau

While multitask learning has been extensively studied, most existing methods rely on linear models (e.g. linear regression, logistic regression), which may fail in dealing with more general (nonlinear) problems. In this paper, we present a new approach that combines dictionary learning with gradient boosting to achieve multitask learning with general (nonlinear) basis functions. Specifically, f...
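As a hedged point of reference only, the sketch below fits an independent gradient-boosting model per task; it is a naive baseline, not the shared-dictionary multitask method proposed in the paper, and the tasks here are synthetic.

from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Three tasks, each with its own boosted model and no sharing between tasks.
for task in range(3):
    X, y = make_regression(n_samples=300, n_features=15, noise=5.0, random_state=task)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=task)
    model = GradientBoostingRegressor(random_state=task).fit(X_tr, y_tr)
    print(f"task {task} R^2: {model.score(X_te, y_te):.3f}")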

2000
Thomas G. Dietterich

Ensemble methods are learning algorithms that construct a set of classifiers and then classify new data points by taking a weighted vote of their predictions. The original ensemble method is Bayesian averaging, but more recent algorithms include error-correcting output coding, bagging, and boosting. This paper reviews these methods and explains why ensembles can often perform better than any single ...
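The weighted vote mentioned above can be made concrete in a few lines; the weights (e.g. validation accuracies) and the toy predictions below are assumptions for the illustration.

import numpy as np

def weighted_vote(predictions, weights, n_classes):
    """predictions: (n_classifiers, n_samples) integer labels; weights: (n_classifiers,)."""
    n_samples = predictions.shape[1]
    totals = np.zeros((n_samples, n_classes))
    for preds, w in zip(predictions, weights):
        totals[np.arange(n_samples), preds] += w  # add each classifier's weight to its chosen class
    return totals.argmax(axis=1)

# Three classifiers voting on four samples, weighted 0.9 / 0.6 / 0.55.
preds = np.array([[0, 1, 1, 2],
                  [0, 1, 0, 2],
                  [1, 0, 1, 0]])
print(weighted_vote(preds, np.array([0.9, 0.6, 0.55]), n_classes=3))  # -> [0 1 1 2]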

1992
Harris Drucker, Robert E. Schapire, Patrice Y. Simard

A boosting algorithm converts a learning machine with error rate less than 50% to one with an arbitrarily low error rate. However, the algorithm discussed here depends on having a large supply of independent training samples. We show how to circumvent this problem and generate an ensemble of learning machines whose performance in optical ...
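To see why an error rate below 50% is enough in principle, consider three classifiers whose errors are independent and each at most ε < 1/2; their majority vote errs only when at least two of them err, so its error is

\[
3\varepsilon^{2}(1-\varepsilon) + \varepsilon^{3} \;=\; 3\varepsilon^{2} - 2\varepsilon^{3} \;<\; \varepsilon
\quad \text{for } 0 < \varepsilon < \tfrac{1}{2},
\]

and applying such a combination recursively drives the error arbitrarily low. The independence assumption is an idealisation used only for this illustration; the boosting algorithm discussed in the paper instead obtains its guarantee by carefully constructing the distributions on which each machine is trained.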

[Chart: number of search results per year]