Search results for: experts mixture

Number of results: 160,772

Journal: Journal of Statistical Distributions and Applications, 2021

Mixture of experts (MoE) models are widely applied to conditional probability density estimation problems. We demonstrate the richness of the class of MoE models by proving denseness results in Lebesgue spaces, when the input and output variables are both compactly supported. We further prove an almost uniform convergence result when the input is univariate. Auxiliary lemmas are proved regarding the soft-max gating function class, thei...
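For context, the MoE conditional density at issue here has the standard soft-max-gated form (generic notation; the number of experts K, the gate parameters a_k, b_k, and the expert densities f_k are conventional symbols, not necessarily the paper's own):

    p(y \mid x) = \sum_{k=1}^{K} g_k(x)\, f_k(y \mid x),
    \qquad
    g_k(x) = \frac{\exp(a_k^\top x + b_k)}{\sum_{j=1}^{K} \exp(a_j^\top x + b_j)}

The denseness results concern how well mixtures of this form can approximate arbitrary conditional densities in the stated Lebesgue-space sense as K grows.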

2007
Ahmed Rida, Abderrahim Labbi, Christian Pellegrini

This paper is concerned with an important issue in Statistics and Artificial Intelligence, namely problem decomposition and the combination of experts (or predictors). Decomposition methods usually adopt a divide-and-conquer strategy that decomposes the initial problem into simpler sub-problems. The global expert is then obtained from some combination of the local experts. In the case of hard decompo...
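A minimal sketch of the hard-decomposition idea under illustrative assumptions (the k-means partition, linear local experts, and synthetic data below are stand-ins, not the paper's method):

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(500, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)

    # Hard decomposition: disjoint regions of the input space, one local expert each.
    partition = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
    experts = {k: LinearRegression().fit(X[partition.labels_ == k],
                                         y[partition.labels_ == k])
               for k in range(4)}

    def global_expert(X_new):
        # The "global expert" combines the local experts by routing each
        # point to the expert that owns its region.
        regions = partition.predict(X_new)
        return np.array([experts[k].predict(x[None, :])[0]
                         for k, x in zip(regions, X_new)])

    print(global_expert(np.array([[0.5], [-2.0]])))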

2005
Zhenchun Lei, Yingchun Yang, Zhaohui Wu

In this paper, a mixture of support vector machines is proposed and applied to text-independent speaker recognition. The mixture-of-experts architecture is used and is implemented via a divide-and-conquer approach. The purpose of adopting this idea is to handle large-scale speech data and improve speaker recognition performance. The principle is to train several parallel SVMs on the subsets...
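A rough sketch of the divide-and-conquer mixture of SVMs under stated assumptions (the k-means split, random stand-in features, and nearest-subset routing are illustrative; the paper's actual partitioning of the speech data is not shown):

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    X = rng.standard_normal((1000, 20))   # stand-in for per-utterance speech features
    y = rng.integers(0, 2, size=1000)     # stand-in speaker labels

    # Divide: split the large training set into subsets.
    splitter = KMeans(n_clusters=5, n_init=10, random_state=1).fit(X)
    # Conquer: one SVM per subset; the fits are independent, so they can run in parallel.
    svms = [SVC(kernel="rbf").fit(X[splitter.labels_ == k], y[splitter.labels_ == k])
            for k in range(5)]

    def recognize(X_new):
        # Route each utterance to the SVM trained on its nearest subset.
        nearest = splitter.predict(X_new)
        return np.array([svms[k].predict(x[None, :])[0]
                         for k, x in zip(nearest, X_new)])

    print(recognize(rng.standard_normal((3, 20))))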

Journal: CoRR, 2015
Faicel Chamroukhi

Mixture of Experts (MoE) is a popular framework for modeling heterogeneity in data for regression, classification, and clustering. For continuous data, which we consider here in the context of regression and cluster analysis, MoE models usually use normal experts, that is, expert components following the Gaussian distribution. However, for a set of data containing a group or groups of observati...
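To make the "normal experts" setup concrete, here is a minimal density evaluation for a two-expert soft-max-gated MoE (the parameter values and two-expert setup are illustrative assumptions; the paper's proposed alternative to the normal expert is not reproduced here):

    import numpy as np
    from scipy.stats import norm

    # Normal-expert MoE: soft-max gates mix Gaussian regression experts.
    # All parameter values are illustrative. Robust variants replace norm.pdf
    # with a heavy-tailed density (e.g., Student's t) to downweight outliers.
    gate_w = np.array([[1.0], [-1.0]]); gate_b = np.array([0.0, 0.0])  # gate params
    exp_w  = np.array([[2.0], [-0.5]]); exp_b  = np.array([0.0, 1.0])  # expert means
    sigma  = np.array([0.5, 0.8])                                      # expert scales

    def moe_density(y, x):
        logits = gate_w @ x + gate_b
        gates = np.exp(logits - logits.max())
        gates /= gates.sum()                  # soft-max gating weights
        means = exp_w @ x + exp_b             # each expert's conditional mean
        return float(gates @ norm.pdf(y, loc=means, scale=sigma))

    print(moe_density(0.3, np.array([0.5])))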

Chart of the number of search results per year
