Search results for: experts mixture

Number of results: 160772

2008
Arasanathan Thayananthan

The aim of this report is to detail the implementation of a sparse Bayesian Mixture of Experts (ME) [2] for solving a one-to-many regression mapping, based on the relevance vector machine architecture. Our eventual goal is to evaluate the ME framework for human body and hand pose estimation from a monocular view; however, this is left for future work. The application of ME is demonstrated using a t...
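
The snippet below is a minimal, self-contained sketch of why a one-to-many regression mapping calls for a mixture of experts: each expert models one branch of the multi-valued target and the gate gives the branch probabilities, so p(y|x) = sum_k g_k(x) N(y; f_k(x), sigma_k^2). It is not the report's sparse Bayesian, RVM-based implementation; the toy data and all names are illustrative assumptions.

```python
# Toy mixture-of-experts regression for a one-to-many mapping (illustrative only;
# not the cited report's sparse Bayesian / RVM implementation).
import numpy as np

rng = np.random.default_rng(0)

# Toy one-to-many data: for each x, y = +sqrt(x) or y = -sqrt(x) with equal probability.
x = rng.uniform(0.0, 1.0, size=200)
y = np.sqrt(x) * rng.choice([-1.0, 1.0], size=200)

# Hand-specified two-expert mixture (learning these parameters is the hard part).
experts = [lambda t: np.sqrt(t), lambda t: -np.sqrt(t)]
gate = lambda t: np.array([0.5, 0.5])   # input-independent gate in this toy example
sigma = 0.05

def sample_prediction(x0):
    # Draw a branch from the gate, then a noisy prediction from that expert.
    # A single-Gaussian regressor would instead average the branches to ~0.
    k = rng.choice(len(experts), p=gate(x0))
    return experts[k](x0) + sigma * rng.normal()

samples = [sample_prediction(0.64) for _ in range(5)]   # values near +0.8 or -0.8
```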

Journal: CoRR, 2018
Ashok Vardhan Makkuva, Sreeram Kannan, Pramod Viswanath

Mixture-of-Experts (MoE) is a widely used neural network architecture and a basic building block of highly successful modern neural networks, for example Gated Recurrent Units (GRU) and Attention networks. However, despite this empirical success, finding an efficient and provably consistent algorithm to learn the parameters has remained a long-standing open problem for more than two decades. I...
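
As a hedged illustration of the combination rule such MoE layers use (a softmax gate weighting the expert outputs, y = sum_k g_k(x) f_k(x)), here is a minimal NumPy sketch; the class and parameter names are invented for this example and do not come from the cited paper.

```python
# Minimal mixture-of-experts layer sketch (illustrative only).
# A gating network produces softmax weights over K experts; the layer output is
# the gate-weighted sum of the expert outputs.
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class MixtureOfExperts:
    def __init__(self, in_dim, out_dim, n_experts, seed=0):
        rng = np.random.default_rng(seed)
        # Each expert is a linear map; the gate is a linear map followed by softmax.
        self.W_experts = rng.normal(scale=0.1, size=(n_experts, in_dim, out_dim))
        self.W_gate = rng.normal(scale=0.1, size=(in_dim, n_experts))

    def forward(self, x):                       # x: (batch, in_dim)
        gates = softmax(x @ self.W_gate)        # (batch, n_experts)
        expert_out = np.einsum('bi,kio->bko', x, self.W_experts)  # (batch, K, out_dim)
        return np.einsum('bk,bko->bo', gates, expert_out)

moe = MixtureOfExperts(in_dim=4, out_dim=2, n_experts=3)
y = moe.forward(np.random.default_rng(1).normal(size=(5, 4)))   # shape (5, 2)
```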

2006
Charles A. Sutton, Michael Sindelar, Andrew McCallum

Discriminative probabilistic models are very popular in NLP because of the latitude they afford in designing features. But training involves complex trade-offs among weights, which can be dangerous: a few highly indicative features can swamp the contribution of many individually weaker features, causing their weights to be undertrained. Such a model is less robust, for the highly indicative feat...

2003
S. S. Airey, Mark J. F. Gales

Distributed representations allow the effective number of Gaussian components in a mixture model, or state of an HMM, to be increased without dramatically increasing the number of model parameters. Various forms of distributed representation have previously been investigated. In this work it is shown that the product of experts (PoE) framework may be viewed as a distributed representation when the...

Journal: Computer Speech & Language, 2006
Mark J. F. Gales, S. S. Airey

Recently there has been interest in the use of classifiers based on the product of experts (PoE) framework. PoEs offer an alternative to the standard mixture of experts (MoE) framework: a PoE may be viewed as examining the intersection of a series of experts, rather than the union as in the MoE framework. This paper presents a particular implementation of PoEs, the normalised product of Gaussians ...
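
To make the union-versus-intersection contrast concrete, the toy sketch below combines two univariate Gaussian experts both ways: as a mixture (a weighted sum of the expert densities, which can stay bimodal) and as a renormalised product, whose precision is the sum of the expert precisions and whose mean is their precision-weighted average. This is only an assumed illustrative setup, not the paper's normalised product-of-Gaussians classifier.

```python
# Mixture ("union") vs. product ("intersection") of two univariate Gaussian experts.
# Illustrative only; not the paper's normalised product-of-Gaussians implementation.
import numpy as np

def gaussian_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def mixture(x, params, weights):
    # MoE-style combination: weighted sum of the expert densities.
    return sum(w * gaussian_pdf(x, mu, var) for w, (mu, var) in zip(weights, params))

def product_of_gaussians(params):
    # The renormalised product of Gaussians is Gaussian: precisions add, and the
    # mean is the precision-weighted average of the expert means.
    precisions = [1.0 / var for _, var in params]
    var = 1.0 / sum(precisions)
    mu = var * sum(p * m for p, (m, _) in zip(precisions, params))
    return mu, var

experts = [(0.0, 1.0), (2.0, 0.5)]
x = np.linspace(-4, 6, 200)
moe_density = mixture(x, experts, weights=[0.5, 0.5])    # possibly bimodal "union"
poe_mu, poe_var = product_of_gaussians(experts)           # unimodal "intersection"
```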

Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 1997

Journal: IEICE Transactions on Information and Systems, 2012

[Chart: number of search results per year]