Search results for: experts mixture

Number of results: 160772

1995
K. Hering, R. Haupt, Th. Villmann

The partitioning of complex processor models on the gate and register-transfer level for parallel functional simulation based on the clock-cycle algorithm is considered. We introduce a hierarchical partitioning scheme combining various partitioning algorithms in the frame of a competing strategy. Melting together the different partitioning results within one level using superpositions we crossov...

Journal: Kybernetika, 1998
Jirí Grim

Recently a new and interesting neural network architecture called "mixture of experts" has been proposed as a tool for real multivariate approximation or classification. It is shown that, in some cases, the underlying prediction problem can be solved by estimating the joint probability density of the involved variables. Assuming a Gaussian mixture model, we can explicitly write the optimal mi...
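
The Gaussian-mixture route to prediction described in this abstract can be made concrete: given a known bivariate mixture p(x, y), the least-squares-optimal predictor E[y | x] is a responsibility-weighted average of the components' conditional means. A minimal sketch follows; all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Known bivariate Gaussian mixture p(x, y) = sum_k w_k N([x, y]; mu_k, S_k).
# Illustrative parameters (two components), not from the paper.
weights = np.array([0.6, 0.4])
means = np.array([[0.0, 1.0], [3.0, -1.0]])       # [mu_x, mu_y] per component
covs = np.array([[[1.0, 0.5], [0.5, 1.0]],
                 [[1.0, -0.3], [-0.3, 0.5]]])

def gaussian_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def predict(x):
    # Posterior responsibility of each component given x (marginal in x).
    px = np.array([gaussian_pdf(x, means[k, 0], covs[k, 0, 0])
                   for k in range(len(weights))])
    r = weights * px
    r /= r.sum()
    # Conditional mean of y given x within each component.
    cond_means = np.array([
        means[k, 1] + covs[k, 0, 1] / covs[k, 0, 0] * (x - means[k, 0])
        for k in range(len(weights))
    ])
    # E[y | x] is the responsibility-weighted mixture of conditional means.
    return float(r @ cond_means)

print(predict(0.0))
```

Near x = 0 the first component dominates, so the prediction approaches that component's conditional mean of y; far into the second component's region the predictor smoothly switches over, which is exactly the "mixture of experts" behaviour of the regression function.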

2008
Chunping Wang, Xuejun Liao, Lawrence Carin, David B. Dunson

A non-parametric hierarchical Bayesian framework is developed for designing a sophisticated classifier based on a mixture of simple (linear) classifiers. Each simple classifier is termed a local “expert”, and the number of experts and their construction are manifested via a Dirichlet process formulation. The simple form of the “experts” allows direct handling of incomplete data. The model is fu...

2012
Sam Mavandadi, Steve Feng, Frank Yu, Stoyan Dimitrov, Karin Nielsen-Saines, William R. Prescott, Aydogan Ozcan

We propose a methodology for digitally fusing diagnostic decisions made by multiple medical experts in order to improve accuracy of diagnosis. Toward this goal, we report an experimental study involving nine experts, where each one was given more than 8,000 digital microscopic images of individual human red blood cells and asked to identify malaria infected cells. The results of this experiment...
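
The general idea of fusing binary diagnoses ("infected" = 1, "healthy" = 0) from several experts can be sketched with a plain accuracy-weighted vote. This is an assumed stand-in for illustration, not the specific fusion rule of the study; the votes and accuracy values below are hypothetical.

```python
import numpy as np

def fuse(decisions, accuracies):
    """Fuse binary expert votes with log-odds weights derived from
    each expert's (assumed known) accuracy."""
    p = np.clip(accuracies, 1e-6, 1 - 1e-6)
    w = np.log(p / (1 - p))                      # log-odds weight per expert
    score = np.dot(w, 2 * np.asarray(decisions) - 1)  # votes mapped to {-1, +1}
    return int(score > 0)

votes = [1, 1, 0, 1, 0, 0, 1, 1, 1]              # nine hypothetical experts
acc   = [0.9, 0.8, 0.6, 0.85, 0.55, 0.5, 0.9, 0.7, 0.75]
print(fuse(votes, acc))
```

Weighting by log-odds means an expert at 50% accuracy contributes nothing, while highly accurate experts can outvote several mediocre ones, which is the basic intuition behind accuracy-aware decision fusion.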

2007
Colin Fyfe, Wei Chuang Ooi, Hanseok Ko

We review a new form of self-organizing map which is based on a nonlinear projection of latent points into data space, identical to that performed in the Generative Topographic Mapping (GTM) [1]. But whereas the GTM is an extension of a mixture of experts, this model is an extension of a product of experts [6]. We show visualisation and clustering results on a data set composed of video data of...
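
The product-of-experts combination mentioned here has a closed form in the Gaussian case: multiplying Gaussian expert densities yields another Gaussian whose precision (inverse variance) is the sum of the experts' precisions. A minimal sketch with illustrative numbers:

```python
import numpy as np

def product_of_gaussians(means, variances):
    """Combine 1-D Gaussian experts by multiplying their densities:
    precisions add, and the mean is precision-weighted."""
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    var = 1.0 / precisions.sum()            # combined variance
    mean = var * (precisions @ means)       # precision-weighted mean
    return mean, var

m, v = product_of_gaussians([0.0, 2.0], [1.0, 1.0])
print(m, v)   # equal-confidence experts average: mean 1.0, variance 0.5
```

Unlike a mixture of experts, which averages densities, the product sharpens them: the combined distribution is always at least as confident as the most confident expert, which is why products of experts behave so differently from mixtures in models like the one reviewed above.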

Journal: Int. J. Hybrid Intell. Syst., 2012
Reza Ebrahimpour, Naser Sadeghnejad, Saeed Masoudnia, Seyed Ali Asghar AbbasZadeh Arani

A modified version of Boosted Mixture of Experts (BME) for low-resolution face recognition is presented in this paper. Most of the methods developed for low-resolution face recognition have focused on improving the resolution of face images and/or on special feature extraction methods that can deal effectively with the low-resolution problem. However, we focused on the classification step of face recogniti...

2013
Rahul Kala

Machine learning and pattern recognition play a vital role in the field of biomedical engineering, where the task is to identify or classify a disease based on a set of observations. The inability of a single method to effectively solve the problem gives rise to the use of multiple models for solving the same problem in a 'Mixture of Experts' mode. Further, the data may be too large for any system ...

2013
Narayana Rao

both these pieces of information into a unified system, which ct to other objects. The model, spatial expert, temporal. test speed compared to that of the generative method. But most of these models are not efficient with high-dimensional data. In our work we have used Gaussian Processes within both discriminative and generative frameworks for human pose/motion estimation, along with some na...

Journal: CoRR, 2017
Hien D. Nguyen, Faicel Chamroukhi

Mixture-of-experts (MoE) models are a powerful paradigm for modeling of data arising from complex data generating processes (DGPs). In this article, we demonstrate how different MoE models can be constructed to approximate the underlying DGPs of arbitrary types of data. Due to the probabilistic nature of MoE models, we propose the maximum quasilikelihood (MQL) estimator as a method for estimati...
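
The generic MoE forward pass underlying such models can be sketched briefly: a softmax gate mixes the outputs of K simple experts, here linear ones. This illustrates the standard MoE form only, not the article's maximum quasilikelihood estimation procedure; all parameters are randomly initialized for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
K, d = 3, 2                        # number of experts, input dimension
W_gate = rng.normal(size=(K, d))   # gating-network weights (illustrative)
W_exp = rng.normal(size=(K, d))    # one linear expert per row (illustrative)

def moe_predict(x):
    # Softmax gate: input-dependent mixing weights over the experts.
    logits = W_gate @ x
    gate = np.exp(logits - logits.max())
    gate /= gate.sum()
    # Each expert makes its own prediction; the gate blends them.
    expert_out = W_exp @ x
    return float(gate @ expert_out)

print(moe_predict(np.array([1.0, -0.5])))
```

Because the gate depends on the input, different experts dominate in different regions of the input space, which is what lets an MoE approximate the heterogeneous data-generating processes the abstract refers to.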

Journal: CoRR, 2014
Jun Wei Ng, Marc Peter Deisenroth

We propose a practical and scalable Gaussian process model for large-scale nonlinear probabilistic regression. Our mixture-of-experts model is conceptually simple and hierarchically recombines computations for an overall approximation of a full Gaussian process. Closed-form and distributed computations allow for efficient and massive parallelisation while keeping the memory consumption small. G...
