Search results for: experts mixture

Number of results: 160772

Journal: Neural Networks, 1995
Michael I. Jordan, Lei Xu

The Expectation-Maximization (EM) algorithm is an iterative approach to maximum likelihood parameter estimation. Jordan and Jacobs (1994) recently proposed an EM algorithm for the mixture of experts architecture of Jacobs, Jordan, Nowlan and Hinton (1991) and the hierarchical mixture of experts architecture of Jordan and Jacobs (1992). They showed empirically that the EM algorithm for these arc...
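As a concrete illustration of the generic E-step/M-step alternation described above, here is a minimal sketch for a two-component 1-D Gaussian mixture. This is not the mixture-of-experts EM of Jordan and Jacobs (1994); the data, initialization, and iteration count are illustrative assumptions.

```python
import numpy as np

# Synthetic data: two Gaussian clusters at -2 and 3 (illustrative).
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 200)])

# Initial guesses for mixing weights, means, and standard deviations.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
sigma = np.array([1.0, 1.0])

for _ in range(50):
    # E-step: posterior responsibility of each component for each point.
    dens = np.exp(-0.5 * ((data[:, None] - mu) / sigma) ** 2) / (
        sigma * np.sqrt(2 * np.pi)
    )
    resp = pi * dens
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the weighted data.
    nk = resp.sum(axis=0)
    pi = nk / len(data)
    mu = (resp * data[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk)

print(np.sort(mu))  # means converge near the true values -2 and 3
```

Each iteration weakly increases the data log-likelihood, which is the property the convergence analyses in these papers study.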

1998
Amir Karniel, Ron Meir, Gideon F. Inbar

Feed-forward control schemes require an inverse mapping of the controlled system. In adaptive systems, as well as in biological modeling, this inverse mapping is learned from examples. Biological motor control is highly redundant, as are many robotic systems, implying that the inverse problem is ill posed. In this work a new architecture and algorithm for learning multiple inverses is proposed,...

2007
J. Peres, R. Oliveira, S. Feyo de Azevedo

This paper presents a novel method for bioprocess hybrid parametric/nonparametric modelling based on mixture of experts (ME) and the Expectation Maximisation (EM) algorithm. The bioreactor system is described by material balance equations, whereas the cell population subsystem is described by an adjustable mixture of parametric/nonparametric sub-models inspired by the ME architecture. This idea ...

Journal: IEEE Transactions on Neural Networks, 2000
Srinivas Gutta, Jeffrey Huang, P. Jonathon Phillips, Harry Wechsler

In this paper we describe the application of mixtures of experts to gender, ethnic, and pose classification of human faces, and show their feasibility on the FERET database of facial images. The FERET database allows us to demonstrate performance on hundreds or thousands of images. The mixture of experts is implemented using the "divide and conquer" modularity principle with r...

2017
Sonali Parbhoo, Jasmina Bogojeska, Maurizio Zazzi, Volker Roth, Finale Doshi-Velez

We present a mixture-of-experts approach for HIV therapy selection. The heterogeneity in patient data makes it difficult for one particular model to succeed at providing suitable therapy predictions for all patients. An appropriate means for addressing this heterogeneity is through combining kernel and model-based techniques. These methods capture different kinds of information: kernel-based me...

Journal: Neural Networks (official journal of the International Neural Network Society), 2003
Keisuke Yamazaki, Sumio Watanabe

A learning machine which is a mixture of several distributions, for example, a Gaussian mixture or a mixture of experts, has a wide range of applications. However, such a machine is a non-identifiable statistical model with many singularities in the parameter space, hence its generalization property is left unknown. Recently an algebraic geometrical method has been developed which enables u...

1997
Lei Xu

In this paper we extend Bayesian Kullback Ying-Yang (BKYY) learning into a much broader Bayesian Ying-Yang (BYY) learning system via using different separation functionals instead of using only Kullback divergence, and elaborate the power of BYY learning as a general learning theory for parameter learning, scale selection, structure evaluation, regularization and sampling design, with its relations to...

2007
Reza Ebrahimpour, Ehsanollah Kabir, Mohammad Reza Yousefi

We propose two new models for view-independent face recognition, which fall under the category of multiview approaches. We use the so-called "mixture of experts" (MOE), in which the problem space is divided into several subspaces for the experts, and the outputs of the experts are then combined by a gating network to form the final output. Basically, our focus is on the way that the face space is p...
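The divide-and-combine scheme this abstract describes — expert outputs blended by a gating network — can be sketched minimally as below. The linear experts, softmax gate, and random weights are illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_forward(x, expert_ws, gate_w):
    """Mixture of experts: y = sum_i g_i(x) * f_i(x), gates from a softmax."""
    expert_outs = np.stack([x @ w for w in expert_ws], axis=0)  # (E, d_out)
    gates = softmax(x @ gate_w)                                 # (E,)
    return (gates[:, None] * expert_outs).sum(axis=0)           # (d_out,)

d_in, d_out, n_experts = 4, 2, 3
expert_ws = [rng.normal(size=(d_in, d_out)) for _ in range(n_experts)]
gate_w = rng.normal(size=(d_in, n_experts))
x = rng.normal(size=d_in)
y = moe_forward(x, expert_ws, gate_w)
print(y.shape)  # (2,)
```

Because the gate weights sum to one, the combined output is a convex combination of the expert outputs, which is what lets each expert specialize on its own subspace.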
