Search results for: experts mixture

Number of results: 160772

2013
Francisco A. A. Souza Rui Araújo

This paper addresses the problem of online quality prediction in processes with multiple operating modes. The paper proposes a new method called mixture of partial least squares regression (Mix-PLS), where the solution of the mixture of experts regression is performed using the partial least squares (PLS) algorithm. The PLS is used to tune the model experts and the gate parameters. The solution...
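The Mix-PLS fitting procedure itself is not reproduced here, but the mixture-of-experts prediction it builds on can be sketched generically. The sketch below assumes linear experts and a softmax gate with hypothetical parameters; Mix-PLS would estimate these parameters via the PLS algorithm rather than as shown here.

```python
import numpy as np

def moe_predict(x, expert_weights, gate_weights):
    """Generic mixture-of-experts regression prediction: each expert is a
    linear model, and a softmax gate assigns input-dependent mixing weights.
    Illustrative only -- Mix-PLS tunes these parameters with PLS instead."""
    expert_outputs = np.array([w @ x for w in expert_weights])  # one prediction per expert
    logits = np.array([v @ x for v in gate_weights])
    gates = np.exp(logits - logits.max())
    gates /= gates.sum()                     # softmax mixing weights
    return float(gates @ expert_outputs)     # gated combination

# Two hypothetical experts over a 2-d input; the gate strongly favors expert 1
x = np.array([1.0, 2.0])
experts = [np.array([0.5, 0.0]), np.array([0.0, 1.0])]
gate = [np.array([10.0, 0.0]), np.array([0.0, 0.0])]
y = moe_predict(x, experts, gate)
```

In a multi-mode process, the gate would learn to route each operating mode to the expert that models it best.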

Journal: :Computational Statistics & Data Analysis 2021

Abstract: The mixture of linear experts (MoE) model is one of the most widespread statistical frameworks for modeling, classification, and clustering of data. Built on the normality assumption of the error terms for mathematical and computational convenience, the classical MoE faces two challenges: (1) it is sensitive to atypical observations and outliers, and (2) it might produce misleading inferential results for censored data. The aim is then to resolve these c...

2016
Abhinav Valada Ankit Dhall Wolfram Burgard

Robust scene understanding of outdoor environments using passive optical sensors is a critical problem characterized by changing conditions throughout the day and across seasons. The perception models on a robot should be able to learn features impervious to these factors in order to be operable in the real world. In this paper, we propose a convoluted mixture of deep experts (CMoDE) model that en...

Journal: :IEEE transactions on systems, man, and cybernetics. Part B, Cybernetics : a publication of the IEEE Systems, Man, and Cybernetics Society 1997
Wassim S. Chaer Robert H. Bishop Joydeep Ghosh

This paper proposes a modular and flexible approach to adaptive Kalman filtering using the framework of a mixture-of-experts regulated by a gating network. Each expert is a Kalman filter modeled with a different realization of the unknown system parameters such as process and measurement noise. The gating network performs on-line adaptation of the weights given to individual filter estimates ba...
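The gating idea in this abstract can be illustrated with a minimal multiple-model sketch: a bank of scalar Kalman filters, each assuming a different measurement-noise level, re-weighted online by each filter's measurement likelihood. All parameters below (noise hypotheses, random-walk state model) are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def kf_step(x, P, z, q, r):
    """One scalar Kalman filter step for a random-walk state (F = H = 1),
    returning the updated state, covariance, and measurement likelihood."""
    P_pred = P + q                       # predict covariance
    S = P_pred + r                       # innovation variance
    K = P_pred / S                       # Kalman gain
    innov = z - x
    like = np.exp(-0.5 * innov**2 / S) / np.sqrt(2 * np.pi * S)
    return x + K * innov, (1 - K) * P_pred, like

# Bank of two filters with different assumed measurement noise r;
# the "gate" re-weights them by each filter's measurement likelihood.
states = [0.0, 0.0]
covs = [1.0, 1.0]
weights = np.array([0.5, 0.5])
rs = [0.1, 10.0]                         # hypothetical noise hypotheses
rng = np.random.default_rng(0)
for _ in range(50):
    z = 1.0 + rng.normal(scale=np.sqrt(0.1))   # truth matches r = 0.1
    likes = []
    for i, r in enumerate(rs):
        states[i], covs[i], li = kf_step(states[i], covs[i], z, 1e-4, r)
        likes.append(li)
    weights = weights * np.array(likes)  # Bayesian re-weighting by likelihood
    weights /= weights.sum()
fused = float(weights @ states)          # weight concentrates on filter 0
```

The likelihood-based weighting plays the role of the gating network: the filter whose noise hypothesis matches the data dominates the fused estimate.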

1998
Craig L. Fancourt Jose C. Principe

The Mixture of Experts, as it was originally formulated, is a static algorithm in the sense that the output of the network, and parameter updates during training, are completely independent from one time step to the next. This independence creates difficulties when the model is applied to time series prediction. We address this by adding memory to the Mixture of Experts. A Gaussian assumption o...

1997
Ajit V. Rao David J. Miller Kenneth Rose Allen Gersho

A new and effective design method is presented for statistical regression functions that belong to the class of mixture models. The class includes the hierarchical mixture of experts (HME) and the normalized radial basis functions (NRBF). Design algorithms based on the maximum likelihood (ML) approach, which emphasize a probabilistic description of the model, have attracted much interest in HME ...

2005
S. MEENAKSHISUNDARAM S. S. DLAY W. L. WOO

In this paper, we introduce a new classification kernel by embedding self-organizing map (SOM) clustering with a mixture of radial basis function (RBF) networks. The model's efficacy is demonstrated in solving a multi-class TIMIT speech recognition problem, where the kernel is used to learn the multidimensional cepstral feature vectors to estimate their posterior class probabilities. The tests res...

2011
Derya Ozkan Louis-Philippe Morency

In many computational linguistic scenarios, training labels are subjective, making it necessary to acquire the opinions of multiple annotators/experts, which is referred to as the "wisdom of crowds". In this paper, we propose a new approach for modeling the wisdom of crowds based on the Latent Mixture of Discriminative Experts (LMDE) model that can automatically learn the prototypical patterns and hidd...

2017
O. Fatih Kilic M. Omer Sayin Suleyman S. Kozat

We introduce a new combination approach for the mixture of adaptive filters based on the set-membership filtering (SMF) framework. We perform SMF to combine the outputs of several parallel running adaptive algorithms and propose unconstrained, affinely constrained and convexly constrained combination weight configurations. Here, we achieve better trade-off in terms of the transient and steady-s...
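The set-membership combination rule from this abstract is not shown here; as a hedged illustration of combining parallel-running adaptive filters, the sketch below uses a plain convex combination of two LMS filters whose mixing weight is adapted by stochastic gradient (the classical convex-combination scheme, not SMF). The step sizes and the toy system are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
w_true = np.array([0.8, -0.4])          # unknown 2-tap system to identify
w1 = np.zeros(2)                        # fast LMS filter (large step size)
w2 = np.zeros(2)                        # slow LMS filter (small step size)
mu1, mu2 = 0.5, 0.01
lam = 0.0                               # mixture parameter; sigmoid maps it to (0, 1)
mu_lam = 1.0
for _ in range(500):
    u = rng.normal(size=2)              # input regressor
    d = w_true @ u + rng.normal(scale=0.01)   # noisy desired signal
    y1, y2 = w1 @ u, w2 @ u
    w1 += mu1 * (d - y1) * u            # independent LMS updates
    w2 += mu2 * (d - y2) * u
    eta = 1 / (1 + np.exp(-lam))        # convex combination weight
    y = eta * y1 + (1 - eta) * y2       # combined output
    # stochastic-gradient descent on the combined squared error
    lam += mu_lam * (d - y) * (y1 - y2) * eta * (1 - eta)
w_comb = eta * w1 + (1 - eta) * w2      # combined filter estimate
```

The combination inherits the fast filter's transient speed and the slow filter's low steady-state error; constrained variants (affine, convex, unconstrained) differ in how the mixing weights are restricted.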

2003
Mayte Suárez-Fariñas Carlos Eduardo Pedreira

In this paper we investigate mixture of experts problems in the context of Local-Global Neural Networks. This type of architecture was originally conceived for functional approximation and interpolation problems. Numerical experiments are presented, showing quite good solutions. Because of its local characteristics, this type of approach offers the advantage of improved interpretability.
