Search results for: expectation maximization em algorithm

Number of results: 1,080,815

2010
Nicola Greggio, Alexandre Bernardino, José Santos-Victor

In this work we propose a clustering algorithm that learns a finite Gaussian mixture model on-line from multivariate data, based on the expectation maximization approach. Convergence to the right number of components, as well as to their means and covariances, is achieved without requiring any careful initialization. Our methodology starts from a single mixture component covering the whole data s...
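For readers skimming these results, a minimal batch EM update for a fixed-size Gaussian mixture is sketched below; it is not the authors' on-line, self-initializing variant, and the data matrix X, the component count K, and the helper names are assumptions made purely for illustration.

import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iter=100, seed=0):
    """Minimal batch EM for a K-component Gaussian mixture (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Naive initialization (the paper's contribution is precisely to avoid this need).
    weights = np.full(K, 1.0 / K)
    means = X[rng.choice(n, K, replace=False)]
    covs = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(K)])
    for _ in range(n_iter):
        # E step: posterior responsibility of each component for each point.
        resp = np.column_stack([
            weights[k] * multivariate_normal.pdf(X, means[k], covs[k])
            for k in range(K)
        ])
        resp /= resp.sum(axis=1, keepdims=True)
        # M step: re-estimate weights, means and covariances from the responsibilities.
        Nk = resp.sum(axis=0)
        weights = Nk / n
        means = (resp.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - means[k]
            covs[k] = (resp[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(d)
    return weights, means, covs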

Journal: Computer Vision and Image Understanding, 2017
Carl-Magnus Svensson, Karen Grace Bondoc, Georg Pohnert, Marc Thilo Figge

To solve the task of segmenting clusters of nearly identical objects, we here present the template rotation expectation maximization (TREM) approach, which is based on a generative model. We explore both a non-linear optimization approach for maximizing the log-likelihood and a modification of the standard expectation maximization (EM) algorithm. The non-linear approach is strict template matching...

Journal: Statistics and Computing, 1997
Marc Lavielle, Eric Moulines

The Expectation-Maximization (EM) algorithm is a very popular technique for maximum likelihood estimation in incomplete data models. When the expectation step cannot be performed in closed form, a stochastic approximation of EM (SAEM) can be used. Under very general conditions, the authors have shown that the attractive stationary points of the SAEM algorithm correspond to the global and local ...
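As a hedged illustration of the stochastic approximation idea (not the authors' general SAEM formulation), the E-step expectation can be replaced by simulating the latent labels and averaging the complete-data sufficient statistics with a decreasing gain; the two-component Gaussian mixture with unit variances and equal weights below, and all names in it, are assumptions chosen to keep the sketch short.

import numpy as np

def saem_two_gaussians(x, n_iter=200, seed=0):
    """Illustrative SAEM for a 0.5/0.5 mixture of N(mu0, 1) and N(mu1, 1)."""
    rng = np.random.default_rng(seed)
    mu = np.array([x.min(), x.max()])           # crude starting values
    # Stochastic approximations of the complete-data sufficient statistics.
    s_count = np.zeros(2)
    s_sum = np.zeros(2)
    for t in range(1, n_iter + 1):
        # Simulation step: draw the latent labels from their current posterior.
        logp = -0.5 * (x[:, None] - mu[None, :]) ** 2
        p1 = 1.0 / (1.0 + np.exp(np.clip(logp[:, 0] - logp[:, 1], -700, 700)))
        z = rng.random(x.size) < p1             # True -> component 1
        counts = np.array([(~z).sum(), z.sum()], dtype=float)
        sums = np.array([x[~z].sum(), x[z].sum()])
        # Stochastic approximation step with decreasing gain gamma_t = 1/t.
        gamma = 1.0 / t
        s_count += gamma * (counts - s_count)
        s_sum += gamma * (sums - s_sum)
        # M step: maximize the complete-data likelihood given the statistics.
        mu = s_sum / np.maximum(s_count, 1e-12)
    return mu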

Journal: IEEE Trans. Automat. Contr., 2000
Charalambos D. Charalambous, Andrew Logothetis

This paper is concerned with maximum likelihood (ML) parameter estimation of continuous-time nonlinear partially observed stochastic systems, via the expectation maximization (EM) algorithm. It is shown that the EM algorithm can be executed efficiently, provided the unnormalized conditional density of nonlinear filtering is either explicitly solvable or numerically implemented. The methodology ...

Journal: Neurocomputing, 2007
Ole Winther, Kaare Brandt Petersen

In this paper we present an empirical Bayes method for flexible and efficient Independent Component Analysis (ICA). The method is flexible with respect to choice of source prior, dimensionality and positivity of the mixing matrix, and structure of the noise covariance matrix. The efficiency is ensured using parameter optimizers which are more advanced than the expectation maximization (EM) algo...

2007
Mingfeng Wang, Masahiro Kuroda, Michio Sakakihara, Zhi Geng

The Expectation-Maximization (EM) algorithm is a very general and popular iterative computational algorithm for finding maximum likelihood estimates from incomplete data, and it is broadly applied to statistical analysis with missing data because of its stability, flexibility and simplicity. However, the EM algorithm is often criticized for its slow convergence. The various algorithms to accelerat...

Journal: Neurocomputing, 2004
Mingjun Zhong, Huanwen Tang, Hongjun Chen, Yiyuan Tang

An expectation-maximization (EM) algorithm for learning sparse and overcomplete representations is presented in this paper. We show that the estimation of the conditional moments of the posterior distribution can be accomplished by maximum a posteriori estimation. The approximate conditional moments enable the development of an EM algorithm for learning the overcomplete basis vectors and inferr...

2000
Mário A. T. Figueiredo, Anil K. Jain

We propose a new method for fitting mixture models that performs component selection and does not require external initialization. The novelty of our approach includes: a minimum message length (MML) type model selection criterion; the inclusion of the criterion into the expectation-maximization (EM) algorithm (which also increases its ability to escape from local maxima); an initialization str...

2005
Kevin Duh

In this lecture, we will address problems 3 and 4. First, continuing from the previous lecture, we will view Baum-Welch re-estimation as an instance of the Expectation-Maximization (EM) algorithm and prove why the EM algorithm maximizes data likelihood. Then, we will proceed to discuss discriminative training under the maximum mutual information estimation (MMIE) framework. Specifically, we will...

2002
Shane M. Haas

The Expectation-Maximization (EM) algorithm is a hill-climbing approach to finding a local maximum of a likelihood function [7, 8]. The EM algorithm alternates between finding a greatest lower bound to the likelihood function (the “E Step”), and then maximizing this bound (the “M Step”). The EM algorithm belongs to a broader class of alternating minimization algorithms [6], which includes the A...
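In standard notation, the bound that the two steps alternate over can be written as follows; this is a sketch of the usual derivation rather than a quotation from the cited notes, and the symbols are the conventional ones for observed data x, latent variables z and parameters theta.

% For any distribution q over the latent variables z, Jensen's inequality
% gives a lower bound on the log-likelihood:
\log p(x \mid \theta)
  \;\ge\; \mathcal{F}(q, \theta)
  \;=\; \mathbb{E}_{q(z)}\!\left[\log p(x, z \mid \theta)\right]
        \;-\; \mathbb{E}_{q(z)}\!\left[\log q(z)\right].
% E step: maximize the bound over q; the maximum is attained at the posterior
%   q(z) = p(z \mid x, \theta^{(t)}), which makes the bound tight.
% M step: maximize the bound over \theta with q held fixed,
%   \theta^{(t+1)} = \arg\max_{\theta} \, \mathbb{E}_{q(z)}\!\left[\log p(x, z \mid \theta)\right].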

Chart: number of search results per year