Search results for: expectationmaximization

Number of results: 273

2017
Matthew D. Hoffman

Deep latent Gaussian models are powerful and popular probabilistic models of high-dimensional data. These models are almost always fit using variational expectation-maximization, an approximation to true maximum-marginal-likelihood estimation. In this paper, we propose a different approach: rather than use a variational approximation (which produces biased gradient signals), we use Markov chain M...
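
Since expectation-maximization is the common thread of the results in this list, the following is a minimal, generic EM sketch for a one-dimensional Gaussian mixture in Python/NumPy. It only illustrates the E-step/M-step alternation, not the variational or MCMC machinery discussed in the paper above; the function name and all parameter values are invented for this example.

```python
import numpy as np

def em_gmm_1d(x, k=2, n_iter=100, seed=0):
    """Plain EM for a 1-D Gaussian mixture (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialise means from random data points, unit-scale variances, uniform weights.
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, np.var(x))
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        log_p = (-0.5 * (x[:, None] - mu) ** 2 / var
                 - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        log_p -= log_p.max(axis=1, keepdims=True)
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances from the responsibilities.
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Toy usage: two well-separated clusters.
data = np.concatenate([np.random.normal(-2, 0.5, 500),
                       np.random.normal(3, 1.0, 500)])
print(em_gmm_1d(data))
```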

2005
Markus Louw Fred Nicolls

This paper describes an approximate expectation-maximization (EM) formulation of a homographical iterative closest point registration approach (henceforth HICP). We show that such an EM approach allows the algorithm to converge faster and more robustly in the presence of noise. Although this algorithm can register points transformed by a more general set of linear transformations than the ori...
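
As a rough illustration of the soft-correspondence idea behind EM-style point registration, here is a deliberately simplified sketch: the E-step weights every source-target pair with a Gaussian kernel and the M-step re-estimates a pure translation by weighted least squares. This is not the homographical HICP formulation of the paper; the function name and parameters (sigma, n_iter) are assumptions for the example.

```python
import numpy as np

def em_icp_translation(src, tgt, sigma=1.0, n_iter=50):
    """Soft-correspondence ICP with a translation-only M-step (simplified sketch)."""
    t = np.zeros(src.shape[1])
    for _ in range(n_iter):
        moved = src + t
        # E-step: Gaussian affinity between every moved source point and every target point.
        d2 = ((moved[:, None, :] - tgt[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / (2 * sigma ** 2))
        w /= w.sum(axis=1, keepdims=True)      # soft assignment per source point
        # M-step: translation minimising the weighted sum of squared distances.
        virtual = w @ tgt                      # weighted "virtual" correspondences
        t = (virtual - src).mean(axis=0)
    return t

src = np.random.rand(100, 2)
tgt = src + np.array([0.3, -0.2])
print(em_icp_translation(src, tgt))            # recovers approximately [0.3, -0.2]
```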

2000
James Orwell Darrel Greenhill Jonathan D. Rymel Graeme A. Jones

Point Distribution Models are useful tools for modelling the variability of particular classes of shapes. A common approach is to apply a Principal Component Analysis to the data to reduce the dimensionality of the representation. However, a single multivariate Gaussian model of the probability density, estimated from the principal covariances, can be substantially inaccurate. In this paper, w...
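
The PCA step this abstract refers to is standard; below is a minimal sketch of fitting a linear Point Distribution Model by PCA (mean shape plus leading modes of variation). It does not reproduce the paper's improved density model, and the helper name pca_shape_model is invented.

```python
import numpy as np

def pca_shape_model(shapes, n_modes=5):
    """Fit a linear Point Distribution Model by PCA (standard step, not the paper's refinement).

    shapes: (n_samples, n_coords) array, each row a flattened landmark configuration.
    Returns the mean shape, the leading modes of variation and their variances.
    """
    mean = shapes.mean(axis=0)
    centred = shapes - mean
    # SVD of the centred data gives the principal components directly.
    _, s, vt = np.linalg.svd(centred, full_matrices=False)
    variances = s ** 2 / (len(shapes) - 1)
    return mean, vt[:n_modes], variances[:n_modes]

# Toy usage: 50 random "shapes" with 20 landmarks (40 coordinates each).
shapes = np.random.randn(50, 40)
mean, modes, var = pca_shape_model(shapes, n_modes=3)
print(modes.shape, var)   # (3, 40) and the three largest variances
```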

2008
D. Novák

In recent years, Hidden Markov Models (HMM) have been increasingly applied in data mining applications. However, most authors have used the classical Expectation-Maximization (EM) optimization scheme. A new method of HMM learning based on Particle Swarm Optimization (PSO) has been developed. Along with other global approaches such as Simulated Annealing (SIM) and Genetic Algorithms (GA), the following l...
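
For readers unfamiliar with the PSO side of this comparison, the following is a generic particle swarm optimiser for an arbitrary objective; to train an HMM one would plug in a negative log-likelihood as the objective. It is not the specific learning scheme of the paper, and all parameter values are illustrative defaults.

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, n_iter=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic particle swarm optimisation of f: R^dim -> R (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1, 1, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([f(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia plus pulls towards personal and global bests.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([f(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Toy usage: minimise a shifted sphere function; for HMM training one would
# instead return the negative log-likelihood of the data given the parameters.
print(pso_minimize(lambda x: ((x - 0.5) ** 2).sum(), dim=4))
```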

2000
Charles A. Bouman Sea Chen Mark J. Lowe

In this paper, we introduce a method for estimating the statistically distinct neural responses in a sequence of functional magnetic resonance images (fMRI). The crux of our method is a technique which we call clustered component analysis (CCA). Clustered component analysis is a method for identifying the distinct component vectors in a multivariate data set. CCA is distinct from principal com...

2013
David K. Gifford Shaun Mahony Chris Reeder Matt Edwards Yuchun Guo Jeanne Darling

Recent advancements in sequencing technology have made it possible to study the mechanisms of gene regulation, such as protein-DNA binding, at greater resolution and on a greater scale than was previously possible. We present an expectation-maximization learning algorithm that identifies enriched spatial relationships between motifs in sets of DNA sequences. For example, the method will identify...

2005
Michaël Aupetit

Given a set of points and a set of prototypes representing them, how can we create a graph of the prototypes whose topology accounts for that of the points? This problem has not yet been explored in the framework of statistical learning theory. In this work, we propose a generative model based on the Delaunay graph of the prototypes and the Expectation-Maximization algorithm to learn the parameters....

1998
Shiro Ikeda Shun-ichi Amari Hiroyuki Nakahara

The W-S (Wake-Sleep) algorithm is a simple learning rule for models with hidden variables. It has been shown that this algorithm can be applied to a factor analysis model, which is a linear version of the Helmholtz machine. However, even for a factor analysis model, general convergence has not been proved theoretically. In this article, we describe a geometrical understanding of the W-S algorithm in co...
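
For context, the textbook EM algorithm for a single factor analysis model (the baseline Wake-Sleep is usually compared against) can be sketched as follows. This is the standard closed-form update, not the geometrical analysis of the article; the function name and toy data are invented.

```python
import numpy as np

def em_factor_analysis(X, k=2, n_iter=200, seed=0):
    """Classical EM for a factor analysis model x = Lz + noise (textbook sketch)."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=0)
    n, d = X.shape
    S = X.T @ X / n                     # sample covariance
    L = rng.standard_normal((d, k))     # factor loadings
    psi = np.diag(S).copy()             # diagonal noise variances
    for _ in range(n_iter):
        # E-step: posterior statistics of the latent factors.
        beta = L.T @ np.linalg.inv(L @ L.T + np.diag(psi))   # k x d
        Ezz = np.eye(k) - beta @ L + beta @ S @ beta.T       # E[zz^T] averaged over the data
        # M-step: closed-form updates of loadings and noise variances.
        L = S @ beta.T @ np.linalg.inv(Ezz)
        psi = np.diag(S - L @ beta @ S)
    return L, psi

# Toy usage: data generated from a 2-factor model in 10 dimensions.
Z = np.random.randn(1000, 2)
X = Z @ np.random.randn(2, 10) + 0.1 * np.random.randn(1000, 10)
L, psi = em_factor_analysis(X, k=2)
print(L.shape, psi)
```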

Journal: :CoRR 2009
Behtash Babadi Nicholas Kalouptsidis Vahid Tarokh

We develop a Recursive L1-Regularized Least Squares (SPARLS) algorithm for the estimation of a sparse tap-weight vector in the adaptive filtering setting. The SPARLS algorithm exploits noisy observations of the tap-weight vector output stream and produces its estimate using an Expectation-Maximization-type algorithm. Simulation studies in the context of channel estimation, employing multipath wi...
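
SPARLS itself is a recursive, low-complexity scheme; as a stand-in, here is a batch iterative soft-thresholding sketch for l1-regularised least squares, the kind of EM/shrinkage update the abstract alludes to. The names (ista_sparse_ls, lam, step) and the toy measurement setup are assumptions for the example.

```python
import numpy as np

def ista_sparse_ls(Phi, y, lam=0.1, n_iter=500):
    """Iterative soft-thresholding for l1-regularised least squares
    (a batch stand-in; SPARLS itself uses a recursive, low-complexity update)."""
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2      # 1 / largest eigenvalue of Phi^T Phi
    w = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        # Gradient step on the quadratic data-fit term ...
        r = w + step * Phi.T @ (y - Phi @ w)
        # ... followed by the soft-thresholding (shrinkage) step induced by the l1 penalty.
        w = np.sign(r) * np.maximum(np.abs(r) - step * lam, 0.0)
    return w

# Toy usage: recover a 5-sparse tap-weight vector from noisy linear measurements.
rng = np.random.default_rng(0)
w_true = np.zeros(100)
w_true[rng.choice(100, 5, replace=False)] = rng.standard_normal(5)
Phi = rng.standard_normal((60, 100))
y = Phi @ w_true + 0.01 * rng.standard_normal(60)
print(np.flatnonzero(np.abs(ista_sparse_ls(Phi, y)) > 0.05))
```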

2015
D. Nebel T. Villmann

In this article we consider a median variant of the learning vector quantization (LVQ) classifier for classification of dissimilarity data. However, besides the median aspect, we propose to optimize the receiver operating characteristic (ROC) instead of the classification accuracy. In particular, we present a probabilistic LVQ model with an adaptation scheme based on a generalized Expectation-Ma...
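
The paper's model is a median, dissimilarity-based LVQ trained to optimise ROC; purely as background, the following sketches plain LVQ1 on vectorial data with attract/repel prototype updates. Names and hyperparameters are assumptions for the example.

```python
import numpy as np

def lvq1(X, y, n_prototypes_per_class=1, lr=0.05, n_epochs=30, seed=0):
    """Basic LVQ1 on vectorial data (the paper's median/ROC variant works on dissimilarities)."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    # Initialise prototypes at random training points of each class.
    protos, labels = [], []
    for c in classes:
        idx = rng.choice(np.flatnonzero(y == c), n_prototypes_per_class, replace=False)
        protos.append(X[idx])
        labels.append(np.full(n_prototypes_per_class, c))
    protos, labels = np.vstack(protos), np.concatenate(labels)
    for _ in range(n_epochs):
        for i in rng.permutation(len(X)):
            j = np.argmin(((protos - X[i]) ** 2).sum(axis=1))   # closest prototype
            sign = 1.0 if labels[j] == y[i] else -1.0           # attract if correct, repel if not
            protos[j] += sign * lr * (X[i] - protos[j])
    return protos, labels

# Toy usage: two Gaussian blobs.
X = np.vstack([np.random.randn(100, 2) - 2, np.random.randn(100, 2) + 2])
y = np.array([0] * 100 + [1] * 100)
print(lvq1(X, y)[0])
```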

Chart: number of search results per year
