Search results for: expectationmaximization
Number of results: 273
Deep latent Gaussian models are powerful and popular probabilistic models of high-dimensional data. These models are almost always fit using variational expectation-maximization, an approximation to true maximum-marginal-likelihood estimation. In this paper, we propose a different approach: rather than use a variational approximation (which produces biased gradient signals), we use Markov chain M...
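This snippet contrasts a variational E-step with a Markov-chain one. Below is a minimal Monte Carlo EM sketch on a toy one-parameter linear-Gaussian latent variable model (not the paper's deep model): the E-step moments are estimated from Metropolis samples of the latent variable, and the M-step is closed form. The model, names, and settings are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian latent variable model (illustrative, not the paper's deep model):
#   z ~ N(0, 1),   x | z ~ N(w * z, sigma2),   goal: maximum marginal likelihood for w
sigma2, w_true = 0.5, 2.0
z_true = rng.normal(size=200)
x = w_true * z_true + rng.normal(scale=np.sqrt(sigma2), size=200)

def posterior_samples(x_n, w, n_keep=50, step=0.5):
    """Metropolis sampling of p(z | x_n, w); stands in for a variational E-step."""
    def log_joint(z):
        return -0.5 * z ** 2 - 0.5 * (x_n - w * z) ** 2 / sigma2
    z, out = 0.0, []
    for _ in range(2 * n_keep):               # first half is discarded as burn-in
        prop = z + step * rng.normal()
        if np.log(rng.uniform()) < log_joint(prop) - log_joint(z):
            z = prop
        out.append(z)
    return np.array(out[n_keep:])

w = 0.1                                       # initial guess
for _ in range(20):                           # Monte Carlo EM iterations
    Ez = np.empty_like(x)
    Ez2 = np.empty_like(x)
    for n, x_n in enumerate(x):               # E-step: posterior moments from MCMC samples
        s = posterior_samples(x_n, w)
        Ez[n], Ez2[n] = s.mean(), (s ** 2).mean()
    w = np.sum(Ez * x) / np.sum(Ez2)          # M-step: closed-form maximiser

print("estimated w:", w)                      # approaches w_true up to sign and sampling noise
```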
This paper describes an approximate expectation-maximization (EM) formulation of a homographical iterative closest point registration approach (henceforth HICP). We show that such an EM approach allows the algorithm to converge faster and more robustly in the presence of noise. Although this algorithm can register points transformed by a more general set of linear transformations than the ori...
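As a rough illustration of the soft-correspondence idea behind EM-style ICP (not the HICP method itself), the sketch below alternates an E-step that computes Gaussian responsibilities between the two point sets and an M-step that fits a general linear transform plus translation by weighted least squares; the homography case is not reproduced.

```python
import numpy as np

def em_icp_linear(src, dst, n_iters=30, sigma2=1.0):
    """Registers 2-D point sets with an EM-flavoured ICP:
    E-step = soft (Gaussian) correspondences, M-step = least-squares fit of a
    linear transform A and translation t, so that dst ≈ src @ A.T + t."""
    A, t = np.eye(2), np.zeros(2)
    for _ in range(n_iters):
        moved = src @ A.T + t                                  # apply current transform
        # E-step: responsibility of each dst point for each (moved) src point
        d2 = ((moved[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        R = np.exp(-d2 / (2.0 * sigma2))
        R /= R.sum(axis=1, keepdims=True) + 1e-12
        targets = R @ dst                                      # per-point virtual matches
        # M-step: linear transform mapping src onto its virtual matches
        X = np.hstack([src, np.ones((len(src), 1))])           # homogeneous coordinates
        sol, *_ = np.linalg.lstsq(X, targets, rcond=None)
        A, t = sol[:2].T, sol[2]
        residual = targets - (src @ A.T + t)
        sigma2 = max((residual ** 2).sum() / (2.0 * len(src)), 1e-6)
    return A, t
```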
Point Distribution Models are useful tools for modelling the variability of particular classes of shapes. A common approach is to apply a Principal Component Analysis to the data, to reduce the dimensionality of the representation. However, a single multivariate Gaussian model of the probability density, estimated from the principal covariances, can be substantially inaccurate. In this paper, w...
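A minimal sketch of the general recipe the snippet points at, using scikit-learn for brevity: reduce aligned shape vectors with PCA, then replace the single Gaussian over the shape parameters with a mixture fitted by EM. The data, component counts, and estimator choices below are placeholders, not the paper's.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

# shapes: (n_shapes, 2 * n_landmarks) matrix of aligned landmark coordinates
shapes = np.random.default_rng(0).normal(size=(200, 40))    # placeholder data

pca = PCA(n_components=8)                    # Point Distribution Model basis
b = pca.fit_transform(shapes)                # shape parameters in the reduced space

# A single multivariate Gaussian is the n_components=1 case; a multi-component
# mixture, fitted by EM, can capture the non-Gaussian variability the snippet mentions.
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(b)

new_b, _ = gmm.sample(5)                     # draw plausible shape parameters
new_shapes = pca.inverse_transform(new_b)    # map back to landmark coordinates
```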
In recent years, Hidden Markov Models (HMM) have been increasingly applied in data mining applications. However, most authors have used the classical Expectation-Maximization (EM) optimization scheme. A new method of HMM learning based on Particle Swarm Optimization (PSO) has been developed. Along with other global approaches such as Simulated Annealing (SIM) and Genetic Algorithms (GA), the following l...
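One way to realise PSO-based HMM learning, sketched under assumptions of my own (discrete emissions, row-wise renormalisation of each particle): the forward-algorithm log-likelihood serves as the fitness, and a standard PSO update moves particles over the flattened (A, B, pi) parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_loglik(obs, A, B, pi):
    """Scaled forward algorithm: log p(obs | A, B, pi) for a discrete-emission HMM."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum()); alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        ll += np.log(alpha.sum()); alpha /= alpha.sum()
    return ll

def unpack(theta, S, V):
    """Map an unconstrained particle to valid (A, B, pi) by row-wise normalisation."""
    theta = np.abs(theta) + 1e-6
    A = theta[:S * S].reshape(S, S)
    A /= A.sum(1, keepdims=True)
    B = theta[S * S:S * S + S * V].reshape(S, V)
    B /= B.sum(1, keepdims=True)
    pi = theta[-S:] / theta[-S:].sum()
    return A, B, pi

def pso_hmm(obs, S=2, V=3, n_particles=30, n_iters=100, w=0.7, c1=1.5, c2=1.5):
    """PSO over flattened HMM parameters; fitness is the forward log-likelihood."""
    dim = S * S + S * V + S
    X = rng.uniform(size=(n_particles, dim)); vel = np.zeros_like(X)
    fit = np.array([forward_loglik(obs, *unpack(p, S, V)) for p in X])
    pbest, pbest_fit = X.copy(), fit.copy()
    gbest = X[fit.argmax()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.uniform(size=X.shape), rng.uniform(size=X.shape)
        vel = w * vel + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
        X = X + vel
        fit = np.array([forward_loglik(obs, *unpack(p, S, V)) for p in X])
        better = fit > pbest_fit
        pbest[better], pbest_fit[better] = X[better], fit[better]
        gbest = pbest[pbest_fit.argmax()].copy()
    return unpack(gbest, S, V)

# usage: A, B, pi = pso_hmm(np.array([0, 2, 1, 1, 0, 2, 2, 1]))
```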
In this paper, we introduce a method for estimating the statistically distinct neural responses in a sequence of functional magnetic resonance images (fMRI). The crux of our method is a technique which we call clustered component analysis (CCA). Clustered component analysis is a method for identifying the distinct component vectors in a multivariate data set. CCA is distinct from principal com...
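The snippet only says that CCA finds distinct component vectors; one simplified reading (my assumption, not the paper's algorithm) is an alternation that assigns each response vector to the component direction best aligned with it and re-estimates each direction as the dominant singular vector of its cluster.

```python
import numpy as np

def cluster_components(Y, K, n_iters=50, seed=0):
    """Assign each row of Y to whichever of K unit component directions explains it
    best (smallest squared residual after projection), then re-estimate each
    direction as the dominant right singular vector of its cluster."""
    rng = np.random.default_rng(seed)
    C = rng.normal(size=(K, Y.shape[1]))
    C /= np.linalg.norm(C, axis=1, keepdims=True)
    labels = np.zeros(len(Y), dtype=int)
    for _ in range(n_iters):
        proj = Y @ C.T                                       # (N, K) projection lengths
        resid = (Y ** 2).sum(1, keepdims=True) - proj ** 2   # squared residual per direction
        labels = resid.argmin(1)
        for k in range(K):
            Yk = Y[labels == k]
            if len(Yk):                                      # dominant 1-D component of the cluster
                C[k] = np.linalg.svd(Yk, full_matrices=False)[2][0]
    return C, labels
```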
Recent advancements in sequencing technology have made it possible to study the mechanisms of gene regulation, such as protein-DNA binding, at greater resolution and on a greater scale than was previously possible. We present an expectation-maximization learning algorithm that identifies enriched spatial relationships between motifs in sets of DNA sequences. For example, the method will identify...
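One plausible EM formulation consistent with "enriched spatial relationships" (an illustrative assumption, not necessarily the paper's model): the spacings between occurrences of two motifs are modelled as a mixture of a preferred-spacing Gaussian component and a uniform background, and EM estimates the preferred distance and the enrichment weight.

```python
import numpy as np

def em_spacing(spacings, max_gap=200, n_iters=100):
    """EM for a two-component mixture over motif-pair spacings: a Gaussian
    'preferred spacing' component against a uniform background on [0, max_gap].
    Returns (enrichment weight, preferred distance, spread)."""
    d = np.asarray(spacings, dtype=float)
    w, mu, sd = 0.5, d.mean(), d.std() + 1e-6
    bg = 1.0 / max_gap                                    # uniform background density
    for _ in range(n_iters):
        # E-step: responsibility of the enriched-spacing component for each pair
        g = np.exp(-0.5 * ((d - mu) / sd) ** 2) / (np.sqrt(2 * np.pi) * sd)
        r = w * g / (w * g + (1 - w) * bg)
        # M-step: weighted re-estimation of the component parameters
        w = r.mean()
        mu = (r * d).sum() / r.sum()
        sd = np.sqrt((r * (d - mu) ** 2).sum() / r.sum()) + 1e-6
    return w, mu, sd
```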
Given a set of points and a set of prototypes representing them, how can we create a graph of the prototypes whose topology accounts for that of the points? This problem has not yet been explored in the framework of statistical learning theory. In this work, we propose a generative model based on the Delaunay graph of the prototypes and the Expectation-Maximization algorithm to learn the parameters....
The W-S (Wake-Sleep) algorithm is a simple learning rule for models with hidden variables. It is shown that this algorithm can be applied to a factor analysis model, which is a linear version of the Helmholtz machine. However, even for a factor analysis model, its convergence has not been proved theoretically in general. In this article, we describe a geometrical understanding of the W-S algorithm in co...
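A compact sketch of what wake-sleep training looks like for a one-factor linear model (a toy stand-in for the factor analysis case; the learning rate, noise levels, and delta-rule updates are my assumptions): the wake phase samples the latent from the recognition weights and improves the generative weights, and the sleep phase samples a dreamed pair from the generative model and improves the recognition weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Data from a one-factor linear model (toy stand-in for the factor analysis case)
D, N = 5, 2000
g_true = rng.normal(size=D)
X = np.outer(rng.normal(size=N), g_true) + 0.3 * rng.normal(size=(N, D))

g = 0.1 * rng.normal(size=D)      # generative weights:   x ≈ g * z
r = 0.1 * rng.normal(size=D)      # recognition weights:  z ≈ r · x
lr, sigma = 0.01, 0.3

for _ in range(20):
    for x in X:
        # Wake phase: recognise z from the data, then improve the generative weights
        z = r @ x + sigma * rng.normal()
        g += lr * (x - g * z) * z                 # delta rule on the reconstruction error
        # Sleep phase: dream (z, x) from the generative model, improve recognition weights
        z_d = rng.normal()
        x_d = g * z_d + sigma * rng.normal(size=D)
        r += lr * (z_d - r @ x_d) * x_d           # delta rule on the latent prediction error
```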
We develop a Recursive L1-Regularized Least Squares (SPARLS) algorithm for the estimation of a sparse tap-weight vector in the adaptive filtering setting. The SPARLS algorithm exploits noisy observations of the tap-weight vector output stream and produces its estimate using an Expectation-Maximization-type algorithm. Simulation studies in the context of channel estimation, employing multipath wi...
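The EM-type iteration commonly used for l1-regularized least squares is a gradient step followed by soft-thresholding. The batch sketch below shows that iteration for a tap-weight vector; the actual SPARLS recursions add low-complexity recursive bookkeeping that is not reproduced here.

```python
import numpy as np

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def em_l1_taps(Phi, y, lam, n_iters=200):
    """EM-type iteration for min_w 0.5*||y - Phi w||^2 + lam*||w||_1:
    a gradient (E-like) step followed by soft-thresholding (M-like step).
    A simplified batch stand-in for the recursive SPARLS update."""
    alpha2 = 1.0 / np.linalg.norm(Phi, 2) ** 2       # step size <= 1 / ||Phi||^2 keeps it stable
    w = np.zeros(Phi.shape[1])
    for _ in range(n_iters):
        r = w + alpha2 * Phi.T @ (y - Phi @ w)       # data-fidelity (E-step-like) update
        w = soft_threshold(r, lam * alpha2)          # sparsity-inducing (M-step-like) shrinkage
    return w
```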
In this article we consider a median variant of the learning vector quantization (LVQ) classifier for classification of dissimilarity data. However, besides the median aspect, we propose to optimize the receiver-operating characteristics (ROC) instead of the classification accuracy. In particular, we present a probabilistic LVQ model with an adaptation scheme based on a generalized Expectation-Ma...