Search results for: maximum a posteriori estimation
Number of results: 13536160
For Bayesian inference on the mixture of factor analyzers, natural conjugate priors on the parameters are introduced, and then a Gibbs sampler that generates parameter samples following the posterior is constructed. In addition, a deterministic estimation algorithm is derived by taking modes instead of samples from the conditional posteriors used in the Gibbs sampler. This is regarded as a maxi...
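The distinction this abstract draws, sampling from a conditional posterior versus taking its mode, can be illustrated on a toy conjugate model. The sketch below uses a Normal likelihood with known variance and a Normal prior on the mean; it only illustrates the general idea, not the paper's mixture-of-factor-analyzers sampler, and its names and values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def conditional_posterior(y, sigma2, mu0, tau2):
    """Conjugate Normal-Normal update: posterior of the mean given data y.

    Likelihood: y_i ~ N(mu, sigma2) with sigma2 known.
    Prior:      mu  ~ N(mu0, tau2).
    Returns the posterior mean and variance of mu.
    """
    n = len(y)
    post_var = 1.0 / (n / sigma2 + 1.0 / tau2)
    post_mean = post_var * (y.sum() / sigma2 + mu0 / tau2)
    return post_mean, post_var

y = rng.normal(loc=2.0, scale=1.0, size=50)
m, v = conditional_posterior(y, sigma2=1.0, mu0=0.0, tau2=10.0)

# Gibbs-style move: draw a sample from the conditional posterior.
mu_sample = rng.normal(m, np.sqrt(v))

# Deterministic (MAP-style) move: take the mode of the same conditional,
# which for a Gaussian coincides with its mean.
mu_mode = m

print(f"sampled mu = {mu_sample:.3f}, conditional mode = {mu_mode:.3f}")
```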
In this paper, we examine the Ambrosio-Tortorelli (AT) functional [1] for image segmentation from an estimation-theoretic point of view. Instead of considering a single point estimate, i.e. the maximum-a-posteriori (MAP) estimate, we adopt a wider estimation-theoretic viewpoint, meaning that we consider images to be random variables and investigate their distribution. We derive an effective blo...
This primer presents parameter estimation methods common in Bayesian statistics and applies them to discrete probability distributions, which commonly occur in text modeling. The presentation starts with the maximum likelihood and maximum a posteriori estimation approaches and then covers the full Bayesian approach. The presentation is completed by an overview of Bayesian networks, a graphical language to express probabili...
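For the discrete distributions mentioned here, the contrast between maximum likelihood and maximum a posteriori estimation is easy to make concrete. The sketch below compares the two estimators for a categorical distribution under a symmetric Dirichlet prior, a standard text-modeling setup; the function names and toy counts are illustrative and not taken from the primer.

```python
import numpy as np

def ml_estimate(counts):
    """Maximum likelihood estimate of categorical parameters: n_k / N."""
    counts = np.asarray(counts, dtype=float)
    return counts / counts.sum()

def map_estimate(counts, alpha=2.0):
    """MAP estimate under a symmetric Dirichlet(alpha) prior:
    (n_k + alpha - 1) / (N + K * (alpha - 1)), valid for alpha >= 1.
    """
    counts = np.asarray(counts, dtype=float)
    K = counts.size
    return (counts + alpha - 1.0) / (counts.sum() + K * (alpha - 1.0))

word_counts = [7, 2, 0, 1]                # toy term counts for a 4-word vocabulary
print("ML :", ml_estimate(word_counts))   # assigns probability 0 to unseen words
print("MAP:", map_estimate(word_counts))  # the prior smooths zero counts
```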
This paper presents a multiuser maximum a posteriori (MAP) decoder for synchronous code division multiple access (CDMA) signals on Rayleigh flat-fading channels. The receiver does not assume perfect channel information. The key idea is to expand the decoding trellis for the purpose of joint channel estimation and decoding. The time-varying channel effect is equalized by coupling mi...
Joint parameter and state estimation is proposed for a linear state-space model with uniform, entry-wise correlated state and output noises (LSU model for short). The adopted Bayesian modelling and approximate estimation produce an estimator that (a) provides the maximum a posteriori estimate enriched by information on its precision, (b) respects correlated noise entries without demanding the us...
For machine learning of an input-output function f from examples, we show that it is possible to define an a priori probability density function on the hypothesis space to represent knowledge of the probability distribution of f, even when the hypothesis space H is large (i.e., nonparametric). This allows extension of maximum a posteriori (MAP) estimation methods to nonparametric function estimation. ...
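A compact way to see what a prior density over the hypothesis space buys is the parametric special case in which MAP estimation of linear weights under a Gaussian prior reduces to ridge regression. The sketch below shows only that reduction; it is not the paper's nonparametric construction, and all variable names and values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data: y = X w_true + Gaussian noise.
n, d = 40, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.5, -2.0, 0.5])
sigma2 = 0.25                      # noise variance (assumed known)
y = X @ w_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# Gaussian prior on the weights: w ~ N(0, tau2 * I).
tau2 = 1.0

# MAP estimate = argmax log p(y | w) + log p(w)
#              = argmin ||y - X w||^2 / (2 sigma2) + ||w||^2 / (2 tau2),
# which has the closed form of ridge regression with lambda = sigma2 / tau2.
lam = sigma2 / tau2
w_map = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w_ml = np.linalg.lstsq(X, y, rcond=None)[0]   # prior-free ML estimate for comparison
print("ML :", np.round(w_ml, 3))
print("MAP:", np.round(w_map, 3))
```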
An expectation-maximization (EM) algorithm for independent component analysis in the presence of Gaussian noise is presented. The estimation of the conditional moments of the source posterior can be accomplished by maximum a posteriori estimation. The approximate conditional moments enable the development of an EM algorithm for inferring the most probable sources and learning the param...
An expectation-maximization (EM) algorithm for learning sparse and overcomplete representations is presented in this paper. We show that the estimation of the conditional moments of the posterior distribution can be accomplished by maximum a posteriori estimation. The approximate conditional moments enable the development of an EM algorithm for learning the overcomplete basis vectors and inferr...
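A common way to compute the posterior mode that such an EM scheme relies on is to assume a sparsity-inducing (e.g. Laplacian) prior on the coefficients, which turns MAP inference into an L1-penalized least-squares problem. The sketch below solves that problem with plain iterative soft thresholding; it is a generic stand-in, not the algorithm of the paper, and `map_sparse_code`, the basis size, and the penalty weight are illustrative choices.

```python
import numpy as np

def map_sparse_code(x, A, lam=0.1, n_iter=200):
    """MAP estimate of sources s for x ~ N(A s, I) with a Laplacian prior on s.

    The posterior mode minimizes 0.5 * ||x - A s||^2 + lam * ||s||_1,
    solved here with ISTA (iterative soft thresholding).
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    s = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ s - x)
        z = s - grad / L
        s = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return s

rng = np.random.default_rng(2)
A = rng.normal(size=(8, 16))               # overcomplete basis: 16 atoms in 8 dims
s_true = np.zeros(16)
s_true[[3, 11]] = [1.0, -2.0]              # sparse ground-truth sources
x = A @ s_true + 0.01 * rng.normal(size=8)

s_map = map_sparse_code(x, A)
print("largest recovered coefficients:", np.argsort(-np.abs(s_map))[:2])
```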
We present an algorithm for color classification with explicit illuminant estimation and compensation. A Gaussian classifier is trained with color samples from just one training image. Then, using a simple diagonal illumination model, the illuminants in a new scene that contains some of the same surface classes are estimated in a Maximum Likelihood framework using the Expectation Maximization a...
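The diagonal illumination model referred to here amounts to a per-channel gain that can be divided out before classification. The sketch below applies such a compensation ahead of a single-Gaussian class score; the EM-based estimation of the gains is omitted, and the class statistics, pixel values, and gains are hypothetical numbers, not the paper's.

```python
import numpy as np

def compensate(rgb, gains):
    """Diagonal (von Kries) illumination model: undo per-channel gains."""
    return rgb / np.asarray(gains)

def gaussian_log_likelihood(x, mean, cov):
    """Log-density of x under a single Gaussian color class."""
    d = x - mean
    cov_inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ cov_inv @ d + logdet + len(x) * np.log(2 * np.pi))

# Class model trained on one reference image (toy numbers).
skin_mean = np.array([0.55, 0.40, 0.35])
skin_cov = 0.01 * np.eye(3)

# A pixel observed under a bluish illuminant, and the illuminant gains
# that an EM-based estimator would supply (hypothetical values here).
pixel = np.array([0.44, 0.38, 0.42])
estimated_gains = np.array([0.8, 0.95, 1.2])

score_raw = gaussian_log_likelihood(pixel, skin_mean, skin_cov)
score_comp = gaussian_log_likelihood(compensate(pixel, estimated_gains),
                                     skin_mean, skin_cov)
print(f"log-likelihood raw: {score_raw:.1f}, compensated: {score_comp:.1f}")
```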