Search results for: mixture probability model

Number of results: 2,320,040

2006
Jorge Civera, Alfons Juan

Mixture modelling is a standard pattern classification technique. However, in statistical machine translation, the use of mixture modelling is still unexplored. Two main advantages of the mixture approach are, first, its flexibility in finding an appropriate tradeoff between model complexity and the amount of training data available and, second, its capability to learn specific probability distribut...
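For orientation, the finite mixture density that these entries build on can be written (in generic notation, not taken from this paper) as

p(x) = \sum_{k=1}^{K} \pi_k \, p_k(x \mid \theta_k), \qquad \pi_k \ge 0, \quad \sum_{k=1}^{K} \pi_k = 1,

where the number of components K and the component parameters \theta_k give the complexity/data tradeoff mentioned above, and each component density p_k can specialize to a particular sub-population of the data.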

2016
Parthan Kasarapu

The modelling of empirically observed data is commonly done using mixtures of probability distributions. In order to model angular data, directional probability distributions such as the bivariate von Mises (BVM) are typically used. The critical task involved in mixture modelling is to determine the optimal number of component probability distributions. We employ the Bayesian information-theoret...
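As an illustrative sketch of choosing the number of components with an information criterion: the paper uses bivariate von Mises components for angular data, but since no standard Python library ships a BVM mixture, the sketch below substitutes Gaussian components and the BIC, which follows the same select-the-best-K pattern.

# Illustrative only: pick the number of mixture components by an information
# criterion. Gaussian components and BIC stand in for the paper's BVM mixture
# and Bayesian information-theoretic criterion.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0.0, 1.0, size=(200, 2)),   # toy two-cluster data
                  rng.normal(4.0, 0.5, size=(200, 2))])

scores = {k: GaussianMixture(n_components=k, random_state=0).fit(data).bic(data)
          for k in range(1, 7)}
best_k = min(scores, key=scores.get)       # lower BIC = better fit/complexity tradeoff
print(best_k, scores[best_k])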

Journal: JCP, 2013
Taisong Xiong, Jianping Gou, Yunbo Rao

A new Student’s t-distribution finite mixture model is proposed that incorporates the local spatial information of the pixels. The pixels’ label probability proportions are explicitly modelled as probability vectors in the proposed model. We use the gradient descent method to estimate the parameters of the proposed model. Comprehensive experiments are performed for synthetic and natural graysc...
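In generic notation (assumed here, not quoted from the paper), the spatially varying Student’s t mixture this abstract describes has, for pixel i,

f(x_i) = \sum_{k=1}^{K} \pi_{ik} \, \mathrm{St}(x_i \mid \mu_k, \Sigma_k, \nu_k), \qquad \pi_{ik} \ge 0, \quad \sum_{k=1}^{K} \pi_{ik} = 1,

where the per-pixel weight vectors \pi_{i\cdot} are the “label probability proportions” above and carry the local spatial information, and the component parameters (\mu_k, \Sigma_k, \nu_k) together with the \pi_{ik} are estimated by gradient descent.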

2013
Kassandra Fronczyk, Athanasios Kottas

We develop a Bayesian nonparametric mixture modeling framework for replicated count responses in dose-response settings. We explore this methodology for modeling and risk assessment in developmental toxicity studies, where the primary objective is to determine the relationship between the level of exposure to a toxic chemical and the probability of a physiological or biochemical response, or de...

Journal: Journal of Animal Breeding and Genetics = Zeitschrift für Tierzüchtung und Züchtungsbiologie, 2005
Y. Liu, Z. B. Zeng

Marker-assisted genetic evaluation needs to infer genotypes at quantitative trait loci (QTL) based on the information from linked markers. As the inference usually provides the probability distribution of QTL genotypes rather than a specific genotype, marker-assisted genetic evaluation is characterized by a mixture model because of the uncertainty of QTL genotypes. It is, therefore, necessary t...
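A hedged sketch of the mixture structure being referred to: with the QTL genotype g unobserved, the phenotype density given marker information M takes the form

p(y \mid M) = \sum_{g} \Pr(g \mid M) \, f(y \mid g, \theta_g),

so the markers enter only through the genotype probabilities \Pr(g \mid M), and evaluation has to average over this genotype uncertainty rather than condition on a single inferred genotype.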

Journal: Journal of the Optical Society of America A: Optics, Image Science, and Vision, 2014
Eric X. Wang, Svetlana Avramov-Zamurovic, Richard J. Watkins, Charles Nelson, Reza Malek-Madani

A method for probability density function (PDF) estimation using Bayesian mixtures of weighted gamma distributions, called the Dirichlet process gamma mixture model (DP-GaMM), is presented and applied to the analysis of a laser beam in turbulence. The problem is cast in a Bayesian setting, with the mixture model itself treated as a random process. A stick-breaking interpretation of the Dirichlet ...
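The stick-breaking construction mentioned at the end is standard for Dirichlet process mixtures; in one common form (stated here for context, not quoted from the paper), the mixture weights are generated as

v_k \sim \mathrm{Beta}(1, \alpha), \qquad \pi_k = v_k \prod_{j<k} (1 - v_j), \quad k = 1, 2, \ldots,

with each weight \pi_k attached to a gamma component drawn from the base measure, so the effective number of components is inferred from the data rather than fixed in advance.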

Journal: Computational Statistics & Data Analysis, 2012
Stevenn Volant, Marie-Laure Martin-Magniette, Stéphane Robin

We consider a binary unsupervised classification problem where each observation is associated with an unobserved label that we want to retrieve. More precisely, we assume that there are two groups of observations: normal and abnormal. The ‘normal’ observations come from a known distribution, whereas the distribution of the ‘abnormal’ observations is unknown. Several models have been develop...
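In assumed notation, the marginal density in this setting is a two-component contamination mixture,

g(x) = (1 - \pi)\,\phi(x) + \pi\, f(x),

where \phi is the known ‘normal’ density, f is the unknown ‘abnormal’ density, and \pi is the unknown abnormal proportion; the posterior probability \pi f(x) / g(x) then gives the rule for retrieving the hidden label.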

2009
Florentina Bunea, Alexandre B. Tsybakov, Marten H. Wegkamp

This paper studies sparse density estimation via l1 penalization (SPADES). We focus on estimation in high-dimensional mixture models and nonparametric adaptive density estimation. We show, respectively, that SPADES can recover, with high probability, the unknown components of a mixture of probability densities and that it yields minimax adaptive density estimates. These results are based on a g...
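As a rough reconstruction of the criterion behind this approach (a sketch of the general SPADES idea, not the paper's exact statement): with a dictionary f_1, \ldots, f_M and observations X_1, \ldots, X_n, the density is estimated by f_{\hat\lambda} = \sum_j \hat\lambda_j f_j with

\hat{\lambda} = \arg\min_{\lambda} \Big\{ \|f_\lambda\|_2^2 - \frac{2}{n}\sum_{i=1}^{n} f_\lambda(X_i) + 2\sum_{j=1}^{M} \omega_j |\lambda_j| \Big\},

an empirical surrogate for the L2 risk plus a weighted l1 penalty; the penalty zeroes out most coefficients, which is what allows the mixture components to be recovered with high probability.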

2006
Luis Buera, Eduardo Lleida, Juan Arturo Nolazco-Flores, Antonio Miguel, Alfonso Ortega

In a previous work, Multi-Environment Model based LInear Normalization (MEMLIN) was presented and shown to be effective in compensating for environment mismatch. MEMLIN is an empirical feature-vector normalization that models the clean and noisy spaces with Gaussian Mixture Models (GMMs). In this algorithm, the probability of the clean model Gaussian, given the noisy model one and the noisy featur...
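For context, the component posterior the abstract starts to describe builds on the standard GMM responsibility; in generic notation (not the paper's),

p(k \mid x) = \frac{\pi_k \, \mathcal{N}(x \mid \mu_k, \Sigma_k)}{\sum_{j=1}^{K} \pi_j \, \mathcal{N}(x \mid \mu_j, \Sigma_j)},

and MEMLIN additionally learns cross-probabilities of clean-model components given noisy-model components during training, which weight the compensation applied to each noisy feature vector.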

2005
Konstantin Markov, Satoshi Nakamura

Most current state-of-the-art speech recognition systems are based on HMMs, which usually use mixtures of Gaussian functions as the state probability distribution model. It is common practice to use the EM algorithm for Gaussian mixture parameter learning. In this case, learning is done in a “blind”, data-driven way, without taking into account how the speech signal has been produced and whic...
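A minimal, self-contained sketch of the standard data-driven EM estimation of a Gaussian mixture that this abstract refers to (illustrative only; the feature dimensions and counts below are made up):

# EM fitting of a Gaussian mixture, as commonly used for HMM state-output
# distributions. Random vectors stand in for real acoustic feature frames.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
frames = rng.normal(size=(5000, 39))          # stand-in 39-dim feature vectors

gmm = GaussianMixture(n_components=16,        # Gaussians in the state model
                      covariance_type="diag",
                      max_iter=100,
                      random_state=1).fit(frames)   # EM runs inside fit()

log_lik = gmm.score_samples(frames)           # per-frame log-likelihoods
print(log_lik.mean())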
