Search results for: bayesian mixing model
Number of results: 2,196,729
We introduce a Bayesian stable isotope mixing model for estimating the relative contributions of different dietary components to the tissues of consumers within food webs. The model is implemented with the probabilistic programming language Stan. It incorporates isotopes of multiple elements (e.g. C, N, H) across two trophic levels, when the structure of the food web is known. In addition, it allows the inclusion of latent trophic levels (i.e. those for which no empiric...
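As a rough illustration of the idea (not the paper's Stan implementation), the sketch below fits dietary source proportions for two tracers with a plain random-walk Metropolis sampler; all source means, trophic discrimination factors (TDFs), and consumer observations are made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Source means per tracer (K sources x J tracers) and additive TDFs (illustrative).
source_mu = np.array([[-26.0, 4.0],    # e.g. terrestrial plants
                      [-20.0, 8.0],    # e.g. aquatic algae
                      [-14.0, 12.0]])  # e.g. marine input
tdf = np.array([[1.0, 3.4]] * 3)       # same TDF applied to every source
sigma = 1.0                            # residual s.d. per tracer (fixed here)

# Consumer tissue observations (N consumers x J tracers), made up.
obs = np.array([[-19.5, 9.8], [-20.3, 10.1], [-18.9, 9.5]])

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def log_post(z):
    """Log posterior of unconstrained proportions z (N(0,1) prior on z)."""
    p = softmax(z)
    mix = p @ (source_mu + tdf)                    # predicted tracer values
    loglik = -0.5 * np.sum(((obs - mix) / sigma) ** 2)
    logprior = -0.5 * np.sum(z ** 2)
    return loglik + logprior

# Random-walk Metropolis over the unconstrained proportions.
z = np.zeros(3)
lp = log_post(z)
draws = []
for it in range(20000):
    z_new = z + 0.3 * rng.normal(size=3)
    lp_new = log_post(z_new)
    if np.log(rng.uniform()) < lp_new - lp:
        z, lp = z_new, lp_new
    if it >= 5000:
        draws.append(softmax(z))

print("posterior mean diet proportions:", np.array(draws).mean(axis=0).round(2))
```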
This paper develops default priors for Bayesian analysis that reproduce familiar frequentist and Bayesian analyses for models that are exponential or location. For the vector-parameter case there is an information adjustment that avoids the Bayesian marginalization paradoxes and properly targets the prior on the parameter of interest, thus adjusting for any complicating nonlinearity. The details ...
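A small numeric illustration of the location-model case (not the paper's construction): with a flat prior on the location parameter mu of a Normal(mu, sigma^2) model with known sigma, the posterior is Normal(xbar, sigma^2/n), so the central 95% credible interval coincides with the classical 95% confidence interval. The data below are simulated purely for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sigma, n = 2.0, 25
x = rng.normal(loc=5.0, scale=sigma, size=n)
xbar, se = x.mean(), sigma / np.sqrt(n)

z = stats.norm.ppf(0.975)
freq_ci = (xbar - z * se, xbar + z * se)                   # frequentist 95% CI
bayes_ci = stats.norm.interval(0.95, loc=xbar, scale=se)   # flat-prior posterior interval

print(freq_ci, bayes_ci)  # numerically identical
```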
A numerical model for simulating the Residence Time Distribution (RTD) of turbulent flows in helical static mixers is proposed and developed to improve the understanding of static mixers. The results of this model are presented for different volumetric flow rates to illustrate the complicated flow patterns that drive the mixing process i...
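A small sketch of how a simulated tracer-response curve C(t) from such a model can be reduced to the residence-time distribution E(t) and its first two moments; the exponential-decay tracer curve below is a stand-in for CFD output, not data from the paper.

```python
import numpy as np

t = np.linspace(0.0, 30.0, 601)                       # time [s]
dt = t[1] - t[0]
c = np.where(t < 2.0, 0.0, np.exp(-(t - 2.0) / 5.0))  # fake tracer response

e = c / (c.sum() * dt)                   # E(t): normalised RTD
t_mean = (t * e).sum() * dt              # mean residence time
var = ((t - t_mean) ** 2 * e).sum() * dt  # spread of the RTD

print(f"mean residence time = {t_mean:.2f} s, variance = {var:.2f} s^2")
```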
This article examines statistical inference for … , where … and … are independent but not identically distributed Pareto of the first kind (Pareto (I)) random variables with the same scale parameter but different shape parameters. The maximum likelihood, uniformly minimum variance unbiased, and Bayes estimators with a gamma prior are used for this purpose. Simulation studies which compare the estimators are ...
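For context, a minimal sketch of the three kinds of point estimator mentioned above, applied to the shape parameter alpha of a single Pareto(I) sample with known scale sigma; the gamma hyperparameters a, b are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma, alpha_true, n = 1.0, 2.5, 50
x = sigma * (1.0 + rng.pareto(alpha_true, size=n))   # Pareto(I) draws

t = np.sum(np.log(x / sigma))        # sufficient statistic, ~ Gamma(n, alpha)

alpha_mle = n / t                    # maximum likelihood estimator
alpha_umvue = (n - 1) / t            # uniformly minimum variance unbiased
a, b = 2.0, 1.0                      # Gamma(a, b) prior (shape, rate)
alpha_bayes = (a + n) / (b + t)      # posterior mean under squared-error loss

print(alpha_mle, alpha_umvue, alpha_bayes)
```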
Background and Objectives: Pulmonary embolism is a potentially fatal and prevalent event that has led to a gradual increase in the number of hospitalizations in recent years. For this reason, it is one of the most challenging diseases for physicians. The main purpose of this paper was to report on a research project comparing different data mining algorithms in order to select the most accurate model for ...
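A generic sketch of the kind of algorithm comparison described above, using scikit-learn with a synthetic binary-outcome dataset as a stand-in for the clinical records; the candidate models and the AUC metric are illustrative, not the ones reported in the paper.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Synthetic, imbalanced binary classification problem (illustrative only).
X, y = make_classification(n_samples=500, n_features=12, weights=[0.8, 0.2],
                           random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "naive Bayes": GaussianNB(),
}

for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean AUC = {auc.mean():.3f}")
```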
In this paper, we propose a novel technique to implement stochastic gradient methods, which are beneficial for learning from large datasets, through accelerated stochastic dynamics. A stochastic gradient method is based on mini-batch learning for reducing the computational cost when the amount of data is large. The stochasticity of the gradient can be mitigated by the injection of Gaussian nois...
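A compact sketch of a mini-batch stochastic gradient update with injected Gaussian noise (a Langevin-style step), illustrating the kind of stochastic dynamics discussed above on a toy ridge-regression objective; the step size and noise scale are illustrative, and this is not the paper's accelerated scheme.

```python
import numpy as np

rng = np.random.default_rng(3)
n, d = 2000, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

w = np.zeros(d)
eta, batch = 1e-3, 64
for step in range(5000):
    idx = rng.integers(0, n, size=batch)                        # mini-batch indices
    grad = X[idx].T @ (X[idx] @ w - y[idx]) * (n / batch) + w   # rescaled grad + L2 term
    w = w - eta * grad + np.sqrt(2 * eta) * rng.normal(size=d)  # injected Gaussian noise

print(np.round(w, 2), np.round(w_true, 2))
```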
The use of a finite dimensional Dirichlet prior in the finite normal mixture model has the effect of acting like a Bayesian method of sieves. Posterior consistency is directly related to the dimension of the sieve and the choice of the Dirichlet parameters in the prior. We find that naive use of the popular uniform Dirichlet prior leads to an inconsistent posterior. However, a simple adjustment...
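The sketch below only illustrates how strongly the choice of Dirichlet parameters shapes the prior on mixture weights in an overfitted finite mixture with K components; the small value 1/K is one common choice and is not meant to reproduce the paper's specific adjustment.

```python
import numpy as np

rng = np.random.default_rng(4)
K = 10

uniform_w = rng.dirichlet(np.ones(K), size=5000)           # Dirichlet(1,...,1)
sparse_w = rng.dirichlet(np.full(K, 1.0 / K), size=5000)   # Dirichlet(1/K,...,1/K)

# Expected weight of the largest component under each prior: small Dirichlet
# parameters push prior mass toward sparse weight vectors.
print("uniform prior, mean max weight:", uniform_w.max(axis=1).mean().round(2))
print("small-parameter prior, mean max weight:", sparse_w.max(axis=1).mean().round(2))
```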
Particle Metropolis-Hastings enables Bayesian parameter inference in general nonlinear state space models (SSMs). However, in many implementations a random walk proposal is used and this can result in poor mixing if not tuned correctly using tedious pilot runs. Therefore, we consider a new proposal inspired by quasi-Newton algorithms that may achieve better mixing with less tuning. An advantage...
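A condensed sketch of particle Metropolis-Hastings with the plain random-walk proposal that the quasi-Newton-style proposal is meant to improve on, for a toy linear-Gaussian state-space model with unknown transition coefficient phi; all settings below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate data from x_t = phi*x_{t-1} + v_t,  y_t = x_t + e_t.
T, phi_true, sv, se = 100, 0.8, 0.5, 0.5
x = np.zeros(T)
x[0] = sv * rng.normal()
for t in range(1, T):
    x[t] = phi_true * x[t - 1] + sv * rng.normal()
y = x + se * rng.normal(size=T)

def pf_loglik(phi, n_part=100):
    """Bootstrap particle filter estimate of log p(y | phi)."""
    parts = np.zeros(n_part)
    ll = 0.0
    for t in range(T):
        parts = phi * parts + sv * rng.normal(size=n_part)     # propagate
        logw = (-0.5 * ((y[t] - parts) / se) ** 2
                - np.log(se * np.sqrt(2 * np.pi)))             # observation weights
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())                             # likelihood increment
        parts = parts[rng.choice(n_part, size=n_part, p=w / w.sum())]  # resample
    return ll

# Random-walk particle MH over phi, uniform prior on (-1, 1).
phi, ll = 0.5, pf_loglik(0.5)
chain = []
for it in range(1000):
    phi_new = phi + 0.05 * rng.normal()
    if -1.0 < phi_new < 1.0:
        ll_new = pf_loglik(phi_new)
        if np.log(rng.uniform()) < ll_new - ll:
            phi, ll = phi_new, ll_new
    chain.append(phi)

print("posterior mean of phi:", np.mean(chain[200:]).round(2))
```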
Mixtures of Bernoulli distributions [6] are a technique frequently used for modeling binary random vectors. They differ from (restricted) Boltzmann Machines in that they do not model the marginal distribution over the binary data space X as a product of (conditional) Bernoulli distributions, but as a weighted sum of Bernoulli distributions. Despite the non-identifiability of th...
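A brief sketch of fitting such a mixture of Bernoulli distributions to binary vectors with EM, matching the "weighted sum of Bernoullis" description above; the synthetic data and the choice of two components are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic binary data drawn from two Bernoulli prototypes.
proto = np.array([[0.9, 0.9, 0.1, 0.1], [0.1, 0.1, 0.9, 0.9]])
z = rng.integers(0, 2, size=500)
X = (rng.uniform(size=(500, 4)) < proto[z]).astype(float)

K, D = 2, X.shape[1]
pi = np.full(K, 1.0 / K)                     # mixing weights
mu = rng.uniform(0.3, 0.7, size=(K, D))      # per-component Bernoulli parameters

for _ in range(100):
    # E-step: responsibilities r[n, k] ∝ pi_k * prod_d mu_kd^x_nd (1 - mu_kd)^(1 - x_nd)
    log_r = np.log(pi) + X @ np.log(mu).T + (1 - X) @ np.log(1 - mu).T
    log_r -= log_r.max(axis=1, keepdims=True)
    r = np.exp(log_r)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights and per-component Bernoulli means.
    nk = r.sum(axis=0)
    pi = nk / X.shape[0]
    mu = np.clip((r.T @ X) / nk[:, None], 1e-6, 1 - 1e-6)

print(np.round(mu, 2))  # recovered prototypes, up to component relabeling
```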