Search results for: linear mixture model

Number of results: 2516495

2016

Mixtures of Zellner’s g-priors have been studied extensively in linear models and have been shown to have numerous desirable properties for Bayesian variable selection and model averaging. Several extensions of g-priors to Generalized Linear Models (GLMs) have been proposed in the literature; however, the choice of prior distribution of g and resulting properties for inference have received con...
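The conjugate Gaussian case behind these extensions is easy to state: under Zellner's g-prior in an ordinary linear model, the posterior mean of the coefficients is the OLS estimate shrunk by the factor g/(1+g). A minimal numpy sketch on simulated data (the GLM extensions the abstract discusses are not covered here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated linear model y = X @ beta + noise
n, p = 100, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Ordinary least-squares estimate
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Under Zellner's g-prior beta ~ N(0, g * sigma^2 * (X'X)^{-1}),
# the posterior mean is a simple shrinkage of the OLS estimate:
g = 100.0
beta_post = (g / (1.0 + g)) * beta_ols
```

The single hyperparameter g controls the shrinkage toward zero; mixtures of g-priors place a distribution on g instead of fixing it.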

Journal: Transport Phenomena in Nano and Micro Scales 2014
H. Safikhani A. Abbassi S. Ghanami

In the present study, computational fluid dynamics (CFD) techniques and artificial neural networks (ANN) are used to predict the pressure drop value (Δp) of Al2O3-water nanofluid in flat tubes. Δp is predicted taking into account five input variables: tube flattening (h), inlet volumetric flow rate (qi), wall heat flux (qnw), nanoparticle volume fraction (φ) and nanoparticle diameter (dp ...

2017
Loren E. Smith Derek K. Smith Jeffrey D. Blume Edward D. Siew Frederic T. Billings

BACKGROUND Acute kidney injury (AKI) is diagnosed based on postoperative serum creatinine change, but AKI models have not consistently performed well, in part due to the omission of clinically important but practically unmeasurable variables that affect creatinine. We hypothesized that a latent variable mixture model of postoperative serum creatinine change would partially account for these unm...

2009
Ryohei Fujimaki Satoshi Morinaga Michinari Momma Kenji Aoki Takayuki Nakata

Our main contribution is to propose a novel model selection methodology, expectation minimization of information criterion (EMIC). EMIC makes a significant impact on the combinatorial scalability issue pertaining to the model selection for mixture models having various types of components. A goal of such problems is to optimize types of components as well as the number of components. One key idea in EM...

Journal: اندیشه آماری 2014
Mahsa Abedini Iraj Kazemi

In previous studies on fitting non-linear regression models with the symmetric structure the normality is usually assumed in the analysis of data. This choice may be inappropriate when the distribution of residual terms is asymmetric. Recently, the family of scale-mixture of skew-normal distributions is the main concern of many researchers. This family includes several skewed and heavy-tailed d...
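As a concrete illustration of the scale-mixture construction mentioned here, a skew-t draw can be generated as a skew-normal variate divided by the square root of an independent Gamma mixing variable. The sketch below uses the standard stochastic representation of the skew-normal and is an illustration of the distribution family, not code from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_skew_t(lam, nu, size, rng):
    """Draw from a skew-t via the scale mixture of skew-normals.

    Skew-normal part: Z = delta*|U0| + sqrt(1 - delta^2)*U1
    with delta = lam / sqrt(1 + lam^2).
    Mixing: divide by sqrt(W), where W ~ Gamma(nu/2, rate nu/2).
    """
    delta = lam / np.sqrt(1.0 + lam**2)
    u0 = rng.normal(size=size)
    u1 = rng.normal(size=size)
    z = delta * np.abs(u0) + np.sqrt(1.0 - delta**2) * u1
    w = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=size)
    return z / np.sqrt(w)

x = sample_skew_t(lam=3.0, nu=5.0, size=100_000, rng=rng)
# With lam > 0 the draws are right-skewed, so most mass lies above zero.
```

Varying the mixing distribution of W yields the other skewed, heavy-tailed members of the family.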

Journal: Pattern Recognition 2013
Guoqing Liu Jianxin Wu Suiping Zhou

Most of the existing probabilistic classifiers are based on sparsity-inducing modeling. However, we show that sparsity is not always desirable in practice, and only an appropriate degree of sparsity is profitable. In this work, we propose a flexible probabilistic model using a generalized Gaussian scale mixture (GGSM) prior that can provide an appropriate degree of sparsity for its model parame...

Journal: Statistics and Computing 2011
Jun Ma Sigurbjorg Gudlaugsdottir Graham Wood

We consider independent sampling from a two-component mixture distribution, where one component (called the parametric component) is from a known distributional family and the other component (called the non-parametric component) is unknown. This is a semi-parametric mixture distribution. We discretize the non-parametric component and estimate the parameters of this mixture model, namely the mi...
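For intuition, the EM updates for estimating the mixing weight and component parameters of a two-component mixture can be sketched in a few lines. Here both components are taken to be unit-variance Gaussians for simplicity, whereas the paper's second component is non-parametric and discretized:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated two-component data: 30% from N(0, 1), 70% from N(4, 1).
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(4.0, 1.0, 700)])

# EM for the mixing weight pi and the two component means
# (variances fixed at 1 for simplicity).
pi, mu0, mu1 = 0.5, -1.0, 5.0
for _ in range(200):
    # E-step: posterior probability that each point came from component 0
    d0 = pi * np.exp(-0.5 * (x - mu0) ** 2)
    d1 = (1.0 - pi) * np.exp(-0.5 * (x - mu1) ** 2)
    r = d0 / (d0 + d1)
    # M-step: re-estimate mixing weight and component means
    pi = r.mean()
    mu0 = (r * x).sum() / r.sum()
    mu1 = ((1.0 - r) * x).sum() / (1.0 - r).sum()
```

After convergence the estimates recover the simulated values (pi near 0.3, means near 0 and 4), which is the quantity of interest in the semi-parametric setting as well.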

Journal: CoRR 2017
Cinzia Viroli Geoffrey J. McLachlan

Deep learning is a hierarchical inference method formed by subsequent multiple layers of learning able to more efficiently describe complex relationships. In this work, Deep Gaussian Mixture Models are introduced and discussed. A Deep Gaussian Mixture model (DGMM) is a network of multiple layers of latent variables, where, at each layer, the variables follow a mixture of Gaussian distributions....
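A toy generative sketch of the layered construction (illustrative parameters, not the fitted model from the paper): each observation passes through a mixture of linear-Gaussian transformations at every layer, so the marginal distribution of the observed variable is itself a finite Gaussian mixture:

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_dgmm(n, rng):
    """Generate draws from a toy two-layer deep Gaussian mixture.

    Top layer: latent z from a mixture of two 1-D Gaussians.
    Bottom layer: x = a_j * z + b_j + noise, with component j
    drawn independently per observation.
    """
    # Layer 2: latent z from a two-component Gaussian mixture
    comp2 = rng.random(n) < 0.4
    z = np.where(comp2, rng.normal(-2.0, 0.5, n), rng.normal(2.0, 0.5, n))
    # Layer 1: mixture of linear-Gaussian transformations of z
    comp1 = rng.random(n) < 0.5
    a = np.where(comp1, 1.5, -0.5)
    b = np.where(comp1, 0.0, 3.0)
    return a * z + b + rng.normal(0.0, 0.3, n)

x = sample_dgmm(10_000, rng)
```

With two components per layer, the marginal of x is a mixture of 2 × 2 = 4 Gaussians; deeper stacks multiply the effective number of components while keeping few parameters per layer.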
