Search results for: mixture cure model
Number of results: 2,197,597
2 Statistical problems: 2.1 Introductory example; 2.2 The independence problem; 2.3 The symmetry problem; 2.4 Competition; 2.5 The role of mixture models ...
Our paper provides a brief review and summary of issues and advances in the use of latent structure and other finite mixture models in the analysis of choice data. Focus is directed to three primary areas: (1) estimation and computational issues, (2) specification and interpretation issues, and (3) future research issues. We comment on what latent structure models have promised, what has been, ...
Various models have been implemented to explain long-term memory (Brady et al., 2013; Lew et al., 2015), with some being derived from studies of visual working memory (Bays et al., 2009; Zhang & Luck, 2008). The implicit assumption is that the processes and mechanisms of working memory also exist in long-term memory. However, the findings on fidelity and contributing factors are highly varied (e....
A Bayesian-based methodology is presented which automatically penalises over-complex models being fitted to unknown data. We show that, with a Gaussian mixture model, the approach is able to select an 'optimal' number of components in the model and so partition data sets.
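The snippet does not state the exact penalisation scheme, so the following is only a rough illustration of the general idea (surplus mixture components being suppressed automatically), sketched with scikit-learn's BayesianGaussianMixture and a Dirichlet process prior; this is an assumption for demonstration, not necessarily the cited paper's method.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Synthetic data: three well-separated Gaussian clusters in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=[0, 0], scale=0.5, size=(300, 2)),
    rng.normal(loc=[5, 0], scale=0.5, size=(300, 2)),
    rng.normal(loc=[0, 5], scale=0.5, size=(300, 2)),
])

# Deliberately over-specify the number of components; the Dirichlet
# process prior drives the weights of surplus components towards zero.
bgm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=1e-2,
    random_state=0,
).fit(X)

# Components retaining non-negligible weight give the effective count.
effective = np.sum(bgm.weights_ > 0.01)
print("effective components:", effective)  # expected: about 3
```

Here the "penalty" is implicit in the prior: components that the data do not support receive vanishing mixture weight, which partitions the data set without a separate model-selection loop.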
Word embeddings are now a standard technique for inducing meaning representations for words. To obtain good representations, it is important to take into account the different senses of a word. In this paper, we propose a mixture model for learning multi-sense word embeddings. Our model generalizes previous work in that it allows different senses of a word to receive different weights. The...
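The abstract gives no equations, so the sketch below is only a hypothetical illustration of the core idea: a word owns several sense vectors, and a learned weight distribution over senses mixes them. All names and shapes here are invented for the example.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Toy setup: one word with K sense vectors and K word-specific sense logits.
dim, num_senses = 4, 3
rng = np.random.default_rng(1)
sense_vectors = rng.normal(size=(num_senses, dim))  # one embedding per sense
sense_logits = rng.normal(size=num_senses)          # would be learned from a corpus

# Mixture weights over senses (a soft sense assignment for this word).
weights = softmax(sense_logits)

# A summary embedding is the weighted mixture of the sense vectors;
# downstream models could instead keep the senses separate and use
# the weights directly.
word_embedding = weights @ sense_vectors
print(weights.round(3), word_embedding.round(3))
```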
[Chart: number of search results per year]