Search results for: markov parameter
Number of results: 281519
In their 2008 and 2009 papers, Sumner and colleagues introduced the “squangles” – a small set of Markov invariants for phylogenetic quartets. The squangles are consistent with the general Markov model (GM) and can be used to infer quartets without the need to explicitly estimate all parameters. As GM is inhomogeneous and hence non-stationary, the squangles are expected to perform well compared ...
We propose a general class of Markov-switching-ARFIMA processes in order to combine the long-memory and Markov-switching strands of the literature. Although the coverage of this class of models is broad, we show that these models can be easily estimated with the proposed DLV algorithm, which combines the Durbin-Levinson and Viterbi procedures. A Monte Carlo experiment reveals that the finite s...
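The snippet names the Durbin-Levinson procedure but does not show it; as an illustrative sketch of that half of DLV only (the Viterbi half and the ARFIMA specifics are omitted), assuming the autocovariances gamma(0..p) are given:

```python
def durbin_levinson(gamma):
    """Durbin-Levinson recursion: autocovariances gamma[0..p] -> AR(p)
    coefficients phi and the innovation variance v of the order-p predictor."""
    p = len(gamma) - 1
    phi = [gamma[1] / gamma[0]]          # order-1 coefficient
    v = gamma[0] * (1 - phi[0] ** 2)     # order-1 innovation variance
    for k in range(2, p + 1):
        # reflection coefficient (partial autocorrelation) at lag k
        num = gamma[k] - sum(phi[j] * gamma[k - 1 - j] for j in range(k - 1))
        kappa = num / v
        # update all lower-order coefficients, append the new one
        phi = [phi[j] - kappa * phi[k - 2 - j] for j in range(k - 1)] + [kappa]
        v *= 1 - kappa ** 2
    return phi, v
```

For an AR(1) process with coefficient 0.5 and unit innovation variance, the autocovariances are 4/3, 2/3, 1/3, and the recursion recovers phi = [0.5, 0.0].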
Bounded-parameter Markov decision processes (BMDPs) can be used to model sequential decision problems in which the transition probabilities are not completely known and are instead given by intervals. One criterion used to solve this kind of problem is the maximin, i.e., the best action under the worst scenario. Algorithms that solve BMDPs with this approach include interval value iteration and an...
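The maximin backup at the core of interval value iteration can be sketched as follows; the greedy rule for nature's worst-case distribution and the data layout are our illustrative assumptions, not the snippet's algorithm:

```python
def worst_case_dist(lo, hi, values):
    """Adversary's minimizing distribution over successor states, given
    interval bounds lo[i] <= p_i <= hi[i] with sum(p) = 1: start every
    probability at its lower bound, then push the remaining mass onto the
    lowest-value successors first."""
    p = list(lo)
    mass = 1.0 - sum(lo)
    for i in sorted(range(len(values)), key=lambda i: values[i]):
        add = min(hi[i] - lo[i], mass)
        p[i] += add
        mass -= add
    return p

def maximin_backup(actions, values, gamma=0.9):
    """One maximin Bellman backup: best action against the worst
    transition model. actions is a list of (reward, lo, hi) triples."""
    return max(
        r + gamma * sum(p * v for p, v in
                        zip(worst_case_dist(lo, hi, values), values))
        for r, lo, hi in actions
    )
```

With successor values [0, 10] and per-state bounds [0.2, 0.8], the adversary shifts all free mass onto the zero-value state, giving the distribution [0.8, 0.2].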
We consider the estimation of a hidden Markovian process using information geometry with respect to transition matrices, in the case where only the histogram of k-memory data is used. First, we focus on a partial observation model with a Markovian process and show that the asymptotic estimation error of this model is given as the inverse of the projective Fisher information of transition...
Particle swarm optimization (PSO) is such a complex stochastic process that analyzing its stochastic behavior is not easy. The choice of parameters plays an important role, since it is critical to the performance of PSO. As far as our investigation is concerned, most of the relevant research is based on computer simulations, and little of it takes a theoretical approach. In ...
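For reference on which parameters are at stake, a minimal PSO sketch; the inertia weight w and acceleration coefficients c1, c2 are the usual parameters whose choice the snippet calls critical, and the 1-D setting is our simplification:

```python
import random

def pso(f, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5,
        lo=-5.0, hi=5.0, seed=0):
    """Minimize a 1-D objective f with a basic particle swarm.
    w damps each particle's velocity; c1 pulls toward its personal best;
    c2 pulls toward the swarm's global best."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n_particles)]
    v = [0.0] * n_particles
    pbest = list(x)
    gbest = min(pbest, key=f)
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            v[i] = (w * v[i]
                    + c1 * r1 * (pbest[i] - x[i])
                    + c2 * r2 * (gbest - x[i]))
            x[i] += v[i]
            if f(x[i]) < f(pbest[i]):
                pbest[i] = x[i]
        gbest = min(pbest, key=f)
    return gbest
```

On a convex quadratic this parameter setting converges; divergent settings (e.g. w > 1) are exactly the kind of behavior the theoretical analyses above try to characterize.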
We describe an algorithm for locally-adaptive parameter estimation of spatially inhomogeneous Markov random fields (MRFs). In particular, we establish that there is a unique solution which maximizes the local pseudo-likelihood in the inhomogeneous MRF model. Subsequently we demonstrate how Besag's iterated conditional modes (ICM) procedure can be generalized from homogeneous MRFs to inhomogeneous...
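A bare-bones sketch of ICM in the simple homogeneous case that the paper generalizes; the Ising-style binary model and the parameter names beta (smoothness) and h (data fidelity) are our illustrative assumptions:

```python
def icm_denoise(obs, beta=1.5, h=1.0, sweeps=5):
    """Iterated conditional modes on a binary (+/-1) Ising MRF: sweep the
    grid, greedily setting each pixel to the mode of its local conditional,
    which is the sign of the data term plus the neighbor smoothness term."""
    H, W = len(obs), len(obs[0])
    x = [row[:] for row in obs]  # initialize at the observed labels
    for _ in range(sweeps):
        for i in range(H):
            for j in range(W):
                nb = sum(x[i + di][j + dj]
                         for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                         if 0 <= i + di < H and 0 <= j + dj < W)
                field = h * obs[i][j] + beta * nb
                x[i][j] = 1 if field >= 0 else -1
    return x
```

The inhomogeneous generalization discussed above would, roughly, replace the global beta with a per-site parameter estimated by local pseudo-likelihood.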
Approximate Bayesian computation (ABC) is a popular technique for approximating likelihoods and is often used in parameter estimation when the likelihood functions are analytically intractable. Although the use of ABC is widespread in many fields, there has been little investigation of the theoretical properties of the resulting estimators. In this paper we give a theoretical analysis of the as...
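The basic ABC rejection scheme behind such estimators can be sketched as follows; the Gaussian simulator, the sample-mean summary statistic, and the tolerance eps are illustrative assumptions, not details from the paper:

```python
import random

def abc_rejection(observed_mean, n_samples=2000, eps=0.05, prior=(-5.0, 5.0),
                  n_data=100, seed=0):
    """ABC rejection: draw theta from the prior, simulate a dataset, and
    keep theta whenever the simulated summary statistic (here the sample
    mean) falls within eps of the observed one. The accepted thetas
    approximate the posterior without ever evaluating the likelihood."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_samples):
        theta = rng.uniform(*prior)
        sim = [rng.gauss(theta, 1.0) for _ in range(n_data)]
        if abs(sum(sim) / n_data - observed_mean) < eps:
            accepted.append(theta)
    return accepted
```

Shrinking eps trades acceptance rate for accuracy, and the asymptotic behavior of the resulting estimator under such choices is exactly the kind of question the theoretical analysis above addresses.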
This paper shows how to formally characterize language learning in a finite parameter space as a Markov structure. Important new language learning results follow directly: explicitly calculated sample-complexity learning times under different input distribution assumptions (including CHILDES database language input) and learning regimes. We also briefly describe a new way to formally model (rap...
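One standard way a Markov formulation yields explicit learning times: model the learner as an absorbing chain over grammar hypotheses and solve (I - Q) t = 1 for the expected number of steps to reach the target, where Q is the transient-to-transient transition matrix. A sketch under that setup (the matrix Q here is hypothetical, not from the paper):

```python
def expected_learning_time(Q):
    """Expected steps to absorption (reaching the target grammar) from each
    transient hypothesis, via Gauss-Jordan elimination on [I - Q | 1]."""
    n = len(Q)
    A = [[(1.0 if i == j else 0.0) - Q[i][j] for j in range(n)] + [1.0]
         for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(n):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[i][n] / A[i][i] for i in range(n)]
```

For a single wrong hypothesis that moves to the target with probability 0.2 per input (so Q = [[0.8]]), the expected learning time is 1/0.2 = 5 inputs.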
This chapter considers stochastic differential equations for Systems Biology models derived from the Chemical Langevin Equation (CLE). After outlining the derivation of such models, Bayesian inference for the parameters is considered, based on state-of-the-art Markov chain Monte Carlo algorithms. Starting with a basic scheme for models observed perfectly, but discretely in time, problems with s...
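The state-of-the-art MCMC schemes the chapter covers are not reproduced in the snippet; as a minimal stand-in, a random-walk Metropolis sketch for a single parameter, assuming a tractable Gaussian observation likelihood (the real CLE-based likelihood generally is not):

```python
import math
import random

def metropolis(data, log_prior, n_iter=5000, step=0.5, seed=0):
    """Random-walk Metropolis for one parameter theta: propose a Gaussian
    perturbation, accept with probability min(1, posterior ratio)."""
    rng = random.Random(seed)

    def log_post(theta):
        # flat-variance Gaussian likelihood is a placeholder assumption
        return log_prior(theta) - 0.5 * sum((y - theta) ** 2 for y in data)

    theta = 0.0
    cur = log_post(theta)
    chain = []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0.0, step)
        lp = log_post(prop)
        if math.log(rng.random()) < lp - cur:
            theta, cur = prop, lp
        chain.append(theta)
    return chain
```

For discretely observed SDE models, the schemes discussed above replace this closed-form likelihood with data augmentation over the latent diffusion path, which is where the real difficulties begin.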
Markov decision processes are an effective tool in modeling decision-making in uncertain dynamic environments. Since the parameters of these models are typically estimated from data or learned from experience, it is not surprising that the actual performance of a chosen strategy often significantly differs from the designer’s initial expectations due to unavoidable modeling ambiguity. In this p...