Search results for: irreducible aperiodic markov chain

Number of results: 352282

2014
MOODY T. CHU SHENG-JHIH WU

It is known that the second dominant eigenvalue of a matrix determines the convergence rate of the power method. Though ineffective for general eigenvalue computation, the power method has been of practical use for computing the stationary distribution of a stochastic matrix. For a Markov chain with memory m, the transition “matrix” becomes an order-m tensor. Under suitable assumptions, the s...
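The power method for a stationary distribution, as described above, can be sketched as follows. The transition matrix `P` here is an illustrative assumption, not from the paper; the point is that repeated multiplication drives any initial probability vector toward the stationary one, at a rate set by the second dominant eigenvalue.

```python
import numpy as np

# Illustrative row-stochastic transition matrix (each row sums to 1).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Power method: repeatedly push a probability vector through the chain.
# Convergence rate is governed by the second dominant eigenvalue of P
# (here 0.4, so 100 iterations converge well past machine precision).
pi = np.array([1.0, 0.0])
for _ in range(100):
    pi = pi @ P

# pi now approximates the stationary distribution satisfying pi = pi @ P.
print(pi)  # → approximately [0.8333, 0.1667]
```

For this matrix the exact stationary distribution is (5/6, 1/6), which the iteration recovers.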

2015
James R. Lee

Consider a finite state space Ω and a transition kernel P : Ω × Ω → [0, 1] such that for every x ∈ Ω, ∑y∈Ω P(x, y) = 1. The Markov chain corresponding to the kernel P is the sequence of random variables {X0, X1, X2, . . .} such that for every t > 0, we have Pr[Xt+1 = y | Xt = x] = P(x, y). Note that we also have to specify a distribution for the initial state X0. Corresponding to every such process, one...
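The definition above — a kernel P with rows summing to 1, plus an initial distribution for X0 — translates directly into a simulator. The kernel `P` and initial distribution `mu0` below are made-up illustrative values, not from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

# Transition kernel P over the state space Omega = {0, 1, 2};
# every row sums to 1, as the definition requires.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.0, 0.4, 0.6]])

# The initial distribution for X0 must be specified alongside the kernel.
mu0 = np.array([1.0, 0.0, 0.0])

def sample_chain(P, mu0, T, rng):
    """Draw X0 ~ mu0, then X_{t+1} given X_t = x according to row P[x]."""
    x = rng.choice(len(mu0), p=mu0)
    path = [x]
    for _ in range(T):
        x = rng.choice(P.shape[1], p=P[x])
        path.append(x)
    return path

path = sample_chain(P, mu0, 10, rng)
print(path)
```

Each step uses only the current state, which is exactly the Markov property the kernel formulation encodes.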

Thesis: Ministry of Science, Research and Technology - Shahid Bahonar University of Kerman - Faculty of Mathematics and Computer Science, 1391

In this thesis, we first present the various types of stochastic processes and describe the properties of each. In particular, we study the class of stochastic processes known as Markov chains. We then introduce the concepts of entropy and entropy rate, and show how the Rényi conditional entropy is obtained using previously stated rules. Based on this Rényi conditional entropy, the Rényi entropy rate for processes...

2006
RUITAO LIU

Consider a parametric statistical model P (dx|θ) and an improper prior distribution ν(dθ) that together yield a (proper) formal posterior distribution Q(dθ|x). The prior is called strongly admissible if the generalized Bayes estimator of every bounded function of θ is admissible under squared error loss. Eaton [Ann. Statist. 20 (1992) 1147–1179] has shown that a sufficient condition for strong ...

Journal: :Math. Meth. of OR 2005
Karel Sladký

As an extension of the discrete-time case, this note investigates the variance of the total cumulative reward for the embedded Markov chain of semi-Markov processes. Under the assumption that the chain is aperiodic and contains a single class of recurrent states, recursive formulae for the variance are obtained which show that the variance growth rate is asymptotically linear in time. Expression...

2015
Joost Berkhout Bernd Heidergott

This article presents a new numerical method for approximately computing the ergodic projector of a finite aperiodic Markov chain. Our approach requires neither structural information on the chain, such as the identification of ergodic classes or transient states, nor qualitative information, such as whether the chain is nearly decomposable. The theoretical deduction of the new method is ...

2008
Galin L. Jones Alicia A. Johnson

It is our pleasure to congratulate the authors (hereafter DKSC) on an interesting paper that was a delight to read. While DKSC provide a remarkable collection of connections between different representations of the Markov chains in their paper, we will focus on the “running time analysis” portion. This is a familiar problem to statisticians; given a target population, how can we obtain a repres...

2008

Proof of Lemma A.3. Let Assumptions 3.2 and 3.3 hold. Assume that firms follow a common oblivious strategy µ ∈ M̃, the expected entry rate is λ ∈ Λ̃, and the expected time that each firm spends in the industry is finite. Let {Zx : x ∈ N} be a sequence of independent Poisson random variables with means {s̃µ,λ(x) : x ∈ N}, and let Z be a Poisson random variable with mean ∑x∈N s̃µ,λ(x). Then,...

2015
Jonathan Goodman

Markov chain Monte Carlo, or MCMC, is a way to sample probability distributions that cannot be sampled practically using direct samplers. Most complex probability distributions in more than a few variables are sampled in this way. For us, a stationary Markov chain is a random sequence X1, X2, . . ., where Xk+1 = M(Xk, ξk), M(x, ξ) is a fixed function, and the inputs ξ are i.i.d. random...
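The recursion Xk+1 = M(Xk, ξk) with i.i.d. inputs ξk can be made concrete with a minimal sketch. Here `M` is a random-walk Metropolis step targeting a standard Gaussian — an assumed example, not the method of the source; the bundled pair (z, u) plays the role of the i.i.d. input ξ.

```python
import numpy as np

rng = np.random.default_rng(1)

def M(x, xi):
    """One random-walk Metropolis step for pi(x) ∝ exp(-x^2 / 2).

    xi = (z, u) bundles the i.i.d. randomness: a Gaussian proposal
    increment z and a uniform u used in the accept/reject decision.
    """
    z, u = xi
    x_prop = x + z
    # Accept with probability min(1, pi(x_prop) / pi(x)).
    log_accept = -0.5 * x_prop**2 + 0.5 * x**2
    return x_prop if np.log(u) < log_accept else x

# Run the chain X_{k+1} = M(X_k, xi_k) with i.i.d. inputs xi_k.
x = 0.0
samples = []
for _ in range(10000):
    xi = (rng.normal(scale=1.0), rng.uniform())
    x = M(x, xi)
    samples.append(x)

# For a well-mixing chain, sample averages estimate expectations
# under the target; the mean should be near 0 here.
print(np.mean(samples))
```

The fixed function M plus an i.i.d. noise stream is exactly the abstract form in the excerpt; different MCMC algorithms differ only in their choice of M.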
