Search results for: irreducible aperiodic markov chain
Number of results: 352282
Suppose that for an m-state homogeneous irreducible Markov chain the transition probability matrix Qm×m is known, but the chain requires updating by altering some of its transition probabilities or by adding or deleting some states. Suppose that the updated transition probability matrix Pn×n is also irreducible. The updating problem is to compute the updated stationary distribution π = (π1, π2, . . . , πn)...
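Once the updated matrix Pn×n is in hand, its stationary distribution can be computed by power iteration, since πP = π is the fixed point of repeated left-multiplication. A minimal sketch with a hypothetical 3×3 irreducible, aperiodic matrix standing in for Pn×n:

```python
# Hypothetical updated transition matrix P (n = 3 states); each row sums to 1.
P = [
    [0.1, 0.6, 0.3],
    [0.4, 0.2, 0.4],
    [0.5, 0.3, 0.2],
]

n = len(P)
pi = [1.0 / n] * n          # arbitrary starting distribution
for _ in range(1000):       # pi_{t+1} = pi_t P converges for irreducible aperiodic P
    pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

print([round(p, 4) for p in pi])
```

This brute-force recomputation ignores the structure the abstract exploits (reusing the known distribution of Qm×m), but it fixes the object being sought.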
Consider an irreducible Markov chain which satisfies a ratio limit theorem, and let $\rho$ be the spectral radius of the chain. We investigate the relation of the $\rho\,$-Martin boundary with the boundary induced by the $\rho\,$-harmonic kernel that appears in the limit. Special emphasis is on random walks on non-amenable groups, specifically free groups and hyperbolic groups.
Consider a finite state space Ω and a transition kernel P : Ω × Ω → [0, 1] such that for every x ∈ Ω, ∑y∈Ω P(x, y) = 1. The Markov chain corresponding to the kernel P is the sequence of random variables {X0, X1, X2, . . .} such that for every t ≥ 0, we have ℙ[Xt+1 = y | Xt = x] = P(x, y). Note that we also have to specify a distribution for the initial state X0. Corresponding to every such process, o...
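The definition above translates directly into a sampler: given Xt = x, draw Xt+1 = y with probability P(x, y). A sketch with a toy two-state kernel (the kernel and its values are illustrative, not from the abstract):

```python
import random

# Toy kernel on Omega = {0, 1}; each row sums to 1 as the definition requires.
P = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.4, 1: 0.6}}

def step(x, rng):
    """Sample Xt+1 = y with probability P(x, y)."""
    u, acc = rng.random(), 0.0
    for y, p in P[x].items():
        acc += p
        if u < acc:
            return y
    return y  # guard against floating-point round-off

def run_chain(x0, t_max, rng):
    """Realize X0, X1, ..., Xt_max, with the initial state X0 given explicitly."""
    xs = [x0]
    for _ in range(t_max):
        xs.append(step(xs[-1], rng))
    return xs

print(run_chain(0, 10, random.Random(0)))
```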
We consider a Markov chain in continuous time with one absorbing state and a finite set S of transient states. When S is irreducible the limiting distribution of the chain as t → ∞, conditional on survival up to time t, is known to equal the (unique) quasi-stationary distribution of the chain. We address the problem of generalizing this result to a setting in which S may be reducible, and show ...
Let P be the transition matrix of a finite, irreducible and reversible Markov chain. We say the continuous time Markov chain X has transition matrix P and speed λ if it jumps at rate λ according to the matrix P . Fix λX , λY , λZ ≥ 0, then let X,Y and Z be independent Markov chains with transition matrix P and speeds λX , λY and λZ respectively, all started from the stationary distribution. Wha...
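The "speed λ" construction in this abstract — jump at rate λ, move according to P — can be simulated by drawing Exp(λ) holding times between jumps. A minimal sketch with a hypothetical symmetric (hence reversible) P:

```python
import random

def ct_path(P, lam, x0, horizon, rng):
    """Continuous-time chain with transition matrix P and speed lam:
    hold for an Exp(lam) time, then jump according to row P[x]."""
    t, x, path = 0.0, x0, [(0.0, x0)]
    while True:
        t += rng.expovariate(lam)   # exponential holding time at rate lam
        if t > horizon:
            return path
        u, acc = rng.random(), 0.0
        for y, p in enumerate(P[x]):
            acc += p
            if u < acc:
                x = y
                break
        path.append((t, x))

# Hypothetical P: symmetric, so it is reversible w.r.t. the uniform distribution.
P = [[0.5, 0.5], [0.5, 0.5]]
X = ct_path(P, lam=1.0, x0=0, horizon=5.0, rng=random.Random(0))
print(X[:3])
```

Independent chains X, Y, Z with speeds λX, λY, λZ, as in the abstract, would be three independent calls to `ct_path` with different `lam` values.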
Exact calculations for probabilities on complex pedigrees are computationally intensive and very often infeasible. Markov chain Monte Carlo methods are frequently used to approximate probabilities and likelihoods of interest. However, when a locus with more than two alleles is considered, the underlying Markov chain is not guaranteed to be irreducible and the results of such analyses are unrel...
We consider a process in which information is transmitted from a given root node on a noisy d-ary tree network T . We start with a uniform symbol taken from an alphabet A. Each edge of the tree is an independent copy of some channel (Markov chain) M , where M is irreducible and aperiodic on A. The goal is to reconstruct the symbol at the root from the symbols at the nth level of the tree. This ...
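The broadcast process is easy to simulate: start with a root symbol and pass it down each edge through an independent copy of the channel M. A sketch with hypothetical parameters, taking A = {0, 1} and M a symmetric channel that flips the symbol with probability eps (irreducible and aperiodic for 0 < eps < 1), with majority vote as a simple reconstruction estimator:

```python
import random

d, eps, depth = 2, 0.1, 8   # hypothetical arity, flip probability, tree depth

def leaves_at_level_n(root, rng):
    """Broadcast the root symbol down a d-ary tree of the given depth;
    each edge is an independent binary symmetric channel with flip prob eps."""
    level = [root]
    for _ in range(depth):
        nxt = []
        for s in level:
            for _ in range(d):                      # d children per node
                nxt.append(s ^ (rng.random() < eps))  # flip w.p. eps
        level = nxt
    return level

leaves = leaves_at_level_n(0, random.Random(0))
guess = int(sum(leaves) > len(leaves) / 2)  # majority vote over the nth level
print(guess)
```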
This work focuses on tracking and system identification of systems with regime-switching parameters, which are modeled by a Markov process. It introduces a framework for persistent identification problems that encompass many typical system uncertainties, including parameter switching, stochastic observation disturbances, deterministic unmodeled dynamics, sensor observation bias, and nonlinear m...
A sequence of random variables {Xn}n≥0 is called regenerative if it can be broken up into iid components. The problem addressed in this paper is to determine under what conditions a Markov chain is regenerative. It is shown that an irreducible Markov chain with a countable state space is regenerative for any initial distribution if and only if it is recurrent (null or positive). An extension of...
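The classical regeneration structure behind this result: for a recurrent chain, successive visits to a fixed state split the path into iid excursions. A sketch with a hypothetical positive-recurrent two-state chain, recording the lengths of cycles between returns to state 0:

```python
import random

# Hypothetical positive-recurrent two-state chain; stationary distribution
# gives pi(0) = 2/7, so the mean return time to state 0 is 1/pi(0) = 3.5.
P = [[0.5, 0.5], [0.2, 0.8]]

def excursion_lengths(n_steps, rng):
    """Lengths of the iid cycles between successive visits to state 0."""
    x, last_visit, lengths = 0, 0, []
    for t in range(1, n_steps + 1):
        x = 0 if rng.random() < P[x][0] else 1
        if x == 0:                      # chain regenerates at state 0
            lengths.append(t - last_visit)
            last_visit = t
    return lengths

lengths = excursion_lengths(10_000, random.Random(0))
print(sum(lengths) / len(lengths))   # sample mean, close to 3.5
```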
We consider modulo-additive noise channels, where the noise process is a stationary irreducible and aperiodic Markov chain of finite order. We begin by investigating the capacity-cost function of such additive-noise channels without feedback. We establish a tight upper bound on the capacity-cost function which holds for general (not necessarily Markovian) stationary noise processes. This bound constitutes the...
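The channel model itself is simple to state: the output is Yt = (Xt + Zt) mod q, where the noise Z is a stationary Markov chain on Z_q. A sketch with hypothetical parameters (alphabet size q = 4 and a doubly stochastic first-order noise matrix, whose stationary distribution is therefore uniform):

```python
import random

q = 4  # hypothetical alphabet size

# Hypothetical noise transition matrix on Z_q: irreducible, aperiodic,
# and doubly stochastic, so the uniform distribution is stationary.
N = [[0.7, 0.1, 0.1, 0.1],
     [0.1, 0.7, 0.1, 0.1],
     [0.1, 0.1, 0.7, 0.1],
     [0.1, 0.1, 0.1, 0.7]]

def transmit(xs, rng):
    """Channel law Y_t = (X_t + Z_t) mod q with first-order Markov noise Z."""
    z, ys = rng.randrange(q), []      # start Z from its stationary (uniform) law
    for x in xs:
        ys.append((x + z) % q)
        u, acc = rng.random(), 0.0    # advance the noise chain one step
        for z_next, p in enumerate(N[z]):
            acc += p
            if u < acc:
                z = z_next
                break
    return ys

print(transmit([0, 1, 2, 3, 0], random.Random(0)))
```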