Search results for: continuous time markov chain

Number of results: 2344467

Journal: :Automatica 2011
Gang George Yin, Yu Sun, Le Yi Wang

This paper is concerned with asymptotic properties of consensus-type algorithms for networked systems whose topologies switch randomly. The regime-switching process is modeled as a discrete-time Markov chain with a finite state space. The consensus control is achieved by using stochastic approximation methods. In the setup, the regime-switching process (the Markov chain) contains a rate paramet...
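
A minimal sketch of the kind of scheme this snippet describes, under illustrative assumptions: agents run a stochastic-approximation consensus update over a graph whose Laplacian is selected by a discrete-time Markov chain. The Laplacians, switching matrix, noise level, and step-size rule below are invented for the example, not taken from the paper.

```python
import numpy as np

def consensus_with_markov_switching(x0, laplacians, P, steps=2000, seed=0):
    """Stochastic-approximation consensus under a randomly switching topology.

    x0         : initial agent states, shape (n,)
    laplacians : list of graph Laplacians, one per topology (regime)
    P          : transition matrix of the regime-switching Markov chain
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    regime = 0
    for k in range(1, steps + 1):
        mu_k = 1.0 / (10 + k)                          # decreasing step size
        noise = 0.01 * rng.standard_normal(x.shape)    # noisy neighbor data
        x = x - mu_k * (laplacians[regime] @ (x + noise))
        # the regime-switching Markov chain picks the next topology
        regime = rng.choice(len(laplacians), p=P[regime])
    return x

# Example: three agents, two topologies switching at random.
L_path = np.array([[1., -1., 0.], [-1., 2., -1.], [0., -1., 1.]])
L_star = np.array([[2., -1., -1.], [-1., 1., 0.], [-1., 0., 1.]])
P = np.array([[0.9, 0.1], [0.2, 0.8]])
print(consensus_with_markov_switching([0., 5., 10.], [L_path, L_star], P))
```

With a decreasing step size the agents' states contract toward a common value even though the topology keeps switching, which is the kind of asymptotic behavior the paper analyzes.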

2011
Ionuţ Florescu, Forrest Levin

In this work we present a methodology for estimating the variability of a signal modeled as a continuous time stochastic process observable only at discrete times, with variability modeled by a continuous time Markov chain. The methodology estimates the parameters of the hidden Markov chain. The methodology is new; however, the major contribution of this work comes in the realm of applications. Th...
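
Since the model here is a continuous-time Markov chain observed only at discrete times, a short simulation may help fix ideas: a standard exponential-holding-time construction generates a path, which is then sampled on an observation grid. The two-state generator and the grid are made-up values, not the authors' model or estimator.

```python
import numpy as np

def simulate_ctmc(Q, x0, t_end, rng):
    """Simulate one path of a CTMC with generator Q, starting in state x0."""
    times, states = [0.0], [x0]
    t, state = 0.0, x0
    while True:
        rate = -Q[state, state]                # total exit rate from state
        if rate <= 0:                          # absorbing state
            break
        t += rng.exponential(1.0 / rate)       # exponential holding time
        if t > t_end:
            break
        probs = Q[state].copy()
        probs[state] = 0.0
        probs /= rate                          # embedded-chain jump probabilities
        state = int(rng.choice(len(probs), p=probs))
        times.append(t)
        states.append(state)
    return np.array(times), np.array(states)

def observe_at(times, states, obs_times):
    """Value of the right-continuous path at each discrete observation time."""
    idx = np.searchsorted(times, obs_times, side="right") - 1
    return states[idx]

rng = np.random.default_rng(1)
Q = np.array([[-1.0, 1.0], [0.5, -0.5]])       # illustrative 2-state generator
times, states = simulate_ctmc(Q, 0, 10.0, rng)
print(observe_at(times, states, np.arange(0.0, 10.0, 1.0)))
```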

2005
Mequanint A. Moges, Thomas G. Robertazzi

In this paper the equivalence between various divisible load-scheduling policies and continuous time Markov chains is demonstrated. The problem is to show that optimal divisible load schedules for various network topologies have Markov chain analogs. This paper is a continuation of our initial short paper [1] that introduced this unification between divisible load theory and Markov chain models for ...

2012
STEVEN P. LALLEY

Discrete-time Markov chains are useful in simulation, since updating algorithms are easier to construct in discrete steps. They can also be useful as crude models of physical, biological, and social processes. However, in the physical and biological worlds time runs continuously, and so discrete-time mathematical models are not always appropriate. This is especially true in population biology –...

2009
Tomasz R. Bielecki, Stéphane Crépey, Alexander Herbertsson

Contents excerpt: 2 Continuous-Time Markov Chains; 2.1 Time-homogeneous chains; 2.2 Time-inhomogeneous chains; 2.3 Embedded Discrete-Time Markov Chain; 2.4 Conditional Expectations ...
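
For reference, the objects in this contents excerpt have a compact standard description (the notation below is the conventional one, not necessarily the authors'): a time-homogeneous chain with generator Q = (q_{ij}) has transition matrices

P(t) = e^{tQ} = \sum_{k \ge 0} \frac{(tQ)^k}{k!}, \qquad q_{ij} \ge 0 \ (i \ne j), \quad \textstyle\sum_j q_{ij} = 0,

and its embedded discrete-time Markov chain leaves state i after an exponential holding time with rate -q_{ii}, jumping to j \ne i with probability \hat{p}_{ij} = q_{ij} / (-q_{ii}).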

Journal: :CoRR 2006
Tal El-Hay, Nir Friedman, Daphne Koller, Raz Kupferman

A central task in many applications is reasoning about processes that change over continuous time. Recently, Nodelman et al. introduced continuous time Bayesian networks (CTBNs), a structured representation of Continuous Time Markov Processes over a structured state space. In this paper, we introduce continuous time Markov networks (CTMNs), an alternative representation language t...

2000
Ed Brinksma, Holger Hermanns

This paper surveys and relates the basic concepts of process algebra and the modelling of continuous time Markov chains. It provides basic introductions to both fields, where we also study the Markov chains from an algebraic perspective, viz. that of Markov chain algebra. We then proceed to study the interrelation of reactive processes and Markov chains in this setting, and introduce the algebr...

2010
OGUZHAN ALAGOZ, James J. Cochran

Continuous-time Markov decision processes (CTMDP) may be viewed as a special case of semi-Markov decision processes (SMDP) where the intertransition times are exponentially distributed and the decision maker is allowed to choose actions whenever the system state changes. When the transition rates are identical for each state and action pair, one can convert a CTMDP into an equival...
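
The conversion alluded to here is usually obtained by uniformization, which makes the exit rates identical across state-action pairs by adding fictitious self-transitions. A standard sketch, in common notation rather than this chapter's: with exit rates \beta(s,a), transition probabilities p(j \mid s,a), and discount rate \alpha, pick a constant c \ge \beta(s,a) for all (s,a) and set

\tilde{p}(j \mid s,a) = \frac{\beta(s,a)}{c}\, p(j \mid s,a) \quad (j \ne s), \qquad \tilde{p}(s \mid s,a) = 1 - \frac{\beta(s,a)}{c}\bigl(1 - p(s \mid s,a)\bigr).

The resulting discrete-time MDP with per-stage discount factor c/(\alpha + c) has the same optimal policies as the original CTMDP.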

2005
MATTHEW SPENCER, EDWARD SUSKO

Discrete-time Markov chains are widely used to study communities of competing sessile species. Their parameters are transition probabilities between states (species found at points in space), estimated from repeated observations. The proportion of nonzero entries in the transition matrix has been suggested as a measure of the complexity of interspecific interactions. This is not accurate if mor...
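
As a concrete illustration of the estimation step mentioned here, the usual maximum-likelihood estimate of a discrete-time transition matrix is the row-normalized matrix of observed transition counts. The state labels and survey sequences below are invented; the last line computes the proportion of nonzero entries that the snippet discusses as a complexity measure.

```python
import numpy as np

def estimate_transition_matrix(sequences, n_states):
    """Row-normalized transition counts: the MLE for a discrete-time chain.

    sequences : iterable of state sequences (ints in 0..n_states-1), e.g.
                repeated observations of the species found at fixed points.
    """
    counts = np.zeros((n_states, n_states))
    for seq in sequences:
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1                  # observed transition a -> b
    row_sums = counts.sum(axis=1, keepdims=True)
    # leave rows of never-visited source states as zeros
    P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
    return P, counts

# Three hypothetical point histories over four repeated surveys.
sequences = [[0, 0, 1, 1], [0, 1, 2, 2], [2, 2, 0, 0]]
P, counts = estimate_transition_matrix(sequences, n_states=3)
print(np.round(P, 2))
print("proportion of nonzero entries:", np.count_nonzero(counts) / counts.size)
```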
