Search results for: entropy rate

Number of results: 1019686

2005
Charalambos D. Charalambous Alireza Farhadi

In this paper, the notions of robust entropy and, subsequently, robust entropy rate are introduced for a family of discrete-time uncertain sources. When the uncertainty is described by a relative entropy constraint between the set of uncertain source densities and a given nominal source density, the solution to this robust notion of information is presented and its connection with other notions of ...
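
As a point of reference for the relative entropy constraint mentioned above, here is a minimal Python sketch that computes the Shannon entropy of a nominal discrete source and the Kullback-Leibler divergence of a perturbed density from it; the uncertainty set in the paper is the set of densities whose divergence from the nominal stays below a bound. The bound c and both densities below are invented for illustration, not taken from the paper.

import numpy as np

def shannon_entropy(p):
    # H(p) = -sum p log p, in nats; zero-probability terms contribute 0.
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def kl_divergence(q, p):
    # D(q || p) = sum q log(q / p); assumes q is absolutely continuous w.r.t. p.
    q, p = np.asarray(q, float), np.asarray(p, float)
    nz = q > 0
    return np.sum(q[nz] * np.log(q[nz] / p[nz]))

p_nominal = np.array([0.5, 0.25, 0.25])    # illustrative nominal density
q_uncertain = np.array([0.4, 0.35, 0.25])  # illustrative candidate density
c = 0.05                                   # illustrative divergence bound

print("H(nominal)      =", shannon_entropy(p_nominal))
print("D(q || nominal) =", kl_divergence(q_uncertain, p_nominal))
print("q in uncertainty set:", kl_divergence(q_uncertain, p_nominal) <= c)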

2008
Stefan M. Moser

Stationarity is investigated in the context of a general multiple-input multiple-output (MIMO) fading channel with memory. It is shown that stationary processes behave as nicely as one would expect. Concretely, this report concentrates on three topics with regard to stationarity: firstly, it is proven that, under weak conditions on the channel, any stationary channel model will have a capacity-achievi...

2007
Or Zuk

The relative entropy rate is a natural and useful measure of distance between two stochastic processes. In this paper we study the relative entropy rate between two Hidden Markov Processes (HMPs), which is of both theoretical and practical importance. We give new results showing analyticity, representation using Lyapunov exponents, and Taylor expansion for the relative entropy rate of two discr...
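
The relative entropy rate D(P||Q) = lim (1/n) E_P[log p(X_1..X_n) / q(X_1..X_n)] between two hidden Markov processes has no closed form in general, which is why results like those above matter. A common fallback is Monte Carlo: sample a long path from P and evaluate both log-likelihoods with the forward algorithm. The sketch below does this for two small binary-output HMMs; every parameter value is invented for illustration.

import numpy as np

rng = np.random.default_rng(0)

def sample_hmm(A, B, pi, n):
    # Sample a length-n observation sequence from an HMM (A: transitions,
    # B: emission probabilities, pi: initial state distribution).
    states, obs = np.zeros(n, int), np.zeros(n, int)
    states[0] = rng.choice(len(pi), p=pi)
    for t in range(n):
        if t > 0:
            states[t] = rng.choice(A.shape[0], p=A[states[t - 1]])
        obs[t] = rng.choice(B.shape[1], p=B[states[t]])
    return obs

def log_likelihood(obs, A, B, pi):
    # Scaled forward recursion; returns log p(obs) in nats.
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum()); alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        ll += np.log(alpha.sum()); alpha /= alpha.sum()
    return ll

# Two illustrative 2-state HMMs over a binary alphabet.
A_p = np.array([[0.9, 0.1], [0.2, 0.8]]); B_p = np.array([[0.8, 0.2], [0.3, 0.7]])
A_q = np.array([[0.7, 0.3], [0.4, 0.6]]); B_q = np.array([[0.6, 0.4], [0.5, 0.5]])
pi = np.array([0.5, 0.5])

n = 50_000
x = sample_hmm(A_p, B_p, pi, n)
d_hat = (log_likelihood(x, A_p, B_p, pi) - log_likelihood(x, A_q, B_q, pi)) / n
print("Monte Carlo estimate of the relative entropy rate (nats):", d_hat)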

Journal: :CoRR 2017
Maciej Skorski

It is well established that the notion of min-entropy fails to satisfy the chain rule of the form H(X,Y) = H(X|Y) + H(Y) known for Shannon entropy. The lack of a chain rule causes a lot of technical difficulties, particularly in cryptography, where the chain rule would be a natural way to analyze how min-entropy is split among smaller blocks. Such problems arise for example when constructing ...
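
The failure is easy to exhibit numerically. With min-entropy H_inf(X) = -log2 max_x Pr[X = x] and the usual average-case conditional min-entropy H_inf(X|Y) = -log2 sum_y max_x Pr[X = x, Y = y], the sketch below builds a small joint distribution for which H_inf(X,Y) is strictly smaller than H_inf(X|Y) + H_inf(Y). The distribution is a standard textbook-style counterexample, not one taken from the paper.

import numpy as np

# Joint distribution: Y is a fair bit; given Y = 0, X = 0 deterministically,
# and given Y = 1, X is uniform on {0, 1, 2, 3}.
pxy = np.zeros((4, 2))
pxy[0, 0] = 0.5
pxy[:, 1] = 0.125

h_joint = -np.log2(pxy.max())               # H_inf(X, Y)
h_y     = -np.log2(pxy.sum(axis=0).max())   # H_inf(Y)
h_x_y   = -np.log2(pxy.max(axis=0).sum())   # average-case H_inf(X | Y)

print("H_inf(X,Y)            =", h_joint)   # 1.0
print("H_inf(X|Y) + H_inf(Y) =", h_x_y + h_y)  # ~1.678
print("chain rule holds:", np.isclose(h_joint, h_x_y + h_y))  # False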

1999
Ziad Rached Fady Alajaji

In this work, we extend a variable-length source coding theorem for discrete memoryless sources to ergodic time-invariant Markov sources of arbitrary order. To accomplish this extension, we establish a formula for the Rényi entropy rate lim_{n→∞} H(n)/n. The main tool used to obtain the Rényi entropy rate result is Perron-Frobenius theory. We also examine the expression of the Rényi entropy ra...
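
For a finite ergodic Markov chain with transition matrix P, the Perron-Frobenius approach described above yields the Rényi entropy rate of order α as (1/(1-α)) log λ(α), where λ(α) is the largest (Perron-Frobenius) eigenvalue of the matrix with entries p_ij^α. A short numerical sketch, with the transition matrix chosen arbitrarily for illustration; as α approaches 1, the value should approach the classical Shannon entropy rate.

import numpy as np

def renyi_entropy_rate(P, alpha):
    # (1 / (1 - alpha)) * log of the Perron-Frobenius eigenvalue of [p_ij^alpha],
    # in nats; requires alpha != 1.
    lam = max(np.linalg.eigvals(P ** alpha).real)
    return np.log(lam) / (1.0 - alpha)

def shannon_entropy_rate(P):
    # Classical entropy rate: sum_i pi_i * H(P[i, :]) with pi the stationary law.
    evals, evecs = np.linalg.eig(P.T)
    pi = np.abs(evecs[:, np.argmin(np.abs(evals - 1))].real)
    pi /= pi.sum()
    rows = -np.sum(np.where(P > 0, P * np.log(np.where(P > 0, P, 1.0)), 0.0), axis=1)
    return float(pi @ rows)

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])  # illustrative ergodic chain

for a in (0.5, 0.9, 0.99, 1.01, 2.0):
    print(f"alpha = {a}: H_alpha = {renyi_entropy_rate(P, a):.6f}")
print(f"alpha -> 1: Shannon entropy rate = {shannon_entropy_rate(P):.6f}")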

2000
Niels Wessel Agnes Schumann Alexander Schirdewan Andreas Voss Jürgen Kurths

Standard parameters of heart rate variability are restricted to measuring linear effects, whereas nonlinear descriptions often suffer from the curse of dimensionality. An approach which might be capable of assessing complex properties is the calculation of entropy measures from normalised periodograms. Two concepts, both based on autoregressive spectral estimation, are introduced here. To test ...
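
The underlying quantity is simple to state: estimate a power spectrum, normalize it so it sums to one, and take the Shannon entropy of the result. The sketch below uses a Welch periodogram in place of the autoregressive spectral estimate the authors employ, with a synthetic signal standing in for real heart rate data; both substitutions are for illustration only.

import numpy as np
from scipy.signal import welch

def spectral_entropy(x, fs):
    # Shannon entropy of the normalized power spectrum, scaled to [0, 1]
    # by dividing by log(number of frequency bins).
    f, pxx = welch(x, fs=fs, nperseg=256)
    p = pxx / pxx.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(len(p)))

rng = np.random.default_rng(1)
fs = 4.0                       # Hz, a common resampling rate for RR tachograms
t = np.arange(0, 300, 1 / fs)  # five minutes of synthetic data

# Regular rhythm (one dominant oscillation) vs. irregular (broadband) signal.
regular = np.sin(2 * np.pi * 0.1 * t) + 0.1 * rng.standard_normal(t.size)
irregular = rng.standard_normal(t.size)

print("spectral entropy, regular  :", spectral_entropy(regular, fs))
print("spectral entropy, irregular:", spectral_entropy(irregular, fs))

The regular signal concentrates its power in one band and should score markedly lower than the broadband one.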

Journal: :Physical review. E, Statistical, nonlinear, and soft matter physics 2011
Kun Zhao Arda Halu Simone Severini Ginestra Bianconi

New entropy measures have been recently introduced for the quantification of the complexity of networks. Most of these entropy measures apply to static networks or to dynamical processes defined on static complex networks. In this paper we define the entropy rate of growing network models. This entropy rate quantifies how many labeled networks are typically generated by the growing network mode...
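
One way to build intuition for such a rate is to track, step by step, the Shannon entropy of the random attachment choice a growing model makes; the sketch below does this for Barabási-Albert-style preferential attachment with one edge per new node. This is a toy reading of the idea, not the paper's actual definition or calculation.

import numpy as np

rng = np.random.default_rng(2)

def growth_step_entropies(n_nodes):
    # Grow a preferential-attachment network (one edge per new node) and
    # record the entropy, in bits, of each attachment choice.
    degrees = np.array([1.0, 1.0])  # start from a single edge
    entropies = []
    for _ in range(n_nodes - 2):
        probs = degrees / degrees.sum()
        entropies.append(-np.sum(probs * np.log2(probs)))
        target = rng.choice(len(degrees), p=probs)
        degrees[target] += 1
        degrees = np.append(degrees, 1.0)  # the newly added node
    return np.array(entropies)

h = growth_step_entropies(5000)
print("mean per-step attachment entropy (bits):", h.mean())
print("entropy of the last step (bits):        ", h[-1])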

2011
Unto K. Laine

A new method for inferring specific stochastic grammars is presented. The process, called the Hybrid Model Learner (HML), applies the entropy rate to guide the agglomeration process of type ab -> c. Each rule derived from the input sequence is associated with a certain entropy-rate difference. A grammar automatically inferred from an example sequence can be used to detect and recognize similar structures i...
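
The mechanism described, replacing a symbol pair ab with a new symbol c and scoring the rule by the resulting change in entropy rate, can be sketched in a few lines. The version below measures the empirical first-order (bigram) entropy rate and greedily merges the pair whose replacement lowers it most; it is a reconstruction from the abstract, not the HML implementation.

from collections import Counter
import math

def entropy_rate(seq):
    # Empirical first-order entropy rate: H(X2 | X1) = H(X1, X2) - H(X1), in bits.
    def H(counts):
        n = sum(counts.values())
        return -sum(c / n * math.log2(c / n) for c in counts.values())
    return H(Counter(zip(seq, seq[1:]))) - H(Counter(seq[:-1]))

def infer_rules(seq, n_rules=3):
    # Greedy agglomeration ab -> c, keeping the entropy-rate difference per rule.
    rules = []
    for k in range(n_rules):
        base = entropy_rate(seq)
        best = None
        for pair in set(zip(seq, seq[1:])):
            merged, i = [], 0
            while i < len(seq):
                if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                    merged.append(f"c{k}"); i += 2
                else:
                    merged.append(seq[i]); i += 1
            diff = entropy_rate(merged) - base
            if best is None or diff < best[0]:
                best = (diff, pair, merged)
        diff, pair, seq = best
        rules.append((pair, f"c{k}", diff))
    return rules

seq = list("abcabcabdabcabcabd" * 20)  # toy sequence with repeated structure
for lhs, rhs, d in infer_rules(seq):
    print(f"{lhs[0]}{lhs[1]} -> {rhs}   entropy-rate difference: {d:+.4f}")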

2004
Yun Gao Ioannis Kontoyiannis Elie Bienenstock

Information-theoretic methods have been widely used in neuroscience, in the broad effort to analyze and understand the fundamental information-processing tasks performed by the brain. In these studies, the entropy has been adopted as the main measure for quantifying the amount of information transmitted between neurons, via the spike trains they generate. One of the first and most important goal...
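
For binary spike trains, one family of estimators from the information theory literature (with which Kontoyiannis' name is associated) is based on Lempel-Ziv match lengths: if L_i is the length of the shortest substring starting at position i that does not appear earlier in the sequence, the entropy rate in bits is estimated as the inverse of the average of L_i / log2(i). Below is a minimal sketch on a synthetic Bernoulli spike train; the truncated abstract does not confirm that this specific estimator is the one used in the paper, and the estimator converges slowly, so expect visible bias at this length.

import numpy as np

rng = np.random.default_rng(3)

def match_length(x, i):
    # Length of the shortest substring starting at position i that does NOT
    # appear anywhere in x[:i] (equivalently, 1 + the longest match).
    past = "".join(map(str, x[:i]))
    l = 1
    while i + l <= len(x) and "".join(map(str, x[i:i + l])) in past:
        l += 1
    return l

def lz_entropy_rate(x):
    # Match-length estimator over positions i = 1..n-1, in bits per symbol.
    n = len(x)
    total = sum(match_length(x, i) / np.log2(i + 1) for i in range(1, n))
    return (n - 1) / total

p = 0.1                                 # spike probability per bin
spikes = rng.binomial(1, p, size=2000)  # synthetic Bernoulli spike train
true_h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)

print("true entropy rate (bits/bin):     ", true_h)
print("estimated entropy rate (bits/bin):", lz_entropy_rate(spikes))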

Journal: :IEEE Trans. Information Theory 1993
George Kesidis Jean C. Walrand

We derive the relative entropy between two Markov transition rate matrices from sample path considerations. This relative entropy is interpreted as a "level 2.5" large deviations action functional. That is, the level-two large deviations action functional for empirical distributions of continuous-time Markov chains can be derived from the relative entropy using the contraction mapping principle...
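
For two irreducible transition rate matrices Q and Qt on the same finite state space, the relative entropy rate that comes out of such sample-path arguments takes the form sum_i pi_i * sum_{j != i} [ q_ij log(q_ij / qt_ij) - q_ij + qt_ij ], with pi the stationary distribution of Q. A small numerical sketch follows; the rate matrices are invented for illustration, and the paper should be consulted for the exact functional it derives.

import numpy as np

def stationary(Q):
    # Stationary distribution: the probability vector pi with pi @ Q = 0.
    evals, evecs = np.linalg.eig(Q.T)
    pi = np.abs(evecs[:, np.argmin(np.abs(evals))].real)
    return pi / pi.sum()

def relative_entropy_rate(Q, Qt):
    # sum_i pi_i * sum_{j != i} (q_ij * log(q_ij / qt_ij) - q_ij + qt_ij),
    # in nats per unit time; assumes qt_ij > 0 wherever q_ij > 0.
    pi = stationary(Q)
    d = 0.0
    for i in range(Q.shape[0]):
        for j in range(Q.shape[0]):
            if i == j:
                continue
            if Q[i, j] > 0:
                d += pi[i] * (Q[i, j] * np.log(Q[i, j] / Qt[i, j])
                              - Q[i, j] + Qt[i, j])
            else:
                d += pi[i] * Qt[i, j]  # the q_ij = 0 term reduces to qt_ij
    return d

# Two illustrative 3-state rate matrices (rows sum to zero).
Q  = np.array([[-2.0, 1.5, 0.5], [1.0, -1.5, 0.5], [0.5, 0.5, -1.0]])
Qt = np.array([[-1.0, 0.5, 0.5], [0.5, -1.0, 0.5], [1.0, 1.0, -2.0]])

print("relative entropy rate D(Q || Qt):", relative_entropy_rate(Q, Qt))
print("sanity check D(Q || Q) = 0:      ", relative_entropy_rate(Q, Q))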

[Chart: number of search results per year]