Search results for: rényi entropy

Number of results: 70986

Journal: CoRR 2017
Alberto Enciso, Piergiulio Tempesta

The requirement that an entropy function be composable is key: it means that the entropy of a compound system can be calculated in terms of the entropy of its independent components. We prove that, under mild regularity assumptions, the only composable generalized entropy in trace form is the Tsallis one-parameter family (which contains Boltzmann–Gibbs as a particular case). This result leads t...
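As a numerical aside (not part of the abstract): the composition law characterizing the Tsallis family can be spot-checked directly. A minimal Python sketch with hypothetical helper names, assuming the standard Tsallis form S_q(p) = (1 − Σᵢ pᵢ^q)/(q − 1):

```python
import itertools
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1); Shannon in the limit q -> 1."""
    if q == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

def product_dist(p, r):
    """Joint distribution of two independent discrete distributions."""
    return [pi * rj for pi, rj in itertools.product(p, r)]

# Composability for independent systems A and B:
#   S_q(A x B) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B)
q = 1.5
p, r = [0.2, 0.3, 0.5], [0.6, 0.4]
lhs = tsallis_entropy(product_dist(p, r), q)
sp, sr = tsallis_entropy(p, q), tsallis_entropy(r, q)
assert abs(lhs - (sp + sr + (1 - q) * sp * sr)) < 1e-12
```

The check works because Σ (pᵢrⱼ)^q factorizes over independent components, which is exactly what makes the compound entropy expressible through the component entropies.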

Journal: CoRR 2009
Jean-François Bercher

Article history: received 20 May 2009; revised 2 July 2009; accepted 7 July 2009; available online 15 July 2009. Communicated by A.R. Bishop. PACS: 02.50.-r, 05.90.+m, 89.70.+c

Journal: CoRR 2017
Mokshay M. Madiman, Liyao Wang, Jae Oh Woo

Lower bounds for the Rényi entropies of sums of independent random variables taking values in cyclic groups of prime order, or in the integers, are established. The main ingredients of our approach are extended rearrangement inequalities in prime cyclic groups building on Lev (2001), and notions of stochastic ordering. Several applications are developed, including to discrete entropy power ineq...

2015
Philippe Elbaz-Vincent, Herbert Gangl

We show that the entropy function—and hence the finite 1-logarithm—behaves a lot like certain derivations. We recall its cohomological interpretation as a 2-cocycle and also deduce 2n-cocycles for any n. Finally, we give some identities for finite multiple polylogarithms together with number theoretic applications. 1 Information theory, Entropy and Polylogarithms It is well known that the notio...

Journal: CoRR 2014
Ashok Kumar Moses, Rajesh Sundaresan

Minimization problems with respect to a one-parameter family of generalized relative entropies are studied. These relative entropies, which we term relative α-entropies (denoted Iα), arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual r...

Journal: Probl. Inf. Transm. 2013
Mladen Kovacevic, Ivan Stanojevic, Vojin Senk

In this paper we study certain properties of Rényi entropy functions Hα(P) on the space of discrete probability distributions with infinitely many probability masses. We prove some properties that parallel those known in the finite case. Some properties, on the other hand, are quite different in the infinite case, for example the (dis)continuity in P and the problem of divergence and behaviour of...
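A quick illustration (not from the paper): for a finite distribution the Rényi entropy H_α(P) = log(Σᵢ pᵢ^α)/(1 − α) is easy to compute, and its monotonicity in α can be spot-checked. A minimal Python sketch with a hypothetical helper name:

```python
import math

def renyi_entropy(p, alpha):
    """H_alpha(P) = log(sum_i p_i^alpha) / (1 - alpha); Shannon entropy at alpha = 1."""
    if alpha == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
hs = [renyi_entropy(p, a) for a in (0.5, 1, 2, 5)]
# For fixed P, H_alpha(P) is nonincreasing in alpha
assert all(hs[i] >= hs[i + 1] - 1e-12 for i in range(len(hs) - 1))
```

With infinitely many masses, the sum Σᵢ pᵢ^α can diverge for small α on heavy-tailed distributions, which is the kind of phenomenon absent from the finite case.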

Journal: Entropy 2015
Nadia Mammone, Jonas Duun-Henriksen, Troels Wesenberg Kjaer, Francesco Carlo Morabito

Permutation entropy (PE) has been widely exploited to measure the complexity of the electroencephalogram (EEG), especially when complexity is linked to diagnostic information embedded in the EEG. Recently, the authors proposed a spatial-temporal analysis of the EEG recordings of absence epilepsy patients based on PE. The goal here is to improve the ability of PE in discriminating interictal sta...
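For context (an illustrative sketch, not the authors' code): permutation entropy is the normalized Shannon entropy of the ordinal patterns found in delay-embedded windows of a signal. A minimal Python version with a hypothetical function name:

```python
import math
from collections import Counter

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of series x, embedding dimension `order`."""
    # Ordinal pattern of a window = argsort of its values (stable for ties)
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: x[i + k * delay]))
        for i in range(len(x) - (order - 1) * delay)
    )
    n = sum(patterns.values())
    h = -sum((c / n) * math.log(c / n) for c in patterns.values())
    return h / math.log(math.factorial(order))  # scale to [0, 1]

# A monotone ramp exhibits a single ordinal pattern, so its PE is 0
assert permutation_entropy(list(range(10)), order=3) == 0.0
```

Low PE indicates a highly regular signal; PE near 1 means all ordinal patterns occur about equally often, as in white noise.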

Journal: IEEE Trans. Information Theory 2002
Onur G. Guleryuz, Erwin Lutwak, Deane Yang, Gaoyong Zhang

We show that for a special class of probability distributions that we call contoured distributions, information theoretic invariants and inequalities are equivalent to geometric invariants and inequalities of bodies in Euclidean space associated with the distributions. Using this, we obtain characterizations of contoured distributions with extremal Shannon and Rényi entropy. We also obtain a ne...

2012
Lavanya Sivakumar, Matthias Dehmer

In this article, we discuss the problem of establishing relations between information measures for network structures. Two types of entropy-based measures, namely the Shannon entropy and its generalization, the Rényi entropy, have been considered for this study. Our main results involve establishing formal relationships, by means of inequalities, between these two kinds of measures. Further, we ...

2017
Lei Yu, Vincent Y. F. Tan

The conventional channel resolvability problem refers to the determination of the minimum rate needed for an input process to approximate the output distribution of a channel in either the total variation distance or the relative entropy. In contrast to previous works, in this paper, we use the (normalized or unnormalized) Rényi divergence (with the Rényi parameter in [0,2]) to measure the leve...
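As a side note (an illustrative sketch, not the paper's method): the Rényi divergence used as the approximation measure here is, in the standard definition, D_α(P‖Q) = log(Σᵢ pᵢ^α qᵢ^{1−α})/(α − 1), recovering relative entropy (KL divergence) at α = 1. A minimal Python sketch with a hypothetical helper name:

```python
import math

def renyi_divergence(p, q, alpha):
    """D_alpha(P||Q) = log(sum_i p_i^alpha * q_i^(1-alpha)) / (alpha - 1); KL at alpha = 1."""
    if alpha == 1:
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return math.log(sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))) / (alpha - 1)

p, q = [0.4, 0.6], [0.5, 0.5]
ds = [renyi_divergence(p, q, a) for a in (0.5, 1.0, 1.5, 2.0)]
# D_alpha is nonnegative and nondecreasing in alpha
assert all(d >= 0 for d in ds)
assert all(ds[i] <= ds[i + 1] + 1e-12 for i in range(len(ds) - 1))
```

Since D_α is nondecreasing in α, requiring small Rényi divergence for larger α in [0, 2] is a stricter approximation criterion than the conventional relative-entropy one.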
