Search results for: rényi entropy rate

Number of results: 1,024,701

Journal: CoRR 2005
Ambedkar Dukkipati, M. Narasimha Murty, Shalabh Bhatnagar

By replacing the linear averaging in Shannon entropy with the Kolmogorov-Nagumo average (KN-average), or quasilinear mean, and further imposing the additivity constraint, Rényi proposed the first formal generalization of Shannon entropy. Using this recipe of Rényi's, one can prepare only two information measures: Shannon and Rényi entropy. Indeed, using this formalism Rényi characterized these additive en...
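The KN-average construction described above is easy to check numerically: applying the quasilinear mean with kernel φ(x) = exp((1 − α)x) to the surprisals −log p_k recovers the Rényi entropy, while the plain linear average gives Shannon entropy. A minimal Python sketch (function names are illustrative, not taken from the paper):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy: the ordinary (linear) average of the surprisals -log p_k."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

def renyi_entropy_kn(p, alpha):
    """Rényi entropy of order alpha via the Kolmogorov-Nagumo (quasilinear) mean
    with kernel phi(x) = exp((1 - alpha) * x).

    The KN-average of the surprisals I_k = -log p_k is
        phi^{-1}( sum_k p_k * phi(I_k) ),
    which reduces to (1/(1 - alpha)) * log(sum_k p_k**alpha).
    """
    p = np.asarray(p, dtype=float)
    surprisal = -np.log(p)
    kn_mean = np.sum(p * np.exp((1.0 - alpha) * surprisal))
    return np.log(kn_mean) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p))          # ≈ 1.0397 nats
print(renyi_entropy_kn(p, 2.0))    # collision entropy, -log(0.375) ≈ 0.9808 nats
print(renyi_entropy_kn(p, 0.999))  # approaches Shannon entropy as alpha -> 1
```

Note that as α → 1 the kernel flattens toward a linear map and the quasilinear mean degenerates to the ordinary mean, which is why Shannon entropy appears as the limiting case.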

Journal: IEEE Trans. Information Theory 2001
Ziad Rached, Fady Alajaji, L. Lorne Campbell

In this work, we examine the existence and the computation of the Rényi divergence rate, lim_{n→∞} (1/n) D_α(p^{(n)} ‖ q^{(n)}), between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions described by the probability distributions p and q, respectively. This yields a generalization of a result of Nemetz, where he assumed that the initial probabilities under p and q are strictly p...
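For memoryless (IID) sources the Rényi divergence is additive across letters, so the divergence rate above reduces to the single-letter quantity D_α(p ‖ q) = (1/(α − 1)) log Σ_k p_k^α q_k^{1−α}; the Markov case treated in the paper is considerably more involved. A small numeric sketch of the single-letter divergence (illustrative only, not the paper's construction):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1) between
    finite distributions with common support:
        D_alpha(P || Q) = 1/(alpha - 1) * log sum_k p_k^alpha * q_k^(1 - alpha)
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(renyi_divergence(p, q, 0.5))     # Bhattacharyya-type order 1/2
# As alpha -> 1 the Rényi divergence recovers the Kullback-Leibler divergence:
kl = np.sum(np.asarray(p) * np.log(np.asarray(p) / np.asarray(q)))
print(renyi_divergence(p, q, 1.0001), kl)
```

For an IID source, the n-fold divergence D_α(p^{(n)} ‖ q^{(n)}) is exactly n times this single-letter value, so the rate limit is trivial; the existence question in the abstract is only nontrivial because of the Markov memory.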

2015
Wu-Zhong Guo, Song He

We study the Rényi entropy of locally excited states, considering thermal and boundary effects respectively, in two-dimensional conformal field theories (CFTs). First, we consider locally excited states obtained by acting with primary operators on a thermal state in the low-temperature limit. The Rényi entropy is the sum of contributions from the thermal effect and the local excitation. Second, we mainl...

2017
Igal Sason, Sergio Verdú

This paper gives upper and lower bounds on the minimum error probability of Bayesian M-ary hypothesis testing in terms of the Arimoto-Rényi conditional entropy of an arbitrary order α. The improved tightness of these bounds over their specialized versions with the Shannon conditional entropy (α = 1) is demonstrated. In particular, in the case where M is finite, we show how to generalize Fano's...

2011
Manuel Gil, Fady Alajaji

The idea of ‘probabilistic distances’ (also called divergences), which in some sense assess how ‘close’ two probability distributions are to one another, has been widely employed in probability, statistics, information theory, and related fields. Of particular importance, due to their generality and applicability, are the Rényi divergence measures. While the closely related concept of Rényi ent...

2008
Oliver Johnson, Christophe Vignat

We consider the Student-t and Student-r distributions, which maximise Rényi entropy under a covariance condition. We show that they have information-theoretic properties which mirror those of the Gaussian distributions, which maximise Shannon entropy under the same condition. We introduce a convolution which preserves the Rényi maximising family, and show that the Rényi maximi...

Journal: IEEE Trans. Information Theory 2014
Masahito Hayashi

It is known that the security evaluation can be done by smoothing the Rényi entropy of order 2 in the classical and quantum settings when universal2 hash functions are applied. Using the smoothing of the Rényi entropy of order 2, we derive security bounds for the L1 distinguishability and the modified mutual information criterion under the classical and quantum settings, and derive their exponential decre...

Journal: CoRR 2011
Hongfei Cui, Jianqiang Sun, Yiming Ding

We consider the generalized differential entropy of normalized sums of independent and identically distributed (IID) continuous random variables. We prove that the Rényi entropy and the Tsallis entropy of order α (α > 0) of the normalized sum of IID continuous random variables with bounded moments converge to the corresponding Rényi entropy and Tsallis entropy of the Gaussian limit, and obtai...
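The convergence claimed above can be checked numerically: the differential Rényi entropy of the Gaussian limit N(0, σ²) has the closed form h_α = ½ log(2πσ²) + log(α)/(2(α − 1)), and a plug-in histogram estimate on normalized sums of IID uniforms approaches it as n grows. A rough sketch under these assumptions (the simple histogram estimator below is only an illustration, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(0)

def renyi_entropy_hist(samples, alpha, bins=200):
    """Plug-in estimate of the differential Rényi entropy
    h_alpha = 1/(1 - alpha) * log( integral of f^alpha )
    using a histogram density estimate f of the samples."""
    f, edges = np.histogram(samples, bins=bins, density=True)
    widths = np.diff(edges)
    mask = f > 0  # skip empty bins (0^alpha contributes nothing for alpha > 0)
    return np.log(np.sum(f[mask]**alpha * widths[mask])) / (1.0 - alpha)

def renyi_entropy_gaussian(sigma2, alpha):
    """Closed form for N(0, sigma^2):
    h_alpha = 0.5 * log(2*pi*sigma^2) + log(alpha) / (2 * (alpha - 1))."""
    return 0.5 * np.log(2 * np.pi * sigma2) + np.log(alpha) / (2 * (alpha - 1))

alpha, n = 2.0, 64
# Normalized sums of IID Uniform(-sqrt(3), sqrt(3)) variables (unit variance),
# so the CLT limit is the standard Gaussian N(0, 1).
x = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(200_000, n)).sum(axis=1) / np.sqrt(n)
print(renyi_entropy_hist(x, alpha))        # close to the Gaussian-limit value
print(renyi_entropy_gaussian(1.0, alpha))  # 0.5*log(2*pi) + log(2)/2 ≈ 1.2655
```

The histogram plug-in is crude (it is biased by binning and sampling noise), but for moderate α and large sample sizes it tracks the closed-form Gaussian value well, which is the qualitative content of the convergence result.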

Journal: CoRR 2017
Arnaud Marsiglietti, James Melbourne

Using a sharp version of the reverse Young inequality, and a Rényi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors are able to derive a Rényi entropy power inequality for log-concave random vectors when Rényi parameters belong to (0, 1). Furthermore, the estimates are shown to be somewhat sharp.

[Chart: number of search results per year]