Search results for: conditional rényi entropy

Number of results: 128234

Journal: Inf. Sci. 2013
M. Gil, Fady Alajaji, Tamás Linder

Probabilistic ‘distances’ (also called divergences), which in some sense assess how ‘close’ two probability distributions are to one another, have been widely employed in probability, statistics, information theory, and related fields. Of particular importance due to their generality and applicability are the Rényi divergence measures. This paper presents closed-form expressions for the Rényi...
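
For reference, the quantity for which this paper derives closed forms, the Rényi divergence of order α between two densities p and q, has the standard definition

```latex
D_{\alpha}(p \,\|\, q) \;=\; \frac{1}{\alpha-1}\,
  \log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx ,
  \qquad \alpha > 0,\ \alpha \neq 1 .
```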

1997
Christian Cachin

One of the most important properties of a cryptographic system is a proof of its security. In the present work, information-theoretic methods are used for proving the security of unconditionally secure cryptosystems. The security of such systems does not depend on unproven intractability assumptions. A survey of entropy measures and their applications in cryptography is presented. A new informa...

Journal: IEEE Trans. Information Theory 2014
Tim van Erven, Peter Harremoës

Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibl...
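
The order-1 case mentioned in the abstract is the limit in which the Rényi divergence recovers the Kullback-Leibler divergence; for discrete distributions P = (p_i) and Q = (q_i),

```latex
\lim_{\alpha \to 1} D_{\alpha}(P \,\|\, Q)
  \;=\; \lim_{\alpha \to 1} \frac{1}{\alpha-1}
        \log \sum_i p_i^{\alpha}\, q_i^{\,1-\alpha}
  \;=\; \sum_i p_i \log \frac{p_i}{q_i}
  \;=\; D_{\mathrm{KL}}(P \,\|\, Q) .
```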

Journal: Classical and Quantum Gravity 2021

Journal: IEEE Transactions on Information Theory 2016

Journal: Inf. Sci. 2003
Luc Knockaert

Rényi entropies are compared to generalized log-Fisher information and variational entropies in the context of translation, scale and concentration invariance. It is proved that the Rényi entropies occupy a special place amongst these entropies. It is also shown that Shannon entropy is centrally positioned amidst the Rényi entropies.
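
For context, the differential Rényi entropy of order α of a density p, and the α → 1 limit in which it reduces to the Shannon (differential) entropy, are

```latex
H_{\alpha}(p) \;=\; \frac{1}{1-\alpha}\,
  \log \int p(x)^{\alpha}\, dx ,
\qquad
\lim_{\alpha \to 1} H_{\alpha}(p) \;=\; -\int p(x)\, \log p(x)\, dx .
```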

2011
Balaji Vasan Srinivasan, Ramani Duraiswami

Rényi entropy refers to a generalized class of entropies that have been used in several applications. In this work, we derive a non-parametric distance between distributions based on the quadratic Rényi entropy. The distributions are estimated via Parzen density estimates. The quadratic complexity of the distance evaluation is mitigated with GPU-based parallelization. This results in an efficien...
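
As a rough illustration of the ingredients in this abstract, the sketch below builds a quadratic-Rényi-entropy-based divergence (here the Cauchy-Schwarz divergence, one common member of this family, not necessarily the paper's exact distance) from Gaussian Parzen density estimates. The kernel double sums are the source of the quadratic cost the abstract mentions; the bandwidth `sigma`, the function names, and the plain NumPy implementation are illustrative assumptions rather than the paper's GPU code.

```python
import numpy as np

def information_potential(X, Y, sigma):
    """Closed-form value of the cross term  integral of p_hat(x)*q_hat(x) dx
    when p_hat and q_hat are Parzen estimates built from samples X and Y
    (each of shape (n_samples, dim)) with isotropic Gaussian kernels of
    bandwidth sigma; the two kernels convolve into one of variance 2*sigma**2."""
    d = X.shape[1]
    var = 2.0 * sigma ** 2
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)  # (N, M) pairwise
    norm = (2.0 * np.pi * var) ** (-d / 2.0)
    return norm * np.exp(-sq_dists / (2.0 * var)).mean()

def cauchy_schwarz_divergence(X, Y, sigma=1.0):
    """D_CS(p, q) = -log( V_xy / sqrt(V_xx * V_yy) ), built from quadratic
    Rényi 'information potentials'.  Each potential is an O(N*M) kernel sum,
    i.e. quadratic in the sample size."""
    v_xy = information_potential(X, Y, sigma)
    v_xx = information_potential(X, X, sigma)
    v_yy = information_potential(Y, Y, sigma)
    return -np.log(v_xy / np.sqrt(v_xx * v_yy))

# Toy usage: two 2-D Gaussian clouds with shifted means.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 2))
Y = rng.normal(1.0, 1.0, size=(500, 2))
print(cauchy_schwarz_divergence(X, Y, sigma=0.5))
```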

2009
G. Baris Bağcı

We propose a generalized entropy maximization procedure, which takes into account the generalized averaging procedures and information gain definitions underlying the generalized entropies. This novel generalized procedure is then applied to Rényi and Tsallis entropies. The generalized entropy maximization procedure for Rényi entropies results in the exponential stationary distribution asymptot...
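
For reference, the standard discrete Rényi and Tsallis entropies to which the proposed maximization procedure is applied are

```latex
H_{\alpha}(p) \;=\; \frac{1}{1-\alpha}\,\ln \sum_i p_i^{\alpha},
\qquad
S_{q}(p) \;=\; \frac{1}{q-1}\Bigl(1 - \sum_i p_i^{\,q}\Bigr) .
```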

Journal: Open Systems & Information Dynamics 2003

2011
Manuel Gil, Fady Alajaji

The idea of ‘probabilistic distances’ (also called divergences), which in some sense assess how ‘close’ two probability distributions are to one another, has been widely employed in probability, statistics, information theory, and related fields. Of particular importance due to their generality and applicability are the Rényi divergence measures. While the closely related concept of Rényi ent...
