Search results for: rényi

Number of results: 6772

2011
Manuel Gil, Fady Alajaji

The idea of ‘probabilistic distances’ (also called divergences), which in some sense assess how ‘close’ two probability distributions are to one another, has been widely employed in probability, statistics, information theory, and related fields. Of particular importance due to their generality and applicability are the Rényi divergence measures. While the closely related concept of Rényi ent...
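
As a concrete illustration (not taken from the paper above), the Rényi divergence of order α between discrete distributions p and q is D_α(p‖q) = (1/(α−1)) log Σᵢ pᵢ^α qᵢ^(1−α), and it recovers the Kullback–Leibler divergence as α → 1. A minimal sketch, assuming finite discrete distributions given as probability lists:

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1)
    between two discrete distributions p and q."""
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

def kl_divergence(p, q):
    """Kullback-Leibler divergence, the alpha -> 1 limit of D_alpha."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
# D_alpha for alpha near 1 approaches the KL divergence
print(renyi_divergence(p, q, 0.999))
print(kl_divergence(p, q))
```

For identical distributions every D_α vanishes, since Σᵢ pᵢ^α pᵢ^(1−α) = Σᵢ pᵢ = 1.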

Journal: :IACR Cryptology ePrint Archive 2013
Mitsugu Iwamoto, Junji Shikata

In this paper, information theoretic cryptography is discussed based on conditional Rényi entropies. Our discussion focuses not only on cryptography but also on the definitions of conditional Rényi entropies and the related information theoretic inequalities. First, we revisit conditional Rényi entropies, and clarify what kind of properties are required and actually satisfied. Then, we propose ...

Journal: :CoRR 2014
Mario Berta, Kaushik P. Seshadreesan, Mark M. Wilde

Quantum information measures such as the entropy and the mutual information find applications in physics, e.g., as correlation measures. Generalizing such measures based on the Rényi entropies is expected to enhance their scope in applications. We prescribe Rényi generalizations for any quantum information measure which consists of a linear combination of von Neumann entropies with coefficients...

2014
Mitsugu Iwamoto, Junji Shikata

Information theoretic cryptography is discussed based on conditional Rényi entropies. Our discussion focuses not only on cryptography but also on the definitions of conditional Rényi entropies and the related information theoretic inequalities. First, we revisit conditional Rényi entropies, and clarify what kind of properties are required and actually satisfied. Then, we propose security criter...

2015
Masahito Hayashi, Marco Tomamichel

Recently, a variety of new measures of quantum Rényi mutual information and quantum Rényi conditional entropy have been proposed, and some of their mathematical properties explored. Here, we show that the Rényi mutual information attains operational meaning in the context of composite hypothesis testing, when the null hypothesis is a fixed bipartite state and the alternate hypothesis consists o...

Journal: :CoRR 2014
Kaushik P. Seshadreesan, Mark M. Wilde

where H(F)_σ ≡ −Tr{σ_F log σ_F} is the von Neumann entropy of a state σ_F on system F and we unambiguously let ρ_C ≡ Tr_{AB}{ρ_{ABC}} denote the reduced density operator on system C, for example. The CQMI captures the correlations present between Alice and Bob from the perspective of Charlie in the independent and identically distributed (i.i.d.) resource limit, where an asymptotically large number of cop...

Journal: :CoRR 2017
Arnaud Marsiglietti, James Melbourne

Using a sharp version of the reverse Young inequality, and a Rényi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors are able to derive a Rényi entropy power inequality for log-concave random vectors when Rényi parameters belong to (0, 1). Furthermore, the estimates are shown to be somewhat sharp.

Journal: :Inf. Sci. 2003
Luc Knockaert

Rényi entropies are compared to generalized log-Fisher information and variational entropies in the context of translation, scale and concentration invariance. It is proved that the Rényi entropies occupy a special place amongst these entropies. It is also shown that Shannon entropy is centrally positioned amidst the Rényi entropies.
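To make the last claim concrete (this sketch is an illustration, not part of the paper above): the Rényi entropy of order α is H_α(p) = (1/(1−α)) log Σᵢ pᵢ^α; it is nonincreasing in α, and Shannon entropy sits between the orders below 1 and those above 1 as the α → 1 limit.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1)
    for a discrete distribution p, in nats."""
    return math.log(sum(pi**alpha for pi in p)) / (1 - alpha)

def shannon_entropy(p):
    """Shannon entropy in nats; the alpha -> 1 limit of H_alpha."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
# H_alpha decreases in alpha, with Shannon entropy in the middle:
print(renyi_entropy(p, 0.5), shannon_entropy(p), renyi_entropy(p, 2.0))
```

Running this with the example distribution shows H_0.5 > H (Shannon) > H_2, matching the monotonicity in α.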

2008
Oliver Johnson, Christophe Vignat

We consider the Student-t and Student-r distributions, which maximise Rényi entropy under a covariance condition. We show that they have information-theoretic properties which mirror those of the Gaussian distributions, which maximise Shannon entropy under the same condition. We introduce a convolution which preserves the Rényi maximising family, and show that the Rényi maximi...
