Search results for: rényi entropy

Number of results: 70986

2012
Marek Śmieja Jacek Tabor

Rényi entropy dimension describes the rate of growth of the coding cost in lossy data compression when the cost of coding depends exponentially on the code length. In this paper we generalize Csiszár's estimate of the Rényi entropy dimension of a mixture of measures to the case of a general probability metric space. This result determines the cost of encoding...
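For reference, the quantities involved admit standard formulations (textbook notation, not quoted from this abstract; the entropy-dimension limit below is the usual Euclidean-case statement, whereas the paper works in general probability metric spaces):

```latex
% Rényi entropy of order \alpha (0 < \alpha \neq 1) of a discrete distribution p:
H_\alpha(p) = \frac{1}{1-\alpha}\,\log \sum_i p_i^{\alpha}

% A common Euclidean-case formulation of the Rényi entropy dimension of X,
% with [X]_m denoting X quantized to a grid of mesh size 1/m:
\dim_\alpha(X) = \lim_{m \to \infty} \frac{H_\alpha\bigl([X]_m\bigr)}{\log m}
```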

1997
Christian Cachin

The notion of smooth entropy allows a unifying, generalized formulation of privacy amplification and entropy smoothing. Smooth entropy is a measure for the number of almost uniform random bits that can be extracted from a random source by probabilistic algorithms. It is known that the Rényi entropy of order at least 2 of a random variable is a lower bound for its smooth entropy. On the other ha...
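The order-2 Rényi entropy referred to here is the collision entropy; in standard notation (not quoted from the snippet):

```latex
% Collision (order-2 Rényi) entropy of a discrete random variable X;
% per the abstract above, it lower-bounds the smooth entropy of X.
H_2(X) = -\log \sum_{x} P_X(x)^2
```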

Journal: CoRR 2016
Yuta Sakai Ken-ichi Iwata

Many axiomatic definitions of the entropy of a random variable, such as the Rényi entropy, are closely related to the ℓα-norm of its probability distribution. This study considers probability distributions on finite sets, and examines the sharp bounds of the ℓβ-norm with a fixed ℓα-norm, α ≠ β, for n-dimensional probability vectors with an integer n ≥ 2. From the results, we derive the sharp bound...
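As a concrete illustration of the ℓα-norm connection, the following Python sketch (function names are my own, not code from the paper) computes the Rényi entropy of a finite distribution directly from its ℓα-norm via H_α(p) = (α/(1−α))·log‖p‖_α:

```python
import numpy as np

def l_norm(p, alpha):
    """l_alpha-norm of a probability vector p (alpha > 0)."""
    return np.sum(np.asarray(p, dtype=float) ** alpha) ** (1.0 / alpha)

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha != 1, written through the l_alpha-norm:
    H_alpha(p) = (alpha / (1 - alpha)) * log ||p||_alpha (in nats)."""
    return (alpha / (1.0 - alpha)) * np.log(l_norm(p, alpha))

# Example: the Rényi entropy is non-increasing in the order alpha.
p = [0.5, 0.25, 0.125, 0.125]
print(renyi_entropy(p, 0.5), renyi_entropy(p, 2.0))
```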

Journal: Physical Review D 2014

2016
Xi Dong

A remarkable yet mysterious property of black holes is that their entropy is proportional to the horizon area. This area law inspired the holographic principle, which was later realized concretely in gauge-gravity duality. In this context, entanglement entropy is given by the area of a minimal surface in a dual spacetime. However, discussions of area laws have been constrained to entanglement e...

Journal: IEEE Trans. Information Theory 2001
Po-Ning Chen Fady Alajaji

Csiszár's forward β-cutoff rate (given a fixed β > 0) for a discrete source is defined as the smallest number R₀ such that for every R > R₀, there exists a sequence of fixed-length codes of rate R with probability of error asymptotically vanishing as e^(−nβ(R−R₀)). For a discrete memoryless source (DMS), the forward β-cutoff rate is shown by Csiszár [6] to be equal to the source Rényi entropy. An analogous concept of revers...
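For context, Csiszár's DMS result alluded to above is usually stated as follows (a standard formulation from the literature, added here because the snippet is truncated): the forward β-cutoff rate of a DMS with distribution P equals the Rényi entropy of order 1/(1+β).

```latex
% Forward beta-cutoff rate of a DMS with source distribution P (beta > 0),
% as usually stated in the literature (not quoted from this abstract):
R_0(\beta) \;=\; H_{\frac{1}{1+\beta}}(P)
          \;=\; \frac{1+\beta}{\beta}\,\log \sum_{x} P(x)^{\frac{1}{1+\beta}}
```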

Journal: IEEE Transactions on Information Theory 2016

2013
Stefan Berens Serge Fehr (CWI Amsterdam) Richard Gill Claude Elwood

The introduction of the Rényi entropy allowed a generalization of the Shannon entropy and unified its notion with that of other entropies. However, so far there is no generally accepted conditional version of the Rényi entropy corresponding to the conditional Shannon entropy. The definitions proposed so far in the literature each lack central and natural properties in one way or another. In this...
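One frequently studied candidate, often attributed to Arimoto, is given below for orientation (whether it is the definition this paper ultimately adopts is not stated in the snippet):

```latex
% Arimoto-style conditional Rényi entropy of X given Y, 0 < \alpha \neq 1:
H_\alpha^{\mathrm{A}}(X \mid Y)
  = \frac{\alpha}{1-\alpha}\,
    \log \sum_{y} \Bigl( \sum_{x} P_{XY}(x,y)^{\alpha} \Bigr)^{1/\alpha}
```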

Journal: CoRR 2015
Mario Berta Omar Fawzi Marco Tomamichel

Distance measures between quantum states like the trace distance and the fidelity can naturally be defined by optimizing a classical distance measure over all measurement statistics that can be obtained from the respective quantum states. In contrast, Petz showed that the measured relative entropy, defined as a maximization of the Kullback-Leibler divergence over projective measurement statisti...
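Schematically (standard notation, not quoted from the abstract), the measured relative entropy optimizes the classical Kullback-Leibler divergence over the outcome statistics of a measurement applied to both states; Petz's original definition restricts the optimization to projective measurements.

```latex
% Measured relative entropy of quantum states \rho and \sigma; the supremum
% runs over measurements M, and P_{M,\rho} is the classical outcome
% distribution obtained by measuring \rho with M.
D_{\mathbb{M}}(\rho \,\|\, \sigma)
  = \sup_{M}\; D\bigl(P_{M,\rho} \,\big\|\, P_{M,\sigma}\bigr)
```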

Journal: CoRR 2017
Tongyang Li Xiaodi Wu

Estimation of Shannon and Rényi entropies of unknown discrete distributions is a fundamental problem in statistical property testing and an active research topic in both theoretical computer science and information theory. Tight bounds on the number of samples to estimate these entropies have been established in the classical setting, while little is known about their quantum counterparts. In t...
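As a baseline illustration of the estimation task, here is a naive empirical plug-in estimator in Python (my own sketch, not one of the sample-optimal estimators studied in this line of work):

```python
import numpy as np
from collections import Counter

def plugin_entropies(samples, alpha=2.0):
    """Naive plug-in estimates (in nats) of the Shannon entropy and the
    order-alpha Rényi entropy from i.i.d. samples of a discrete source."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p_hat = counts / counts.sum()                            # empirical distribution
    shannon = -np.sum(p_hat * np.log(p_hat))                 # H(p_hat)
    renyi = np.log(np.sum(p_hat ** alpha)) / (1.0 - alpha)   # H_alpha(p_hat)
    return shannon, renyi

# Example usage on samples from a biased three-symbol source.
rng = np.random.default_rng(0)
samples = rng.choice(["a", "b", "c"], size=10_000, p=[0.6, 0.3, 0.1])
print(plugin_entropies(samples, alpha=2.0))
```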
