Search results for: rényi entropy rate
Number of results: 1,024,701
Estimating entropies is important in many fields including statistical physics, machine learning and statistics. While the Shannon logarithmic entropy is the most fundamental, other Rényi entropies are also of importance. In this paper, we derive a bias corrected estimator for a subset of Rényi entropies. The advantage of the estimator is demonstrated via theoretical and experimental considerat...
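The bias-corrected estimator from the abstract above is not reproduced here, but the baseline it improves on can be sketched. The following is a minimal plug-in (maximum-likelihood) Rényi entropy estimator from i.i.d. samples of a discrete distribution; the function name and interface are illustrative, not taken from the paper.

```python
import math
from collections import Counter

def renyi_entropy_plugin(samples, alpha):
    """Plug-in estimate of the Rényi entropy of order `alpha` (in nats)
    from i.i.d. samples of a discrete distribution.

    H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), reducing to the
    Shannon entropy -sum_i p_i log p_i as alpha -> 1.

    Note: the plug-in estimator is biased for finite sample sizes; the
    paper cited above derives a bias correction for a subset of orders.
    """
    n = len(samples)
    probs = [count / n for count in Counter(samples).values()]
    if alpha == 1:
        return -sum(p * math.log(p) for p in probs)
    return math.log(sum(p ** alpha for p in probs)) / (1 - alpha)
```

For a fair coin observed as `['a', 'b', 'a', 'b']`, both the order-2 (collision) entropy and the Shannon entropy evaluate to log 2.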
In 1959, Rényi proposed the information dimension and the d-dimensional entropy to measure the information content of general random variables. This paper proposes a generalization of information dimension to stationary stochastic processes by defining the information dimension rate as the entropy rate of the uniformly-quantized stochastic process divided by minus the logarithm of the quantizer...
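The quantization idea behind Rényi's information dimension can be illustrated for a single scalar random variable: quantize at resolution 1/m, take the (plug-in) entropy of the quantized variable, and divide by log m. This is a rough numerical sketch under that definition, not the paper's process-level construction; the function name is an assumption.

```python
import math
import random
from collections import Counter

def info_dimension_estimate(samples, m):
    """Estimate Rényi's information dimension of a scalar random variable
    as H(floor(m * X)) / log(m), where floor(m * .) is uniform
    quantization at resolution 1/m.

    For a continuous variable the ratio tends to 1 as m grows; for a
    discrete variable it tends to 0.
    """
    n = len(samples)
    counts = Counter(math.floor(m * x) for x in samples)
    entropy = -sum((c / n) * math.log(c / n) for c in counts.values())
    return entropy / math.log(m)

# A uniform sample on [0, 1) should give a ratio close to 1:
random.seed(0)
uniform = [random.random() for _ in range(20000)]
ratio = info_dimension_estimate(uniform, 32)
```

A constant (fully discrete) input lands in a single quantization bin, giving entropy 0 and hence estimated dimension 0.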
In this work, we revisit the calculation of Rényi entropy in AdS3/(B)CFT2. We find that gravity solutions with brane intersection will lead to negative entropy.
The exact range of the joined values of several Rényi entropies is determined. The method is based on topology with special emphasis on the orientation of the objects studied.
The fully quantum reverse Shannon theorem establishes the optimal rate of noiseless classical communication required for simulating the action of many instances of a noisy quantum channel on an arbitrary input state, while also allowing for an arbitrary amount of shared entanglement of an arbitrary form. Turning this theorem around establishes a strong converse for the entanglement-assisted cla...
For all p > 1, we demonstrate the existence of quantum channels with non-multiplicative maximal output p-norms. Equivalently, for all p > 1, the minimum output Rényi entropy of order p of a quantum channel is not additive. The violations found are large; in all cases, the minimum output Rényi entropy of order p for a product channel need not be significantly greater than the minimum output entr...
Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order α, showing that it only holds for α = 1. Regarding a time-bounded analogue relationship, we s...
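The special role of α = 1 in the snippet above comes from the fact that both the Rényi and Tsallis entropies reduce to the Shannon entropy in the limit α → 1. A small numerical sketch of the standard definitions (in nats):

```python
import math

def renyi(probs, alpha):
    # H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), alpha != 1
    return math.log(sum(p ** alpha for p in probs)) / (1 - alpha)

def tsallis(probs, alpha):
    # S_alpha(p) = (1 - sum_i p_i^alpha) / (alpha - 1), alpha != 1
    return (1 - sum(p ** alpha for p in probs)) / (alpha - 1)

def shannon(probs):
    # H(p) = -sum_i p_i log p_i; the common alpha -> 1 limit of both
    return -sum(p * math.log(p) for p in probs if p > 0)
```

Evaluating either family at an order close to 1 (e.g. α = 1.001) on a fair coin yields a value within about 10⁻³ of log 2, the Shannon entropy.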