Search results for: conditional rényi entropy
Number of results: 128,234
We study the Rényi entropy of locally excited states, considering thermal and boundary effects respectively, in two-dimensional conformal field theories (CFTs). First, we consider locally excited states obtained by acting with primary operators on a thermal state in the low-temperature limit. The Rényi entropy is the sum of contributions from the thermal effect and the local excitation. Second, we mainl...
In this paper, we introduce an assumption that makes it possible to extend the learning ability of discriminative models to the unsupervised setting. We propose an information-theoretic framework as an implementation of the low-density separation assumption. The proposed framework provides a unified perspective on Maximum Margin Clustering (MMC), Discriminative k-means, Spectral Clustering, and Unsu...
English abstract: We consider the Student-t and Student-r distributions, which maximise Rényi entropy under a covariance condition. We show that they have information-theoretic properties which mirror those of the Gaussian distributions, which maximise Shannon entropy under the same condition. We introduce a convolution which preserves the Rényi maximising family, and show that the Rényi maximi...
We consider the generalized differential entropy of normalized sums of independent and identically distributed (IID) continuous random variables. We prove that the Rényi entropy and Tsallis entropy of order α (α > 0) of the normalized sum of IID continuous random variables with bounded moments converge to the corresponding Rényi and Tsallis entropies of the Gaussian limit, and obtai...
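As an illustration of the quantities these abstracts refer to (a minimal discrete-distribution sketch, not code from any of the papers): the Rényi entropy of order α is H_α(p) = log(Σ p_i^α)/(1−α) and the Tsallis entropy is S_α(p) = (1 − Σ p_i^α)/(α − 1), and both recover the Shannon entropy in the limit α → 1:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in nats."""
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def tsallis_entropy(p, alpha):
    """Tsallis entropy of order alpha (alpha != 1)."""
    return (1.0 - sum(pi ** alpha for pi in p)) / (alpha - 1.0)

def shannon_entropy(p):
    """Shannon entropy in nats; the alpha -> 1 limit of both families."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]
# Near alpha = 1, both generalized entropies approach the Shannon entropy.
for alpha in (0.999, 1.001):
    assert abs(renyi_entropy(p, alpha) - shannon_entropy(p)) < 1e-3
    assert abs(tsallis_entropy(p, alpha) - shannon_entropy(p)) < 1e-3
```

The continuous (differential) versions discussed in the abstract replace the sums with integrals over the density.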
Using a sharp version of the reverse Young inequality, and a Rényi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors are able to derive a Rényi entropy power inequality for log-concave random vectors when Rényi parameters belong to (0, 1). Furthermore, the estimates are shown to be somewhat sharp.
We show that a recent definition of relative Rényi entropy is monotone under completely positive, trace preserving maps. This proves a recent conjecture of Müller–Lennert et al. Recently, Müller–Lennert et al. [12] and Wilde et al. [15] modified the traditional notion of relative Rényi entropy and showed that their new definition has several desirable properties of a relative entropy. One of th...
The entanglement Rényi-α entropy is an entanglement measure. It reduces to the standard entanglement of formation as α tends to 1. We derive analytical lower and upper bounds for the entanglement Rényi-α entropy of bipartite quantum systems of arbitrary dimension. We also demonstrate the application of our bounds in some concrete examples. Moreover, we establish the relation between the entanglement Rény...
We associate to the p-th Rényi entropy a definition of entropy power, which is the natural extension of Shannon's entropy power and exhibits a nice behaviour along solutions to the p-nonlinear heat equation in R^n. We show that the Rényi entropy power of general probability densities solving such equations is always a concave function of time, whereas it has a linear behaviour in correspondence ...
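For orientation, the classical Shannon case that this abstract generalizes can be written explicitly (standard background, not the paper's rescaled p-version, whose exact constants are as in the cited work): for a random vector X with density f on R^n,

$$ N(X) \;=\; \frac{1}{2\pi e}\,\exp\!\Big(\tfrac{2}{n}\,h(X)\Big), \qquad h(X) \;=\; -\int_{\mathbb{R}^n} f \log f \, dx, $$

and Costa's theorem states that t ↦ N(X + √t Z), with Z a standard Gaussian, is concave in t, i.e. the entropy power is concave along the (linear) heat flow. The result summarized above extends this concavity to the p-th Rényi entropy power along the p-nonlinear heat equation.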
Entropy is well known to be Schur concave on finite alphabets. Recently, the authors have strengthened the result by showing that for any pair of probability distributions P and Q with Q majorized by P , the entropy of Q is larger than the entropy of P by the amount of relative entropy D(P ||Q). This result applies to P and Q defined on countable alphabets. This paper shows the counterpart of t...
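The strengthened Schur concavity just described can be checked numerically (a sketch with an arbitrarily chosen example pair, not data from the paper): if Q is majorized by P, then H(Q) − H(P) ≥ D(P‖Q):

```python
import math

def entropy(p):
    """Shannon entropy in nats."""
    return -sum(x * math.log(x) for x in p if x > 0)

def kl(p, q):
    """Relative entropy D(p||q), assuming supp(p) ⊆ supp(q)."""
    return sum(x * math.log(x / y) for x, y in zip(p, q) if x > 0)

def majorizes(p, q):
    """True if p majorizes q: every partial sum of p's sorted-descending
    entries dominates the corresponding partial sum for q."""
    ps, qs = sorted(p, reverse=True), sorted(q, reverse=True)
    cp = cq = 0.0
    for a, b in zip(ps, qs):
        cp += a
        cq += b
        if cp < cq - 1e-12:
            return False
    return True

P = [0.7, 0.2, 0.1]
Q = [0.4, 0.35, 0.25]          # Q is majorized by P
assert majorizes(P, Q)
# Strengthened Schur concavity: H(Q) - H(P) >= D(P||Q)
assert entropy(Q) - entropy(P) >= kl(P, Q)
```

Ordinary Schur concavity gives only H(Q) ≥ H(P); the relative-entropy term quantifies the size of the gap.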