Search results for: rényi entropy rate
Number of results: 1,024,701
Properties of scalar quantization with rth power distortion and constrained Rényi entropy of order α ∈ (0, 1) are investigated. For an asymptotically (high-rate) optimal sequence of quantizers, the contribution to the Rényi entropy due to source values in a fixed interval is identified in terms of the “entropy density” of the quantizer sequence. This extends results related to the well-known po...
Two different distributions may have equal Rényi entropy; thus a distribution cannot be identified by its Rényi entropy. In this paper, we explore properties of the Rényi entropy of order statistics. Several characterizations are established based on the Rényi entropy of order statistics and record values. These include characterizations of a distribution on the basis of the differences between...
This paper introduces “swiveled Rényi entropies” as an alternative to the Rényi entropic quantities put forward in [Berta et al., Physical Review A 91, 022333 (2015)]. What distinguishes the swiveled Rényi entropies from the prior proposal of Berta et al. is that there is an extra degree of freedom: an optimization over unitary rotations with respect to particular fixed bases (swivels). A conse...
A hidden Markov process (HMP) is a discrete-time finite-state homogeneous Markov chain observed through a discrete-time memoryless invariant channel. The non-asymptotic equipartition property (NEP) is a bound on the probability that the sample entropy deviates from the entropy rate of a stochastic process, so it can be viewed as a refinement of the Shannon-McMillan-Breiman theorem. In this report, w...
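The entropy-rate concentration that the NEP quantifies can be illustrated numerically. The sketch below uses a fully observed two-state Markov chain rather than a hidden one, with an arbitrary illustrative transition matrix: it computes the entropy rate H = Σᵢ πᵢ H(P_{i·}) and checks that the sample entropy −(1/n) log p(X₁,…,Xₙ) of a long path concentrates near it.

```python
import numpy as np

# Entropy rate of a fully observed two-state Markov chain, compared with the
# sample entropy of one long trajectory (the quantity the NEP bounds).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])  # illustrative transition matrix

# Stationary distribution: left eigenvector of P for eigenvalue 1 (pi P = pi).
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi = pi / pi.sum()

# Entropy rate in nats: H = sum_i pi_i * H(P_i.)
H = float(-np.sum(pi[:, None] * P * np.log(P)))

# Sample entropy -(1/n) log p(X_1, ..., X_n) along a simulated path.
rng = np.random.default_rng(0)
n = 50_000
state, logp = 0, 0.0
for _ in range(n):
    nxt = 0 if rng.random() < P[state, 0] else 1
    logp += np.log(P[state, nxt])
    state = nxt
sample_entropy = -logp / n
```

For this chain π = (0.8, 0.2) and H ≈ 0.3947 nats; by the Shannon-McMillan-Breiman theorem the sample entropy converges to H as n grows.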
Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including it...
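As a concrete reference point, the Rényi divergence of order α between discrete distributions is D_α(P‖Q) = (1/(α−1)) log Σᵢ pᵢ^α qᵢ^{1−α}, and the limit α → 1 recovers the Kullback-Leibler divergence. A minimal sketch, assuming strictly positive finite distributions (the function name is illustrative):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(p || q) in nats for discrete distributions.

    Assumes p and q are strictly positive and sum to 1.  The limit
    alpha -> 1 is the Kullback-Leibler divergence.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        return float(np.sum(p * np.log(p / q)))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))
```

One of the properties reviewed in this line of work is that D_α is nondecreasing in α, so for any fixed pair of distributions D_{1/2} ≤ D_1 ≤ D_2.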
We consider optimal scalar quantization with rth power distortion and constrained Rényi entropy of order α. For sources with absolutely continuous distributions the high rate asymptotics of the quantizer distortion has long been known for α = 0 (fixed-rate quantization) and α = 1 (entropy-constrained quantization). These results have recently been extended to quantization with Rényi entropy cons...
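The regime these results describe can be probed empirically. The sketch below (an assumed uniform quantizer with midpoint codepoints on a standard Gaussian source; step size and sample count are arbitrary) measures the r = 2 power distortion and the order-α Rényi entropy of the quantizer output, which at high rate should track the classical Δ²/12 distortion and h(X) − log Δ entropy approximations:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy (nats) of a discrete distribution; alpha = 1 is Shannon."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log(p)))
    return float(np.log(np.sum(p**alpha)) / (1.0 - alpha))

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)                  # standard Gaussian source
step = 0.25                                   # quantizer cell width (illustrative)
xq = step * np.floor(x / step) + step / 2.0   # uniform quantizer, midpoint codepoints

distortion = float(np.mean((x - xq) ** 2))    # r = 2 power distortion
_, counts = np.unique(xq, return_counts=True)
p_cells = counts / counts.sum()               # empirical cell probabilities
```

For this step size the distortion is close to Δ²/12, the order-1 (Shannon) output entropy is close to h(X) − log Δ, and the Rényi entropies of the output are non-increasing in α.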
Rényi entropy and Rényi divergence have a long track record of usefulness in information theory and its applications. Alfred Rényi never got around to generalizing mutual information in a similar way. In fact, in the literature there are several possible ways to accomplish such generalization, most notably those suggested by Suguru Arimoto, Imre Csiszár, and Robin Sibson. We collect several...
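Of the three generalizations mentioned, Sibson's proposal has a closed form for discrete distributions: I_α(X;Y) = (α/(α−1)) log Σ_y (Σ_x P_X(x) P_{Y|X}(y|x)^α)^{1/α}. A minimal sketch; the channel matrix below is an arbitrary illustration:

```python
import numpy as np

def sibson_mi(p_x, W, alpha):
    """Sibson's Rényi mutual information of order alpha != 1, in nats.

    p_x: input distribution; W: channel matrix with W[x, y] = P(Y=y | X=x).
    """
    p_x = np.asarray(p_x, dtype=float)
    W = np.asarray(W, dtype=float)
    inner = np.sum(p_x[:, None] * W**alpha, axis=0) ** (1.0 / alpha)
    return float(alpha / (alpha - 1.0) * np.log(np.sum(inner)))

p_x = np.array([0.5, 0.5])
W = np.array([[0.9, 0.1],
              [0.2, 0.8]])
```

As α → 1 this recovers Shannon's mutual information, and it vanishes when X and Y are independent (all rows of W equal).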
A task is randomly drawn from a finite set of tasks and is described using a fixed number of bits. All the tasks that share its description must be performed. Upper and lower bounds on the minimum ρ-th moment of the number of performed tasks are derived. The case where a sequence of tasks is produced by a source and n tasks are jointly described using nR bits is considered. If R is larger than ...
Rényi entropy is an information-theoretic measure of randomness which is fundamental to several applications. Several estimators of Rényi entropy based on k-nearest neighbor (kNN) distances have been proposed in the literature. For d-dimensional densities f, the variance of these Rényi entropy estimators of f decays as O(1/M), where M is the sample size drawn from f. On the other hand, the bias...
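For orientation, one standard kNN construction of this kind is the Leonenko-Pronzato-Savani estimator. The sketch below implements it for 1-D samples with a brute-force neighbor search; the parameter choices are illustrative, and this is a sketch of one such estimator rather than of the specific estimators analyzed in the abstract:

```python
import math
import numpy as np

def knn_renyi_entropy_1d(x, q, k=10):
    """Leonenko-Pronzato-Savani kNN estimator of the Rényi entropy
    of order q != 1 for 1-D samples (brute-force neighbor search)."""
    x = np.asarray(x, dtype=float).ravel()
    M = x.size
    # Distance from each point to its k-th nearest neighbor.
    D = np.abs(x[:, None] - x[None, :])
    rho = np.sort(D, axis=1)[:, k]           # column 0 is the self-distance 0
    V_1 = 2.0                                 # volume of the unit ball in d = 1
    C_k = (math.gamma(k) / math.gamma(k + 1 - q)) ** (1.0 / (1.0 - q))
    zeta = (M - 1) * C_k * V_1 * rho          # d = 1, so rho**d == rho
    I_hat = np.mean(zeta ** (1.0 - q))        # estimates the integral of f**q
    return float(np.log(I_hat) / (1.0 - q))

rng = np.random.default_rng(1)
sample = rng.normal(size=2000)
est = knn_renyi_entropy_1d(sample, q=2.0)
truth = 0.5 * math.log(2 * math.pi) + 0.5 * math.log(2)  # order-2 entropy of N(0,1)
```

On N(0,1) samples the estimate should approach the true order-2 Rényi entropy ½ log(2π) + ½ log 2 ≈ 1.2655 nats as the sample size grows.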