Search results for: conditional rényi entropy

Number of results: 128234

2015
Sergio Verdú

Rényi entropy and Rényi divergence evidence a long track record of usefulness in information theory and its applications. Alfred Rényi never got around to generalizing mutual information in a similar way. In fact, in the literature there are several possible ways to accomplish such generalization, most notably those suggested by Suguru Arimoto, Imre Csiszár, and Robin Sibson. We collect several...
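
For reference (standard definitions, not quoted from the abstract above): the Rényi entropy of order \( \alpha \) of a discrete random variable \( X \) with distribution \( P_X \) is

\[ H_\alpha(X) = \frac{1}{1-\alpha}\,\log \sum_{x} P_X(x)^{\alpha}, \qquad \alpha > 0,\ \alpha \neq 1, \]

which recovers Shannon entropy as \( \alpha \to 1 \). One of the candidate generalizations of mutual information surveyed in this line of work, usually attributed to Sibson, can be written as

\[ I_\alpha(X;Y) = \frac{\alpha}{\alpha-1}\,\log \sum_{y} \Big( \sum_{x} P_X(x)\, P_{Y|X}(y \mid x)^{\alpha} \Big)^{1/\alpha}. \]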

Journal: IEEE Trans. Information Theory 2014
Christoph Bunte, Amos Lapidoth

A task is randomly drawn from a finite set of tasks and is described using a fixed number of bits. All the tasks that share its description must be performed. Upper and lower bounds on the minimum ρ-th moment of the number of performed tasks are derived. The case where a sequence of tasks is produced by a source and n tasks are jointly described using nR bits is considered. If R is larger than ...
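
A hedged sketch of how results of this kind are usually stated (the precise conditions and constants are in the paper and are not reproduced here): when the tasks are produced by a source \( X \) and described at rate \( R \) bits per task, the Rényi entropy rate of order \( \frac{1}{1+\rho} \) acts as the threshold for the \( \rho \)-th moment of the number \( L \) of performed tasks,

\[ \mathbb{E}\big[L^{\rho}\big] \to 1 \ \text{ if } R > H_{\frac{1}{1+\rho}}(X), \qquad \mathbb{E}\big[L^{\rho}\big] \to \infty \ \text{ if } R < H_{\frac{1}{1+\rho}}(X). \]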

Journal: IACR Cryptology ePrint Archive 2013
Mitsugu Iwamoto, Junji Shikata

In this paper, information theoretic cryptography is discussed based on conditional Rényi entropies. Our discussion focuses not only on cryptography but also on the definitions of conditional Rényi entropies and the related information theoretic inequalities. First, we revisit conditional Rényi entropies, and clarify what kind of properties are required and actually satisfied. Then, we propose ...
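
One definition such papers revisit is Arimoto's conditional Rényi entropy, stated here for discrete \( X, Y \) as a point of reference (it is only one of several inequivalent proposals compared in this literature):

\[ H_\alpha^{\mathrm{A}}(X \mid Y) = \frac{\alpha}{1-\alpha}\,\log \sum_{y} P_Y(y) \Big( \sum_{x} P_{X \mid Y}(x \mid y)^{\alpha} \Big)^{1/\alpha}, \qquad \alpha > 0,\ \alpha \neq 1. \]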

Journal: Inf. Sci. 2007
Ambedkar Dukkipati, Shalabh Bhatnagar, M. Narasimha Murty

The measure-theoretic definition of Kullback-Leibler relative-entropy (or simply KL-entropy) plays a basic role in defining various classical information measures on general spaces. Entropy, mutual information and conditional forms of entropy can be expressed in terms of KL-entropy and hence properties of their measure-theoretic analogs will follow from those of measure-theoretic KL-entropy. The...
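
The identities alluded to read, in the discrete case (the paper develops their measure-theoretic analogues),

\[ H(X) = \log |\mathcal{X}| - D\big(P_X \,\big\|\, U_{\mathcal{X}}\big), \qquad I(X;Y) = D\big(P_{XY} \,\big\|\, P_X \otimes P_Y\big), \qquad H(X \mid Y) = H(X) - I(X;Y), \]

where \( U_{\mathcal{X}} \) is the uniform distribution on the finite alphabet \( \mathcal{X} \).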

Journal: Journal of High Energy Physics 2013

Journal: Entropy 2016
Bernhard C. Geiger, Gernot Kubin

The information loss in deterministic, memoryless systems is investigated by evaluating the conditional entropy of the input random variable given the output random variable. It is shown that for a large class of systems the information loss is finite, even if the input has a continuous distribution. For systems with infinite information loss, a relative measure is defined and shown to be relat...
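
A minimal numerical sketch of the quantity under study (a hypothetical example, not taken from the paper): for a deterministic, memoryless map \( Y = g(X) \) of a discrete input, \( H(Y \mid X) = 0 \), so the information loss \( H(X \mid Y) \) equals \( H(X) - H(Y) \).

import numpy as np

# Hypothetical illustration (not from the paper): for Y = g(X) deterministic
# and memoryless, the information loss is H(X|Y) = H(X) - H(Y).
def entropy_bits(p):
    """Shannon entropy in bits of a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Example input: X uniform on {-2, -1, 1, 2}; non-injective map g(x) = |x|.
support = np.array([-2, -1, 1, 2])
p_x = np.full(4, 0.25)

# Push P_X through g to obtain P_Y for Y = |X|.
y_vals, idx = np.unique(np.abs(support), return_inverse=True)
p_y = np.zeros(len(y_vals))
np.add.at(p_y, idx, p_x)

loss = entropy_bits(p_x) - entropy_bits(p_y)   # H(X|Y) in bits
print(loss)  # 1.0: the sign of X cannot be recovered from Y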

2013
Bernhard C. Geiger, Gernot Kubin

In this work the information loss in deterministic, memoryless systems is investigated by evaluating the conditional entropy of the input random variable given the output random variable. It is shown that for a large class of systems the information loss is finite, even if the input is continuously distributed. Based on this finiteness, the problem of perfectly reconstructing the input is addre...

Journal: CoRR 2005
Ambedkar Dukkipati, M. Narasimha Murty, Shalabh Bhatnagar

By replacing linear averaging in Shannon entropy with Kolmogorov-Nagumo average (KN-averages) or quasilinear mean and further imposing the additivity constraint, Rényi proposed the first formal generalization of Shannon entropy. Using this recipe of Rényi, one can prepare only two information measures: Shannon and Rényi entropy. Indeed, using this formalism Rényi characterized these additive en...
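
Concretely, the recipe replaces the arithmetic average of the surprisal \( -\log P_X(x) \) in Shannon entropy by a Kolmogorov–Nagumo (quasilinear) mean taken with respect to a strictly monotone function \( \varphi \),

\[ H_\varphi(X) = \varphi^{-1}\!\Big( \sum_{x} P_X(x)\, \varphi\big(-\log P_X(x)\big) \Big); \]

imposing additivity forces \( \varphi \) to be, up to affine transformations, either linear (giving Shannon entropy) or \( \varphi(t) = e^{(1-\alpha)t} \), which yields the Rényi entropy \( H_\alpha(X) = \frac{1}{1-\alpha}\log \sum_x P_X(x)^{\alpha} \).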

Journal: CoRR 2018
Igal Sason, Sergio Verdú

This paper provides upper and lower bounds on the optimal guessing moments of a random variable taking values on a finite set when side information may be available. These moments quantify the number of guesses required for correctly identifying the unknown object and, similarly to Arikan’s bounds, they are expressed in terms of the Arimoto-Rényi conditional entropy. Although Arikan’s bounds ar...
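
For context, Arikan's bounds in their usual form (the paper above refines and extends them): if \( X \) takes at most \( M \) values and \( G(X \mid Y) \) is the number of guesses an optimal guesser needs when \( Y \) is observed, then for \( \rho > 0 \)

\[ \big(1 + \ln M\big)^{-\rho} \exp\!\Big( \rho\, H^{\mathrm{A}}_{\frac{1}{1+\rho}}(X \mid Y) \Big) \;\le\; \mathbb{E}\Big[ G(X \mid Y)^{\rho} \Big] \;\le\; \exp\!\Big( \rho\, H^{\mathrm{A}}_{\frac{1}{1+\rho}}(X \mid Y) \Big), \]

where \( H^{\mathrm{A}}_{\alpha} \) denotes the Arimoto–Rényi conditional entropy measured in nats.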

2011
Ronald Cramer, Serge Fehr

These lecture notes introduce some basic concepts from Shannon’s information theory, such as (conditional) Shannon entropy, mutual information, and Rényi entropy, as well as a number of basic results involving these notions. Subsequently, well-known bounds on perfectly secure encryption, source coding (i.e. data compression), and reliable communication over unreliable channels are discussed. We...
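
The classical bound on perfectly secure encryption, presumably among those covered, is Shannon's: if the ciphertext \( C \) reveals nothing about the message \( M \) (that is, \( I(M;C) = 0 \)) while \( M \) is recoverable from \( C \) and the key \( K \), then

\[ H(K) \ \ge\ H(M), \]

i.e. the key must carry at least as much entropy as the message; the one-time pad attains this with equality for uniformly distributed messages.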
