Shannon's entropy revisited
Author
Abstract
I consider the effect of a finite sample size on the entropy of a sample of independent events. I propose a formula for entropy which satisfies Shannon's axioms, and which reduces to Shannon's entropy when the sample size is infinite. I discuss the physical meaning of the difference between the two formulas, including some practical implications, such as the maximum achievable channel utilization, and the minimum achievable communication protocol overhead, for a given message size.

The classic formula (Shannon, 1948)

$$H = -\sum_{i=1}^{M} p_i \ln p_i \qquad (1)$$

is accepted as a measure of entropy for a sample of independent events. Other measures, such as the (Renyi, 1960) and (Tsallis, 1988) entropies, and more (Sharma, 1975; Masi, 2005; Gorban, 2002), have been proposed for situations when events are correlated, e.g. in systems with long-range interactions. These alternative measures are generally expected to have properties (Wehrl, 1978) different from Shannon's entropy, such as non-additivity and convexity, while converging to (1) when correlations disappear. The underlying assumption is that when events are independent, formula (1) still gives the correct measure. Here I take a second look at the original premise that formula (1) and its equivalents in statistical (Landau, 1980) and quantum (Neumann, 1955) mechanics are the correct measures of entropy in the absence of correlations. I show that if the sample size is finite, there is an inherently smaller amount of information which can be encoded per event than is given by Shannon's formula. I propose a formula for entropy which takes the sample size into account and which reduces to (1) when the sample size is infinite.

Consider $N$ independent events, with $M$ possible outcomes for each event, having probabilities $p_i$, $i = 1, \dots, M$. Independence of the events implies that only the frequencies $n_i$ of the outcomes describe the sample, while the order of events is irrelevant. The probability density function for the sample is then given by the multinomial distribution:

$$P(n_1, \dots, n_M) = N! \prod_{i=1}^{M} \frac{p_i^{n_i}}{n_i!}, \qquad \sum_{i=1}^{M} n_i = N$$
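To make the finite-sample effect concrete, the sketch below compares Shannon's entropy (1) with the per-event information carried by the multinomial coefficient, $\frac{1}{N}\ln\frac{N!}{\prod_i n_i!}$, evaluated at the expected counts $n_i = p_i N$. This quantity follows directly from the multinomial distribution above; whether it coincides with the paper's proposed formula is an assumption made here for illustration, since the exact expression is not reproduced in this abstract. By Stirling's approximation it converges to (1) as $N \to \infty$. A minimal Python sketch:

import math

def shannon_entropy(p):
    # Shannon's entropy (1), in nats: H = -sum_i p_i * ln(p_i)
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def multinomial_entropy(counts):
    # Per-event information in the multinomial coefficient:
    # (1/N) * ln( N! / (n_1! * ... * n_M!) ), computed via lgamma
    # to avoid overflow for large N. Illustrative finite-sample
    # measure; not necessarily the paper's exact formula.
    n = sum(counts)
    log_coeff = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)
    return log_coeff / n

p = [0.5, 0.25, 0.25]
print(f"Shannon H       = {shannon_entropy(p):.6f} nats")
for n_events in (8, 64, 512, 4096, 32768):
    counts = [round(pi * n_events) for pi in p]  # expected counts n_i = p_i * N
    print(f"N = {n_events:5d}: H_N = {multinomial_entropy(counts):.6f} nats")

For every finite $N$ the multinomial value stays strictly below the Shannon value, since $\frac{N!}{\prod_i n_i!}\prod_i p_i^{n_i} \le 1$; this is consistent with the claim that fewer nats per event can be encoded in a finite sample, with the gap closing on the order of $\ln N / N$.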
Similar resources
AN APPLICATION OF TRAJECTORIES AMBIGUITY IN TWO-STATE MARKOV CHAIN
In this paper, the ambiguity of finite-state irreducible Markov chain trajectories is recalled and is obtained for a two-state Markov chain. I give an applicable example of this concept in a presidential election
A New Definition of the Entropy of General Probability Distributions Based on the Non-Standard Analysis
Based on the non-standard analysis by Robinson, a new definition of the entropy of general distributions is given as a straightforward extension of Shannon's discrete entropy. The newly defined entropy has natural properties such as positiveness and invariance under transformations, unlike Shannon's continuous entropy. In the light of this new definition, the meaning of Shannon's continuous entr...
Groundwater qanat potential mapping using frequency ratio and Shannon's entropy models in the Moghan watershed, Iran
The purpose of current study is to produce groundwater qanat potential map using frequency ratio (FR) and Shannon's entropy (SE) models in the Moghan watershed, Khorasan Razavi Province, Iran. The qanat is basically a horizontal, interconnected series of underground tunnels that accumulate and deliver groundwater from a mountainous source district, along a waterbearing formation (aquifer), and ...
An Analysis of Ministry of Education's Strategic Plans Based on Favorable Components of English Language Teaching Using Shannon's Entropy
The present research aims to analyze the content of Ministry of Education’s strategic plans (the Fundamental Reform Document of Education, the Comprehensive National Scientific Plan and the National Curriculum Document) based on Shannon's entropy regarding the favorable components of teaching English. The contents of the Fundamental Reform Document of Education, the Comprehensive National Scien...
A New Interpretation of the Shannon Entropy Measure
Although more than sixty years have elapsed since Shannon's seminal information entropy paper, the literature reveals that there are divergent opinions of what it actually measures. From its similarity to Boltzmann entropy in statistical mechanics, the most common view is that it measures information disorder and uncertainty. Based on an inductive derivation of the expression we propose a new in...
Journal: CoRR
Volume: abs/1504.01407
Pages: -
Published: 2015