Search results for: shannon entropy

Number of results: 72543

In this paper, we first derive a family of maximum Tsallis entropy distributions under optional side conditions on the mean income and the Gini index. Corresponding to these distributions, a family of Lorenz curves compatible with the optional side conditions is generated. We also show that our results reduce to Shannon entropy as $\beta$ tends to one. Finally, by using ac...
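The limit mentioned in this abstract, Tsallis entropy reducing to Shannon entropy as the parameter tends to one, can be sketched numerically. This is a minimal illustration, not the paper's method; the distribution p and the parameter value are made up for demonstration, and the Tsallis parameter is written q here (the abstract calls it β):

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1), for q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def shannon_entropy(p):
    """Shannon entropy in nats: -sum p_i ln p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]  # hypothetical income-share distribution
# As q -> 1, the Tsallis entropy approaches the Shannon entropy.
print(tsallis_entropy(p, 1.0001))  # close to shannon_entropy(p)
print(shannon_entropy(p))
```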

2007
Ambedkar Dukkipati Shalabh Bhatnagar M. Narasimha Murty

Shannon entropy of a probability measure $P$, defined as $-\int_X \frac{dP}{d\mu} \ln \frac{dP}{d\mu} \, d\mu$ on a measure space $(X, \mathcal{M}, \mu)$, is not a natural extension from the discrete case. However, maximum entropy (ME) prescriptions of the Shannon entropy functional in the measure-theoretic case are consistent with those for the discrete case. Also it is well known that Kullback-Leibler relative entropy can be extended natural...
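In the discrete case the integral above becomes a sum, and the Kullback-Leibler relative entropy mentioned at the end of the abstract relates to Shannon entropy against a uniform reference. A minimal sketch under that assumption (the distribution p is made up for illustration):

```python
import math

def shannon_entropy(p):
    """Discrete Shannon entropy -sum p_i ln p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy D(P||Q) = sum p_i ln(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
uniform = [1 / 3] * 3
# Against the uniform reference, D(P||U) = ln(n) - H(P), so relative
# entropy recovers Shannon entropy up to a constant.
print(kl_divergence(p, uniform))
print(math.log(3) - shannon_entropy(p))  # same value
```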

Journal: :CoRR 2018
Chun-Wang Ma Yu-Gang Ma

The general idea of information entropy provided by C.E. Shannon "hangs over everything we do" and can be applied to a great variety of problems once the connection between a distribution and the quantities of interest is found. The Shannon information entropy essentially quantifies the information of a quantity via its specific distribution, for which the information entropy based methods have ...

Journal: :Entropy 2018
Peichao Gao Zhilin Li Hong Zhang

The quality of an image affects its utility, and image quality assessment has been a hot research topic for many years. One widely used measure for image quality assessment is Shannon entropy, which has a well-established information-theoretic basis. The value of this entropy can be interpreted as the amount of information. However, Shannon entropy is badly adapted to information measurement in ...
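The standard way to apply Shannon entropy to an image, as discussed in this abstract, is to compute it over the gray-level histogram. A minimal sketch (the tiny pixel array is invented for illustration; real usage would take pixels from an image library):

```python
import math
from collections import Counter

def image_entropy(pixels):
    """Shannon entropy (in bits) of the gray-level histogram of an
    image, given as a flat sequence of integer pixel values."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A hypothetical 4x4 image with two equally likely gray levels:
flat = 8 * [0] + 8 * [255]
print(image_entropy(flat))  # 1.0 bit
```

More distinct, evenly spread gray levels yield higher entropy, which is why the value is read as "amount of information" in the image.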

Journal: :International Journal of Semantic Computing 2013

B. Afhami M. Madadi

In this paper, we derive exact analytical expressions for the Shannon entropy of generalized order statistics from Pareto-type and related distributions.

2013
Jean Honorio Tommi S. Jaakkola

We provide a method that approximates the Bayes error rate and the Shannon entropy with high probability. The Bayes error rate approximation makes it possible to build a classifier that polynomially approaches the Bayes error rate. The Shannon entropy approximation provides provable performance guarantees for learning trees and Bayesian networks from continuous variables. Our results rely on some reas...

2004
V. Man'ko V. I. Man'ko Lebedev

The probability representation entropy (tomographic entropy) of an arbitrary quantum state is introduced. Using the property of the spin tomogram of being a standard probability distribution function, the notion of tomographic entropy is discussed. The relation of the tomographic entropy to the Shannon entropy and the von Neumann entropy is elucidated.

Journal: :Statistical applications in genetics and molecular biology 2011
Mariza de Andrade Xin Wang

In the past few years, several entropy-based tests have been proposed for testing either single SNP association or gene-gene interaction. These tests are mainly based on Shannon entropy and have higher statistical power when compared to standard χ² tests. In this paper, we extend some of these tests using a more generalized entropy definition, Rényi entropy, where Shannon entropy is a special c...
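The generalization this abstract refers to, with Shannon entropy as a special case of Rényi entropy, can be seen by taking the order α toward 1. A minimal sketch, not the paper's test statistic; the genotype frequencies are hypothetical:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha = ln(sum p_i^alpha) / (1 - alpha), in nats,
    for alpha != 1."""
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def shannon_entropy(p):
    """Shannon entropy in nats: -sum p_i ln p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

genotype_freqs = [0.25, 0.5, 0.25]  # hypothetical SNP genotype frequencies
print(renyi_entropy(genotype_freqs, 2.0))    # order-2 (collision) entropy
# As alpha -> 1, the Rényi entropy approaches the Shannon entropy:
print(renyi_entropy(genotype_freqs, 1.001))  # close to shannon_entropy(...)
print(shannon_entropy(genotype_freqs))
```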

Journal: :The Journal of chemical physics 2007
Shubin Liu

An analytical relationship between the densities of the Shannon entropy and Fisher information for atomic and molecular systems has been established in this work. Two equivalent forms of the Fisher information density are introduced as well. It is found that for electron densities of atoms and molecules the Shannon entropy density is intrinsically related to the electron density and the two for...

Chart of the number of search results per year

Click on the chart to filter the results by publication year