Search results for: information entropy

Number of results: 1203337

1997
Vincent van de Laar W. Bastiaan Kleijn Ed F. Deprettere

We estimated the perceptual entropy rate of the phonemes of American English and found that the upper limit of the perceptual entropy of voiced phonemes is approximately 1.4 bit/sample, whereas the perceptual entropy of unvoiced phonemes is approximately 0.9 bit/sample. Results indicate that a simple voiced/unvoiced classification is suboptimal when trying to minimize bit rate. We used two differ...

2008
Shigeru Furuichi

Shannon entropy [1] is one of the fundamental quantities in classical information theory and is uniquely determined by the Shannon-Khinchin axioms or the Faddeev axiom. One-parameter extensions of Shannon entropy have been studied by many researchers; the Rényi entropy [2] and the Tsallis entropy [3] are well-known examples. In the paper [4], the uniqueness theorem for the Tsallis entropy was proved. Also, in our...
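The abstract above names the two standard one-parameter extensions of Shannon entropy. As a minimal illustration (the distribution `p` and the orders `q` below are arbitrary choices for demonstration, not taken from the paper), both families recover the Shannon entropy in the limit q → 1:

```python
import math

def shannon_entropy(p):
    """Shannon entropy in nats: -sum p_i * log(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, q):
    """Rényi entropy of order q (q != 1): log(sum p_i^q) / (1 - q)."""
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def tsallis_entropy(p, q):
    """Tsallis entropy of order q (q != 1): (1 - sum p_i^q) / (q - 1)."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.25, 0.25]  # an arbitrary example distribution
# As q approaches 1, both generalized entropies converge to Shannon entropy.
for q in (0.9, 0.99, 0.999):
    print(q, shannon_entropy(p), renyi_entropy(p, q), tsallis_entropy(p, q))
```

Numerically, the gap between either generalized entropy and the Shannon value shrinks toward zero as q approaches 1 from either side.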

Journal: :CoRR 2008
Shigeru Furuichi

Shannon entropy [1] is one of the fundamental quantities in classical information theory and is uniquely determined by the Shannon-Khinchin axioms or the Faddeev axiom. One-parameter extensions of Shannon entropy have been studied by many researchers [2]; the Rényi entropy [3] and the Tsallis entropy [4] are well-known examples. In the paper [5], the uniqueness theorem for the Tsallis entropy was proven. See also ...

Journal: :Entropy 2017
Vijay P. Singh Bellie Sivakumar Huijuan Cui

Water engineering is an amalgam of engineering (e.g., hydraulics, hydrology, irrigation, ecosystems, environment, water resources) and non-engineering (e.g., social, economic, political) aspects that are needed for planning, designing and managing water systems. These aspects and the associated issues have been dealt with in the literature using different techniques that are based on different ...

2008
Oliver Johnson Christophe Vignat

We consider the Student-t and Student-r distributions, which maximise Rényi entropy under a covariance condition. We show that they have information-theoretic properties which mirror those of the Gaussian distributions, which maximise Shannon entropy under the same condition. We introduce a convolution which preserves the Rényi maximising family, and show that the Rényi maximi...

2008
Ambedkar Dukkipati Shalabh Bhatnagar

As additivity is a characteristic property of the classical information measure, Shannon entropy, pseudo-additivity of the form x ⊕_q y = x + y + (1 − q)xy is a characteristic property of Tsallis entropy. Rényi in [1] generalized Shannon entropy by means of Kolmogorov-Nagumo averages, by imposing additivity as a constraint. In this paper we show that there exists no generalization for Tsallis entropy, b...
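The pseudo-additivity quoted above holds exactly for the Tsallis entropy of two independent systems. A short numeric check (the distributions `pA`, `pB` and the order `q` are illustrative choices, not from the paper):

```python
def tsallis(p, q):
    """Tsallis entropy of order q (q != 1): (1 - sum p_i^q) / (q - 1)."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

pA = [0.6, 0.4]        # arbitrary example distribution for system A
pB = [0.7, 0.2, 0.1]   # arbitrary example distribution for system B
q = 1.5

# Joint distribution of two independent systems: product of marginals.
joint = [a * b for a in pA for b in pB]

# Pseudo-additivity: S_q(A,B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B).
lhs = tsallis(joint, q)
rhs = tsallis(pA, q) + tsallis(pB, q) + (1 - q) * tsallis(pA, q) * tsallis(pB, q)
print(abs(lhs - rhs))
```

The identity follows because sum_(i,j) (a_i b_j)^q factorizes into (sum_i a_i^q)(sum_j b_j^q), so the check holds to floating-point precision for any q ≠ 1.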

Journal: :Journal of Physics A: Mathematical and General 2000

Journal: :Journal of Geographical Systems 2014
