Search results for: shannon entropy

Number of results: 72543

Journal: Int. J. Math. Mathematical Sciences, 2005
C. G. Chakrabarti, Indranil Chakrabarty

We have presented a new axiomatic derivation of Shannon entropy for a discrete probability distribution on the basis of the postulates of additivity and concavity of the entropy function. We have then modified Shannon entropy to take account of observational uncertainty. The modified entropy reduces, in the limiting case, to the form of Shannon differential entropy. As an application, we have de...
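
For reference, the discrete Shannon entropy referred to above is H(p) = -Σ p_i log p_i, and the differential entropy it reduces to in the limit is h(f) = -∫ f(x) log f(x) dx. A minimal Python sketch of the discrete form (the function name and the choice of base-2 logarithms are illustrative, not taken from the paper):

    import math

    def shannon_entropy(probs, base=2):
        # Discrete Shannon entropy H(p) = -sum_i p_i * log(p_i); zero-probability outcomes contribute nothing.
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # A fair coin carries exactly 1 bit of entropy; a biased coin carries less.
    print(shannon_entropy([0.5, 0.5]))  # 1.0
    print(shannon_entropy([0.9, 0.1]))  # ~0.469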

Journal: Entropy, 2015
Qiaoning Yang, Jianlin Wang

In practical applications, sensors are prone to failure because of harsh environments, battery drain, and sensor aging. Sensor fault location is an important step for follow-up sensor fault detection. In this paper, two new multi-level wavelet Shannon entropies (multi-level wavelet time Shannon entropy and multi-level wavelet time-energy Shannon entropy) are defined. They take full advantage of sen...
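
The two entropies above are defined on multi-level wavelet decompositions of the sensor signal. As a hedged illustration of the underlying idea only (the paper's multi-level wavelet time and time-energy Shannon entropies are defined differently and in more detail), the following Python sketch computes the Shannon entropy of the relative sub-band energies of a discrete wavelet decomposition; it assumes the PyWavelets package, and the wavelet name and level are arbitrary choices:

    import numpy as np
    import pywt  # PyWavelets, assumed available

    def wavelet_energy_entropy(signal, wavelet="db4", level=4):
        # Shannon entropy of the normalized energy distribution across wavelet sub-bands.
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        p = energies / energies.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Example: a noisy sine wave sampled at 1024 points.
    t = np.linspace(0, 1, 1024)
    x = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size)
    print(wavelet_energy_entropy(x))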

Journal: IEEE Trans. Information Theory, 2002
Dan A. Simovici, Szymon Jaroszewicz

The aim of this paper is to present an axiomatization of a generalization of Shannon’s entropy starting from partitions of finite sets. The proposed axiomatization yields the Havrda-Charvát entropy as a special case and thus provides axiomatizations for the Shannon entropy, the Gini index, and other types of entropy used in classification and data mining.
Keywords: Shannon entropy, Gini ind...
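
For concreteness, one common form of the Havrda-Charvát entropy is H_β(p) = (1 - Σ_i p_i^β) / (β - 1) for β ≠ 1; as β → 1 it tends to the Shannon entropy (in nats), and at β = 2 it equals the Gini index 1 - Σ_i p_i^2. The Python sketch below illustrates these special cases (normalization conventions vary across the literature, and this particular form is an assumption, not necessarily the one used in the paper):

    import math

    def havrda_charvat(probs, beta):
        # Havrda-Charvat entropy H_beta(p) = (1 - sum_i p_i**beta) / (beta - 1), beta != 1.
        return (1.0 - sum(p ** beta for p in probs)) / (beta - 1.0)

    def shannon_nats(probs):
        return -sum(p * math.log(p) for p in probs if p > 0)

    p = [0.5, 0.25, 0.25]
    print(havrda_charvat(p, 2.0))     # 0.625, the Gini index 1 - sum p_i^2
    print(havrda_charvat(p, 1.0001))  # ~1.0397, approaching the Shannon entropy in nats
    print(shannon_nats(p))            # ~1.0397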

Journal: IACR Cryptology ePrint Archive, 2014
Maciej Skorski

We provide a new result that links two crucial entropy notions: Shannon entropy H1 and collision entropy H2. Our formula gives the worst possible amount of collision entropy in a probability distribution when its Shannon entropy is fixed. Our results, and the techniques used in the proof, immediately imply many quantitatively tight separations between Shannon and smooth Rényi entropy, which were pre...
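
Here H1 is the Shannon entropy and H2 the collision (Rényi order-2) entropy, H2(p) = -log Σ_i p_i^2, with H2 ≤ H1 for every distribution. The gap the paper quantifies can be large; a minimal Python sketch (the example distribution is illustrative only):

    import math

    def shannon_h1(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def collision_h2(probs):
        # Collision (Renyi order-2) entropy: H2 = -log2(sum_i p_i^2).
        return -math.log2(sum(p * p for p in probs))

    # One heavy atom plus many light ones: H1 stays moderate while H2 is dragged down.
    n = 1024
    p = [0.5] + [0.5 / n] * n
    print(shannon_h1(p))    # 6.0 bits
    print(collision_h2(p))  # ~2.0 bits, illustrating H2 <= H1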

Journal: International Journal of Automotive Engineering
Z. Baniamerian

This paper concentrates on a new procedure which experimentally recognises gear and bearing faults of a typical gearbox system using a least squares support vector machine (LSSVM). Two wavelet selection criteria, the maximum energy to Shannon entropy ratio and the maximum relative wavelet energy, are used and compared to select an appropriate wavelet for feature extraction. The fault diagnosis method co...
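
As a rough illustration of the first criterion, the energy-to-Shannon-entropy ratio of a candidate wavelet can be taken as the total energy of its decomposition coefficients divided by the Shannon entropy of their normalized energies, and the wavelet maximizing the ratio is selected. The Python sketch below is a generic form of this idea, not the paper's exact formulation; it assumes the PyWavelets package, and the candidate wavelets and test signal are placeholders:

    import numpy as np
    import pywt  # PyWavelets, assumed available

    def energy_to_entropy_ratio(signal, wavelet, level=3):
        # Total coefficient energy divided by the Shannon entropy of the normalized coefficient energies.
        coeffs = np.concatenate(pywt.wavedec(signal, wavelet, level=level))
        energy = np.sum(coeffs ** 2)
        p = coeffs ** 2 / energy
        p = p[p > 0]
        return energy / -np.sum(p * np.log2(p))

    # Rank a few candidate wavelets on a vibration-like test signal.
    t = np.linspace(0, 1, 2048)
    x = np.sin(2 * np.pi * 30 * t) * (1 + 0.5 * np.sin(2 * np.pi * 5 * t)) + 0.05 * np.random.randn(t.size)
    for w in ("db4", "db8", "sym5", "coif3"):
        print(w, energy_to_entropy_ratio(x, w))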

Journal: International Journal of Automotive Engineering
M. Heidari, H. Homaei, H. Golestanian, A. Heidari

This paper concentrates on a new procedure which experimentally recognises gear and bearing faults of a typical gearbox system using a least squares support vector machine (LSSVM). Two wavelet selection criteria, the maximum energy to Shannon entropy ratio and the maximum relative wavelet energy, are used and compared to select an appropriate wavelet for feature extraction. The fault diagnosis method co...

2016
John Preskill

Contents
10 Quantum Shannon Theory
10.1 Shannon for Dummies
10.1.1 Shannon entropy and data compression
10.1.2 Joint typicality, conditional entropy, and mutual information
10.1.3 Distributed source coding
10.1.4 The noisy channel coding theorem
10.2 Von Neumann Entropy
10.2.1 Mathematical properties of H(ρ)
10.2.2 Mixing, measurement, and entropy
10.2.3 Strong subadditivit...
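
The von Neumann entropy featured in section 10.2 is H(ρ) = -Tr(ρ log ρ), which reduces to the Shannon entropy of the eigenvalues of the density matrix ρ. A minimal Python sketch (base-2 logarithms and the tolerance for discarding numerically zero eigenvalues are arbitrary choices):

    import numpy as np

    def von_neumann_entropy(rho):
        # H(rho) = -Tr(rho log2 rho), computed from the eigenvalues of the Hermitian matrix rho.
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]
        return -np.sum(evals * np.log2(evals))

    # A maximally mixed qubit has 1 bit of entropy; a pure state has none.
    print(von_neumann_entropy(np.eye(2) / 2))                       # 1.0
    print(von_neumann_entropy(np.array([[1.0, 0.0], [0.0, 0.0]])))  # 0.0 (may display as -0.0)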

The Kolmogorov-Sinai entropy is a far-reaching dynamical generalization of the Shannon entropy of information systems. This entropy works perfectly for probability measure preserving (p.m.p.) transformations. However, it is not useful when there is no finite invariant measure. There are certain successful extensions of the notion of entropy to infinite measure spaces, or transformations with ...
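
For reference, a standard textbook definition (not taken from the abstract above) of the Kolmogorov-Sinai entropy of a p.m.p. transformation T of (X, μ), over finite measurable partitions P, is

    h_\mu(T) = \sup_{P} \lim_{n \to \infty} \frac{1}{n} \, H_\mu\Big( \bigvee_{i=0}^{n-1} T^{-i} P \Big),
    \qquad
    H_\mu(P) = -\sum_{A \in P} \mu(A) \log \mu(A),

which uses the Shannon entropy H_μ(P) of a single partition as its building block.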

2008
Shigeru Furuichi

Shannon entropy [1] is one of the fundamental quantities in classical information theory and is uniquely determined by the Shannon-Khinchin axiom or the Faddeev axiom. One-parameter extensions of Shannon entropy have been studied by many researchers. The Rényi entropy [2] and the Tsallis entropy [3] are famous examples. In the paper [4], the uniqueness theorem for the Tsallis entropy was proved. Also, in our...
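
Both one-parameter families recover the Shannon entropy in the limit as the parameter tends to 1 (the Tsallis entropy coincides with the Havrda-Charvát form sketched earlier). A minimal Python sketch of the Rényi entropy and this limiting behaviour, using the natural-logarithm convention (the function name and example distribution are illustrative):

    import math

    def renyi_entropy(probs, q):
        # Renyi entropy H_q(p) = log(sum_i p_i**q) / (1 - q), q != 1.
        return math.log(sum(p ** q for p in probs)) / (1.0 - q)

    def shannon_nats(probs):
        return -sum(p * math.log(p) for p in probs if p > 0)

    p = [0.7, 0.2, 0.1]
    for q in (0.5, 0.99, 0.999):
        print(q, renyi_entropy(p, q))  # approaches the Shannon entropy as q -> 1
    print(shannon_nats(p))             # ~0.8018 nats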

Journal: CoRR, 2008
Shigeru Furuichi

Shannon entropy [1] is one of the fundamental quantities in classical information theory and is uniquely determined by the Shannon-Khinchin axiom or the Faddeev axiom. One-parameter extensions of Shannon entropy have been studied by many researchers [2]. The Rényi entropy [3] and the Tsallis entropy [4] are famous examples. In the paper [5], the uniqueness theorem for the Tsallis entropy was proven. See also ...
