Search results for: shannon entropy

Number of results: 72543

The purpose of this study is to define the concepts of Tsallis entropy and conditional Tsallis entropy of fuzzy partitions and to obtain some results concerning this kind of entropy. We show that the Tsallis entropy of fuzzy partitions has the subadditivity and concavity properties. We study this information measure under the refinement and zero mode subset relations. We check the chain rules for ...

Journal: WSEAS Transactions on Systems, 2020
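
As a rough illustration of the quantity studied above, the sketch below computes the Tsallis entropy of a partition from its block probabilities. The weights p and the parameter q are illustrative assumptions, and the reduction of a fuzzy partition to such weights (for instance by integrating membership functions) follows the usual convention rather than this paper's exact definitions.

import numpy as np

def tsallis_entropy(p, q):
    # Tsallis entropy H_q(p) = (1 - sum_i p_i**q) / (q - 1); tends to the Shannon entropy as q -> 1
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # ignore zero-probability blocks
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))  # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Assumed block probabilities of a partition (for a fuzzy partition these would come
# from the membership functions).
p = [0.5, 0.3, 0.2]
print(tsallis_entropy(p, q=2.0))
print(tsallis_entropy(p, q=1.0))       # Shannon entropy of the same partition, for comparison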

Journal: Entropy, 2014
Martin Pfleger, Thomas Wallek, Andreas Pfennig

Thermodynamic modeling of extensive systems usually implicitly assumes the additivity of entropy. Furthermore, if this modeling is based on the concept of Shannon entropy, additivity of the latter function must also be guaranteed. In this case, the constituents of a thermodynamic system are treated as subsystems of a compound system, and the Shannon entropy of the compound system must be subjec...
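
The additivity the abstract refers to can be checked numerically for independent subsystems: the Shannon entropy of the joint (compound) distribution equals the sum of the subsystem entropies. The two marginal distributions below are assumed for illustration only.

import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Assumed marginal distributions of two independent subsystems.
p_a = np.array([0.6, 0.4])
p_b = np.array([0.2, 0.3, 0.5])

p_joint = np.outer(p_a, p_b).ravel()   # joint distribution of the compound system under independence

print(shannon_entropy(p_joint))                     # entropy of the compound system
print(shannon_entropy(p_a) + shannon_entropy(p_b))  # sum of the subsystem entropies (equal)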

Journal: Axioms, 2017
Sonja Jäckle, Karsten Keller

The Tsallis entropy, defined for a positive parameter α, can be considered a generalization of the classical Shannon entropy. For the latter, corresponding to α = 1, there exist many axiomatic characterizations. One of them, based on the well-known Khinchin-Shannon axioms, has been simplified several times and adapted to Tsallis entropy, where the axiom of (generalized) Shannon additivity is playi...
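
A quick numerical sketch of two standard facts related to the abstract: the Tsallis entropy approaches the Shannon entropy as α → 1, and for independent systems it satisfies the pseudo-additivity S_α(A,B) = S_α(A) + S_α(B) + (1 − α)·S_α(A)·S_α(B). The distributions and the value of α below are assumptions, and the pseudo-additivity shown is the usual Tsallis identity rather than necessarily the exact axiom used in the paper.

import numpy as np

def tsallis_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))        # Shannon entropy, the alpha = 1 case
    return (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)

p_a = np.array([0.7, 0.3])                   # assumed distributions
p_b = np.array([0.1, 0.4, 0.5])
alpha = 1.5

# alpha close to 1 approaches the Shannon entropy
print(tsallis_entropy(p_a, 1.0001), tsallis_entropy(p_a, 1.0))

# pseudo-additivity for independent systems:
# S_alpha(A,B) = S_alpha(A) + S_alpha(B) + (1 - alpha) * S_alpha(A) * S_alpha(B)
s_a, s_b = tsallis_entropy(p_a, alpha), tsallis_entropy(p_b, alpha)
s_ab = tsallis_entropy(np.outer(p_a, p_b).ravel(), alpha)
print(s_ab, s_a + s_b + (1 - alpha) * s_a * s_b)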

Journal: CoRR, 2015
Hélio Magalhães de Oliveira

This paper reports a new reading of wavelets, based on the classical de Broglie principle. The wave-particle duality principle is adapted to wavelets. Every continuous basic wavelet is associated with a proper probability density, allowing the Shannon entropy of a wavelet to be defined. Further entropy definitions are considered, such as the Jumarie or Rényi entropy of wavelets. We proved tha...
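
To make the idea of the Shannon entropy of a wavelet concrete, the sketch below associates the Mexican-hat wavelet with a density proportional to the square of the wavelet (a common convention; the paper's exact association may differ) and evaluates the differential Shannon and Rényi entropies numerically on a grid.

import numpy as np

def ricker(t):
    # Mexican-hat (Ricker) wavelet, up to normalization
    return (1.0 - t**2) * np.exp(-t**2 / 2.0)

t = np.linspace(-10.0, 10.0, 20001)
dt = t[1] - t[0]

psi2 = ricker(t) ** 2
density = psi2 / (psi2.sum() * dt)            # density assumed proportional to |psi(t)|^2, normalized to 1

log_density = np.zeros_like(density)
np.log(density, out=log_density, where=density > 0)

shannon = -(density * log_density).sum() * dt                    # -integral f log f
alpha = 2.0
renyi = np.log((density ** alpha).sum() * dt) / (1.0 - alpha)    # (1/(1-alpha)) log integral f^alpha

print(shannon, renyi)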

Shannon entropy is increasingly used in many applications. In this article, an estimator of the entropy of a continuous random variable is proposed. Consistency and scale invariance of the variance and mean squared error of the proposed estimator are proved, and comparisons are then made with the entropy estimators of Vasicek (1976), van Es (1992), Ebrahimi et al. (1994) and Correa (1995). A simulation st...
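
The abstract does not give the proposed estimator's formula, but the Vasicek (1976) spacing estimator it is compared against is standard: sort the sample and average the logarithms of scaled m-spacings. The sketch below follows that standard form; the window size m and the test sample are assumptions.

import numpy as np

def vasicek_entropy(sample, m):
    # Vasicek (1976): (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),
    # with order statistics clamped at the boundaries.
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]     # x_(i+m), clamped to the largest observation
    lower = x[np.maximum(idx - m, 0)]         # x_(i-m), clamped to the smallest observation
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))

rng = np.random.default_rng(0)
sample = rng.standard_normal(1000)            # assumed test sample
print(vasicek_entropy(sample, m=10))          # the true value for N(0,1) is about 1.419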

Journal: CoRR, 2014
Jaejun Lee, Taeseon Yun

This paper suggests an effective method for facial recognition using fuzzy theory and Shannon entropy. The combination of fuzzy theory and Shannon entropy avoids the complications of other methods. Shannon entropy calculates the ratio of an element between faces, and fuzzy theory calculates the membership of that entropy relative to 1. More details are given in Section 3. The learning performanc...
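
The abstract's description is terse; purely as a generic illustration of the two ingredients it names, the sketch below computes the Shannon entropy of a normalized ratio histogram between two face feature vectors and passes the result through a simple fuzzy membership function. The feature vectors, histogram binning, and triangular membership are all assumptions, not the paper's method.

import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def triangular_membership(x, a, b, c):
    # Simple fuzzy membership peaking at b on [a, c] -- an assumed choice, not the paper's.
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

rng = np.random.default_rng(1)
face_a = rng.random(256)                      # assumed feature vectors for two faces
face_b = rng.random(256)

ratio = face_a / (face_a + face_b)            # per-element ratio between the two faces
hist, _ = np.histogram(ratio, bins=16)
p = hist / hist.sum()

h = shannon_entropy(p)
print(h, triangular_membership(h, a=0.0, b=4.0, c=8.0))   # entropy and its fuzzy membership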

2015
Ahmad Beirami, Robert Calderbank, Ken Duffy, Muriel Médard

Guesswork forms the mathematical framework for quantifying computational security subject to brute-force determination by query. In this paper, we consider guesswork subject to a per-symbol Shannon entropy budget. We introduce the inscrutability rate to quantify the asymptotic difficulty of guessing U out of V secret strings drawn from the string-source and prove that the inscrutability rate of any...
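
Guesswork itself has a simple operational definition: query candidate strings in decreasing order of probability and count the queries until the secret is found. The sketch below computes the expected guesswork of length-n strings from an i.i.d. source; the alphabet distribution and n are assumptions, and the paper's per-symbol entropy budget and inscrutability rate are not reproduced here.

import itertools
import numpy as np

def expected_guesswork(symbol_probs, n):
    # E[G] = sum_k k * p_(k), where the length-n strings are queried in decreasing order of probability.
    probs = [np.prod([symbol_probs[s] for s in word])
             for word in itertools.product(range(len(symbol_probs)), repeat=n)]
    probs = np.sort(np.asarray(probs))[::-1]   # most likely strings are guessed first
    return np.sum((np.arange(len(probs)) + 1) * probs)

p = [0.7, 0.2, 0.1]                            # assumed per-symbol distribution
print(expected_guesswork(p, n=4))              # average number of queries for strings of length 4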

H. Homaei, H. Golestanian, M. Heidari

This paper concentrates on a new procedure which experimentally recognises gear and bearing faults of a typical gearbox system using a least squares support vector machine (LSSVM). Two wavelet selection criteria, Maximum Energy to Shannon Entropy Ratio and Maximum Relative Wavelet Energy, are used and compared to select an appropriate wavelet for feature extraction. The fault diagnosis method co...
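
The Maximum Energy to Shannon Entropy Ratio criterion mentioned above is usually computed from the wavelet coefficients of the signal: total coefficient energy divided by the Shannon entropy of the normalized coefficient energies. The sketch below evaluates that ratio for a given coefficient array; the random coefficients are an assumed stand-in for a real gearbox vibration measurement.

import numpy as np

def energy_to_entropy_ratio(coeffs):
    # E = sum |c_i|^2, p_i = |c_i|^2 / E, S = -sum p_i log p_i, ratio = E / S
    e_i = np.abs(np.asarray(coeffs, dtype=float)) ** 2
    energy = e_i.sum()
    p = e_i / energy
    p = p[p > 0]
    entropy = -np.sum(p * np.log(p))
    return energy / entropy

rng = np.random.default_rng(2)
coeffs = rng.standard_normal(512)       # assumed stand-in for the wavelet coefficients of a vibration signal
print(energy_to_entropy_ratio(coeffs))  # the candidate wavelet with the largest ratio would be selected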

2012
Yusuf SEVİM, Ayten ATASOY

Most independent component analysis (ICA) algorithms use mutual information (MI) measures based on Shannon entropy as a cost function, but Shannon entropy is not the only measure in the literature. In this paper, instead of Shannon entropy, Tsallis entropy is used and a novel ICA algorithm, which uses kernel density estimation (KDE) for estimation of source distributions, is proposed. KDE is di...
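
Without the full ICA algorithm, the two ingredients named above can still be sketched: a kernel density estimate of a source distribution and the Tsallis entropy of that estimate. scipy.stats.gaussian_kde is used for the KDE; the sample, grid, and entropy order are illustrative assumptions, and the proposed algorithm's actual cost function and optimization are not reproduced.

import numpy as np
from scipy.stats import gaussian_kde

def tsallis_entropy_continuous(density, dx, alpha):
    # Continuous Tsallis entropy (1 - integral f^alpha) / (alpha - 1), approximated on a uniform grid.
    return (1.0 - (density ** alpha).sum() * dx) / (alpha - 1.0)

rng = np.random.default_rng(3)
sample = rng.laplace(size=2000)                 # assumed non-Gaussian source samples
kde = gaussian_kde(sample)                      # kernel density estimate of the source distribution

grid = np.linspace(-10.0, 10.0, 2001)
dx = grid[1] - grid[0]
density = kde(grid)

print(tsallis_entropy_continuous(density, dx, alpha=2.0))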
