Search results for: weighted shannon entropy
Number of results: 171,775
This paper concentrates on a new procedure that experimentally recognises gear and bearing faults in a typical gearbox system using a least-squares support vector machine (LS-SVM). Two wavelet selection criteria, maximum energy-to-Shannon-entropy ratio and maximum relative wavelet energy, are used and compared to select an appropriate wavelet for feature extraction. The fault diagnosis method co...
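As an illustration of the first criterion, here is a minimal Python sketch (an assumption, not the paper's code): it scores candidate wavelets by the energy-to-Shannon-entropy ratio of their decomposition coefficients, assuming PyWavelets (pywt) is available and `signal` is a 1-D vibration record; the candidate wavelet names are illustrative.

```python
# Minimal sketch (not the paper's pipeline): rank candidate wavelets by the
# energy-to-Shannon-entropy ratio of their decomposition coefficients.
import numpy as np
import pywt

def energy_to_shannon_entropy_ratio(signal, wavelet, level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    c = np.concatenate(coeffs)            # approximation + detail coefficients
    energies = c ** 2
    total_energy = energies.sum()
    p = energies / total_energy           # normalized coefficient-energy distribution
    p = p[p > 0]                          # avoid log(0)
    shannon_entropy = -np.sum(p * np.log2(p))
    return total_energy / shannon_entropy

def select_wavelet(signal, candidates=("db4", "db8", "sym5", "coif3")):
    # Pick the candidate whose decomposition maximizes the ratio.
    return max(candidates, key=lambda w: energy_to_shannon_entropy_ratio(signal, w))
```

A higher ratio indicates that the wavelet concentrates the signal energy in fewer coefficients, which is the rationale behind the maximum energy-to-Shannon-entropy criterion.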
Contents of Chapter 10, Quantum Shannon Theory: 10.1 Shannon for Dummies; 10.1.1 Shannon entropy and data compression; 10.1.2 Joint typicality, conditional entropy, and mutual information; 10.1.3 Distributed source coding; 10.1.4 The noisy channel coding theorem; 10.2 Von Neumann Entropy; 10.2.1 Mathematical properties of H(ρ); 10.2.2 Mixing, measurement, and entropy; 10.2.3 Strong subadditivit...
The Kolmogorov-Sinai entropy is a far-reaching dynamical generalization of the Shannon entropy of information systems. This entropy works perfectly for probability-measure-preserving (p.m.p.) transformations. However, it is not useful when there is no finite invariant measure. There are certain successful extensions of the notion of entropy to infinite measure spaces, or to transformations with ...
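For context, the standard definitions behind this discussion (not quoted from the paper): for a p.m.p. transformation T of a probability space (X, μ) and a finite measurable partition P,
\[
H_\mu(P) = -\sum_{A \in P} \mu(A)\,\log \mu(A),
\qquad
h_\mu(T) = \sup_{P}\ \lim_{n\to\infty} \frac{1}{n}\, H_\mu\!\left(\bigvee_{i=0}^{n-1} T^{-i}P\right).
\]
Both quantities require a finite invariant measure μ, which is exactly what fails in the infinite-measure setting mentioned above.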
Shannon entropy [1] is one of the fundamental quantities in classical information theory and is uniquely determined by the Shannon-Khinchin axioms or the Faddeev axiom. One-parameter extensions of the Shannon entropy have been studied by many researchers. The Rényi entropy [2] and the Tsallis entropy [3] are famous examples. In the paper [4], the uniqueness theorem for the Tsallis entropy was proved. Also, in our...
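For reference, the standard forms of the three entropies mentioned (general definitions, not taken from the cited papers): for a probability distribution p = (p_1, ..., p_n),
\[
H(p) = -\sum_{i} p_i \log p_i,
\qquad
R_\alpha(p) = \frac{1}{1-\alpha}\,\log \sum_{i} p_i^{\alpha},
\qquad
T_q(p) = \frac{1}{q-1}\left(1 - \sum_{i} p_i^{q}\right),
\]
with both one-parameter families recovering H(p) in the limits α → 1 and q → 1.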
One important issue in the theory of Ordered Weighted Averaging (OWA) operators is the determination of the associated weights. One of the first approaches, suggested by O’Hagan, determines a special class of OWA operators having maximal Shannon entropy of the OWA weights for a given level of orness; algorithmically it is based on the solution of a constrained optimization problem. In this pape...
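O'Hagan's constrained optimization problem referred to above can be stated in its standard form as
\[
\max_{w_1,\dots,w_n}\ -\sum_{i=1}^{n} w_i \ln w_i
\quad\text{subject to}\quad
\operatorname{orness}(W) = \sum_{i=1}^{n} \frac{n-i}{n-1}\, w_i = \alpha,
\qquad
\sum_{i=1}^{n} w_i = 1,\quad w_i \ge 0,
\]
where α ∈ [0, 1] is the prescribed orness level; the maximizer is the maximum-entropy OWA weight vector.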
Shannon entropy [1] is one of the fundamental quantities in classical information theory and is uniquely determined by the Shannon-Khinchin axioms or the Faddeev axiom. One-parameter extensions of the Shannon entropy have been studied by many researchers [2]. The Rényi entropy [3] and the Tsallis entropy [4] are famous examples. In the paper [5], the uniqueness theorem for the Tsallis entropy was proven. See also ...
It is pointed out that the case for Shannon entropy and von Neumann entropy, as measures of uncertainty in quantum mechanics, is not as bleak as suggested in quant-ph/0006087. The main argument of the latter is based on one particular interpretation of Shannon’s H-function (related to consecutive measurements), and is shown explicitly to fail for other physical interpretations. Further, it is s...
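For reference, the von Neumann entropy of a density operator ρ, the quantum counterpart of Shannon's H-function, is (standard definition)
\[
S(\rho) = -\operatorname{Tr}\,(\rho \log \rho),
\]
which equals the Shannon entropy of the eigenvalues of ρ; the disagreement above is over which physical interpretation of the H-function legitimately carries over to quantum measurements.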
In many life-testing and reliability studies, the experimenter might not always obtain complete information on failure times for all experimental units. One of the most common censoring schemes is progressive type-II censoring. The aim of this paper is to characterize the parent distributions based on the Shannon entropy of progressive type-II censored order statistics. It is shown that the equality...
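The entropy underlying the characterization is the differential Shannon entropy of a random variable with density f (general definition, not a result of the paper), applied to the progressively type-II censored order statistics:
\[
H(X) = -\int_{-\infty}^{\infty} f(x)\,\log f(x)\,dx .
\]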
In this paper, we obtain the Rényi entropy rate for irreducible, aperiodic Markov chains with a countable state space, using the theory of countable nonnegative matrices. We also obtain a bound on the Rényi entropy rate of an irreducible Markov chain. Finally, we show that this bound on the Rényi entropy rate is the Shannon entropy rate.
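For reference (standard definitions, assuming stationarity): for an irreducible, aperiodic Markov chain with transition probabilities p_{ij} and stationary distribution π, the Shannon entropy rate is
\[
\bar{H} = -\sum_{i} \pi_i \sum_{j} p_{ij}\,\log p_{ij},
\]
and the Rényi entropy rate of order α is \(\lim_{n\to\infty} \frac{1}{n} R_\alpha(X_1,\dots,X_n)\), which reduces to \(\bar{H}\) as α → 1.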
5 Information: 5.1 Information As Surprise; 5.2 Average Shannon Information; 5.3 How Surprised Should You Be?; 5.4 Entropy As Average Shannon Information; 5.5 Dicing...