Search results for: weighted shannon entropy
Number of results: 171,775
In this paper, we derive exact analytical expressions for the Shannon entropy of generalized order statistics from Pareto-type and related distributions.
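For context, the plain Pareto Type I distribution (a special case of the families treated here) has a well-known closed-form differential Shannon entropy; the order-statistics expressions derived in the paper generalize results of this type. For shape $\alpha$ and scale $x_m$,

$$ h(X) = \ln\frac{x_m}{\alpha} + \frac{1}{\alpha} + 1. $$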
The differential Shannon entropy of information theory can change under a change of variables (coordinates), but the thermodynamic entropy of a physical system must be invariant under such a change. This difference is puzzling, because the Shannon and Gibbs entropies have the same functional form. We show that a canonical change of variables can, indeed, alter the spatial component of the therm...
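For reference, the standard identity behind the effect described here: under a smooth invertible change of variables $Y = g(X)$, the differential Shannon entropy transforms as

$$ h(Y) = h(X) + \mathbb{E}\big[\ln\lvert\det J_g(X)\rvert\big], $$

so it is invariant exactly when the Jacobian determinant has unit modulus almost everywhere.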
The aim of this thesis is to formulate and prove quantum extensions of the famous Shannon-McMillan theorem and its stronger version due to Breiman. In ergodic theory the Shannon-McMillan-Breiman theorem is one of the fundamental limit theorems for classical discrete dynamical systems. It can be interpreted as a special case of the individual ergodic theorem. In this work, we consider spin latti...
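For reference, the classical Shannon-McMillan-Breiman theorem that this thesis extends to the quantum setting states that, for an ergodic stationary source with finite alphabet and entropy rate $h(\mu)$,

$$ -\frac{1}{n}\,\log \mu\big([x_1,\dots,x_n]\big) \;\longrightarrow\; h(\mu) \quad \text{as } n \to \infty, $$

$\mu$-almost surely and in $L^1$, where $[x_1,\dots,x_n]$ denotes the cylinder set of the first $n$ symbols.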
There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the “information loss”, or change in entropy, associated with a measure-preser...
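Concretely, for a measure-preserving map $f\colon (X,p)\to(Y,q)$ between finite probability spaces (with $q$ the pushforward of $p$ along $f$), the information loss referred to here is the drop in Shannon entropy,

$$ L(f) = H(p) - H(q) \ge 0, $$

which is nonnegative because pushing forward along a deterministic map can only merge outcomes.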
This paper demonstrates the performance of two possible CAT selection strategies for cognitive diagnosis. One is based on Shannon entropy and the other is based on Kullback-Leibler information. The performances of these two test construction methods are compared with random item selection. The cognitive diagnosis model used in this study is a simplified version of the Fusion model. Item banks a...
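A minimal sketch of Shannon-entropy item selection for cognitive diagnosis, assuming a generic latent-profile model rather than the simplified Fusion model used in the study; the arrays `prior` (posterior over attribute profiles) and `item_probs` (per-profile success probabilities), and all numbers, are hypothetical placeholders:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (nats) of a discrete distribution, ignoring zero cells."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def posterior_update(prior, success_prob, response):
    """Bayes update of the profile posterior after observing one item response."""
    like = success_prob if response == 1 else 1.0 - success_prob
    post = prior * like
    return post / post.sum()

def next_item_by_entropy(prior, item_probs, administered):
    """Pick the unused item whose expected posterior Shannon entropy is smallest."""
    best_j, best_val = None, np.inf
    for j, p_correct in enumerate(item_probs):
        if j in administered:
            continue
        # Marginal probability of a correct response under the current posterior.
        p1 = np.dot(prior, p_correct)
        expected_H = (p1 * shannon_entropy(posterior_update(prior, p_correct, 1)) +
                      (1 - p1) * shannon_entropy(posterior_update(prior, p_correct, 0)))
        if expected_H < best_val:
            best_j, best_val = j, expected_H
    return best_j

# Toy example: 4 latent attribute profiles, 3 items (all numbers are made up).
prior = np.full(4, 0.25)
item_probs = np.array([[0.9, 0.6, 0.3, 0.1],
                       [0.8, 0.2, 0.7, 0.2],
                       [0.6, 0.5, 0.5, 0.4]])
print(next_item_by_entropy(prior, item_probs, administered=set()))
```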
Shannon entropy was defined for probability distributions, and its use was later extended to measuring the uncertainty of knowledge in systems with complete information. In this article, it is proposed to extend the use of Shannon entropy to under-defined or over-defined information systems. To make Shannon entropy applicable, the information is normalized by an affine transformation. The const...
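A minimal sketch of the normalization idea, assuming one particular affine rescaling of arbitrary real-valued information degrees onto [0, 1] followed by renormalization to a probability vector; the paper's actual constraints on the transformation are not reproduced here:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropy_after_affine_normalization(values):
    """Affinely rescale arbitrary real 'information' values onto [0, 1],
    renormalize them to sum to 1, and evaluate Shannon entropy."""
    v = np.asarray(values, dtype=float)
    lo, hi = v.min(), v.max()
    if hi == lo:                      # degenerate case: all degrees equal
        return np.log2(len(v))        # uniform distribution, maximal uncertainty
    scaled = (v - lo) / (hi - lo)     # affine transformation onto [0, 1]
    p = scaled / scaled.sum()         # normalize to a probability vector
    return shannon_entropy(p)

# Over-defined example: the raw degrees sum to more than 1.
print(entropy_after_affine_normalization([0.9, 0.7, 0.5, 0.2]))
```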
Constraints on the entropy function are of fundamental importance in information theory. For a long time, the polymatroidal axioms, or equivalently the nonnegativity of the Shannon information measures, were the only known constraints. Inequalities that are implied by nonnegativity of the Shannon information measures are categorically referred to as Shannon-type inequalities. If the number of ran...
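The nonnegativity constraints referred to here are those of the basic Shannon information measures,

$$ H(X)\ge 0,\qquad H(X\mid Y)\ge 0,\qquad I(X;Y)\ge 0,\qquad I(X;Y\mid Z)\ge 0, $$

and any inequality obtained as a nonnegative combination of instances of these is, by the definition in the text, a Shannon-type inequality.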
As additivity is a characteristic property of the classical information measure, Shannon entropy, pseudo-additivity of the form x ⊕_q y = x + y + (1 − q)xy is a characteristic property of Tsallis entropy. Rényi in [1] generalized Shannon entropy by means of Kolmogorov-Nagumo averages, imposing additivity as a constraint. In this paper we show that there exists no generalization for Tsallis entropy, b...
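For reference, with the Tsallis entropy (constant factor set to 1)

$$ S_q(p) = \frac{1-\sum_i p_i^{\,q}}{q-1}, $$

pseudo-additivity means that for independent subsystems $A$ and $B$,

$$ S_q(A,B) = S_q(A) + S_q(B) + (1-q)\,S_q(A)\,S_q(B), $$

which recovers ordinary additivity, and Shannon entropy itself, in the limit $q \to 1$.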
Traditional detectors for spectrum sensing in cognitive radio networks become ineffective when noise uncertainty is severe. Shannon entropy-based detection methods have attracted widespread attention in recent years because of their robustness to noise uncertainty. However, in existing entropy-based sensing schemes, the uniform quantization method cannot guarantee the maximum en...
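A minimal sketch of a frequency-domain Shannon-entropy detector of this kind, assuming uniform quantization of the FFT magnitude spectrum into a fixed number of histogram bins; the bin count `n_bins`, threshold `lambda_thr`, and toy signal model are illustrative assumptions, not the scheme proposed in the paper:

```python
import numpy as np

def spectral_entropy(samples, n_bins=15):
    """Shannon entropy of the uniformly quantized magnitude spectrum."""
    spectrum = np.abs(np.fft.fft(samples))
    # Uniform quantization: histogram of spectral magnitudes into n_bins cells.
    counts, _ = np.histogram(spectrum, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def detect(samples, lambda_thr, n_bins=15):
    """Decide 'channel occupied' when the entropy falls below the threshold.

    A primary-user signal concentrates spectral energy in a few FFT bins,
    which lowers the entropy of the magnitude histogram relative to pure noise.
    """
    return spectral_entropy(samples, n_bins) < lambda_thr

# Toy example: noise-only vs. noise plus a sinusoidal primary signal.
rng = np.random.default_rng(0)
n = 1024
noise = rng.normal(size=n)
signal = noise + 2.0 * np.sin(2 * np.pi * 0.1 * np.arange(n))
print(spectral_entropy(noise), spectral_entropy(signal))
```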