Search results for: and 054 disregarding shannon entropy
Number of results: 16,840,017
The differential Shannon entropy of information theory can change under a change of variables (coordinates), but the thermodynamic entropy of a physical system must be invariant under such a change. This difference is puzzling, because the Shannon and Gibbs entropies have the same functional form. We show that a canonical change of variables can, indeed, alter the spatial component of the therm...
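The coordinate dependence described in this abstract is easy to see numerically. The sketch below (function name is my own, for illustration only) uses the closed form h = log(a) for the differential entropy of a uniform distribution on [0, a]: rescaling the coordinate by 2 maps Uniform(0, 1) to Uniform(0, 2) and shifts the differential entropy by log 2.

```python
import math

def uniform_diff_entropy(a: float) -> float:
    """Differential entropy (in nats) of a uniform distribution on [0, a]."""
    return math.log(a)

# Under the change of variables y = 2x, Uniform(0, 1) becomes Uniform(0, 2).
h_before = uniform_diff_entropy(1.0)  # 0.0
h_after = uniform_diff_entropy(2.0)   # log 2 ≈ 0.693

# The entropy changed by exactly log 2: differential Shannon entropy is not
# invariant under a change of variables, unlike thermodynamic entropy.
assert abs((h_after - h_before) - math.log(2.0)) < 1e-12
```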
There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the “information loss”, or change in entropy, associated with a measure-preser...
The aim of this thesis is to formulate and prove quantum extensions of the famous Shannon-McMillan theorem and its stronger version due to Breiman. In ergodic theory the Shannon-McMillan-Breiman theorem is one of the fundamental limit theorems for classical discrete dynamical systems. It can be interpreted as a special case of the individual ergodic theorem. In this work, we consider spin latti...
Shannon entropy was defined for probability distributions, and its use was later extended to measure the uncertainty of knowledge in systems with complete information. In this article, it is proposed to extend the use of Shannon entropy to under-defined or over-defined information systems. To be able to use Shannon entropy, the information is normalized by an affine transformation. The const...
This paper demonstrates the performance of two possible CAT selection strategies for cognitive diagnosis. One is based on Shannon entropy and the other is based on Kullback-Leibler information. The performances of these two test construction methods are compared with random item selection. The cognitive diagnosis model used in this study is a simplified version of the Fusion model. Item banks a...
Constraints on the entropy function are of fundamental importance in information theory. For a long time, the polymatroidal axioms, or equivalently the nonnegativity of the Shannon information measures, were the only known constraints. Inequalities that are implied by nonnegativity of the Shannon information measures are categorically referred to as Shannon-type inequalities. If the number of ran...
Traditional detectors for spectrum sensing in cognitive radio networks always become disabled when noise uncertainty is severe. Shannon entropy-based detection methods have aroused widespread attention in recent years due to the characteristics of effective anti-noise uncertainty. However, in existing entropy-based sensing schemes, the uniform quantization method cannot guarantee the maximum en...
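The entropy-based sensing idea summarized above can be illustrated with a minimal sketch (the function name, bin count, and threshold logic here are my own assumptions, not the scheme from the paper): estimate the Shannon entropy of received samples from a uniform-width histogram; noise-only input tends toward high entropy, so a detector would flag a signal when the estimate drops below a noise-calibrated threshold.

```python
import math
import random

def empirical_entropy(samples, num_bins=16):
    """Shannon entropy (nats) of a uniform-width histogram of the samples."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / num_bins or 1.0  # guard against all-equal samples
    counts = [0] * num_bins
    for s in samples:
        idx = min(int((s - lo) / width), num_bins - 1)  # clamp s == hi
        counts[idx] += 1
    n = len(samples)
    return -sum(c / n * math.log(c / n) for c in counts if c)

random.seed(0)
noise = [random.gauss(0, 1) for _ in range(10_000)]
h = empirical_entropy(noise)

# The estimate is bounded above by log(num_bins); a detector would compare
# h against a threshold calibrated on noise-only measurements.
assert 0.0 < h <= math.log(16)
```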
Relations between Shannon entropy and Rényi entropies of integer order are discussed. For any N-point discrete probability distribution for which the Rényi entropies of order two and three are known, we provide a lower and an upper bound for the Shannon entropy. The average of both bounds provides an explicit extrapolation for this quantity. These results imply relations between the von Neumann...
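The ordering that makes Rényi entropies of order two and three useful for bounding the Shannon entropy can be checked numerically. The sketch below (helper names are my own; the paper's specific bounds are not reproduced here) computes all three quantities for a sample distribution and verifies the standard monotonicity in the Rényi order: H ≥ R₂ ≥ R₃.

```python
import math

def shannon(p):
    """Shannon entropy (nats) of a discrete distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(p, alpha):
    """Rényi entropy of order alpha != 1 (nats)."""
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
h1, r2, r3 = shannon(p), renyi(p, 2), renyi(p, 3)

# Rényi entropy is non-increasing in its order, so R2 is already a
# lower bound on the Shannon entropy (the Shannon entropy is the
# alpha -> 1 limit of the Rényi family).
assert h1 >= r2 >= r3
```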
A new set of time-dependent deterministic sampling (TDDS) measures, based on local Shannon entropy, are presented to adaptively gauge the importance of various regions on a potential energy surface and to be employed in "on-the-fly" quantum dynamics. Shannon sampling and Shannon entropy are known constructs that have been used to analyze the information content in functions: for example, time-s...