Search results for: shannon entropy

Number of results: 72543

2015
Leila Golshani

In this paper, we show that the Tsallis entropy rate for stochastic processes can be obtained as the limit of conditional entropy, as was done for the Shannon and Rényi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Rényi, Shannon and Tsallis entropy rates for stationary Gaussian proc...
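The limit relation this abstract relies on, namely that both the Rényi and Tsallis entropies recover the Shannon entropy as q → 1, can be sketched numerically. This is a minimal illustration with hypothetical helper names, using the natural-log convention; it does not reproduce the paper's entropy-rate results for processes:

```python
import math

def shannon(p):
    # Shannon entropy H = -sum_i p_i ln p_i (natural log)
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, q):
    # Renyi entropy H_q = ln(sum_i p_i^q) / (1 - q), for q != 1
    return math.log(sum(pi ** q for pi in p)) / (1 - q)

def tsallis(p, q):
    # Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), for q != 1
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

p = [0.5, 0.25, 0.25]
# Both generalized entropies approach the Shannon entropy as q -> 1
for q in (1.01, 1.0001):
    print(q, renyi(p, q), tsallis(p, q), shannon(p))
```

For q close to 1, all three printed values agree to several decimal places.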

Journal: International Journal of Nonlinear Analysis and Applications, 2013
B. Afhami M. Madadi

In this paper, we derive exact analytical expressions for the Shannon entropy of generalized order statistics from Pareto-type and related distributions.

2010
Vladimir Hnizdo Michael K. Gilson

The differential Shannon entropy of information theory can change under a change of variables (coordinates), but the thermodynamic entropy of a physical system must be invariant under such a change. This difference is puzzling, because the Shannon and Gibbs entropies have the same functional form. We show that a canonical change of variables can, indeed, alter the spatial component of the therm...
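The non-invariance this abstract starts from is easy to exhibit: under a linear rescaling X → aX, the differential entropy shifts by ln|a|. A minimal sketch using the closed form for a Gaussian, h = ½ ln(2πeσ²) (hypothetical helper name, natural-log convention):

```python
import math

def gaussian_diff_entropy(sigma):
    # Differential entropy of N(0, sigma^2): 0.5 * ln(2 * pi * e * sigma^2)
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

h1 = gaussian_diff_entropy(1.0)
h2 = gaussian_diff_entropy(2.0)  # the same variable rescaled by a = 2
# The entropy difference equals ln|a| = ln 2, so differential entropy
# is not invariant under a change of variables.
print(h2 - h1, math.log(2))
```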

2004
Arleta Szkola

The aim of this thesis is to formulate and prove quantum extensions of the famous Shannon-McMillan theorem and its stronger version due to Breiman. In ergodic theory the Shannon-McMillan-Breiman theorem is one of the fundamental limit theorems for classical discrete dynamical systems. It can be interpreted as a special case of the individual ergodic theorem. In this work, we consider spin latti...

Journal: Entropy, 2011
John C. Baez Tobias Fritz Tom Leinster

There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the “information loss”, or change in entropy, associated with a measure-preser...

2003
Xueli Xu

This paper demonstrates the performance of two possible CAT selection strategies for cognitive diagnosis. One is based on Shannon entropy and the other is based on Kullback-Leibler information. The performances of these two test construction methods are compared with random item selection. The cognitive diagnosis model used in this study is a simplified version of the Fusion model. Item banks a...

Journal: CoRR, 2017
Vasile Patrascu

Shannon entropy was defined for probability distributions, and its use was later extended to measure the uncertainty of knowledge in systems with complete information. In this article, we propose to extend the use of Shannon entropy to under-defined or over-defined information systems. To be able to use Shannon entropy, the information is normalized by an affine transformation. The const...

2010
Raymond W. Yeung

Constraints on the entropy function are of fundamental importance in information theory. For a long time, the polymatroidal axioms, or equivalently the nonnegativity of the Shannon information measures, were the only known constraints. Inequalities implied by the nonnegativity of the Shannon information measures are categorically referred to as Shannon-type inequalities. If the number of ran...
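A basic Shannon-type inequality is the nonnegativity of mutual information, I(X;Y) = H(X) + H(Y) − H(X,Y) ≥ 0. A quick numerical spot-check on a random joint distribution (hypothetical helper name, natural-log convention) illustrates the constraint:

```python
import math
import random

def H(p):
    # Shannon entropy of a probability vector (natural log)
    return -sum(x * math.log(x) for x in p if x > 0)

random.seed(0)
# Random 3x3 joint distribution, normalized to sum to 1
w = [[random.random() for _ in range(3)] for _ in range(3)]
s = sum(sum(row) for row in w)
joint = [[v / s for v in row] for row in w]

px = [sum(row) for row in joint]                        # marginal of X
py = [sum(joint[i][j] for i in range(3)) for j in range(3)]  # marginal of Y
hxy = H([v for row in joint for v in row])              # joint entropy

# Shannon-type inequality: I(X;Y) = H(X) + H(Y) - H(X,Y) >= 0
print(H(px) + H(py) - hxy >= -1e-12)  # True
```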

2008
Ambedkar Dukkipati Shalabh Bhatnagar

As additivity is a characteristic property of the classical information measure, Shannon entropy, pseudo-additivity of the form x ⊕_q y = x + y + (1 − q)xy is a characteristic property of Tsallis entropy. Rényi in [1] generalized Shannon entropy by means of Kolmogorov-Nagumo averages, by imposing additivity as a constraint. In this paper we show that there exists no generalization for Tsallis entropy, b...
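The pseudo-additivity mentioned in this abstract can be checked directly: for independent X and Y, the Tsallis entropy satisfies S_q(X,Y) = S_q(X) + S_q(Y) + (1 − q) S_q(X) S_q(Y). A minimal verification (hypothetical helper name):

```python
def tsallis(p, q):
    # Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), for q != 1
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

px = [0.6, 0.4]
py = [0.2, 0.3, 0.5]
q = 1.5
# Joint distribution of two independent variables: products of marginals
joint = [a * b for a in px for b in py]

lhs = tsallis(joint, q)
rhs = tsallis(px, q) + tsallis(py, q) + (1 - q) * tsallis(px, q) * tsallis(py, q)
print(abs(lhs - rhs) < 1e-12)  # True: pseudo-additivity holds exactly
```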
