Search results for: modified shannon entropy
Number of results: 321667
Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years. These information distance measures differ from entropy metrics: the former are based on Kolmogorov complexity, the latter on Shannon entropy. However, for any computable probability distributions, up to a constant, the expected val...
This paper studies the use of the Tsallis entropy versus the classic Boltzmann-Gibbs-Shannon entropy for classifying image patterns. Given a database of 40 pattern classes, the goal is to determine the class of a given image sample. Our experiments show that the Tsallis entropy encoded in a feature vector for different q indices has a great advantage over the Boltzmann-Gibbs-Shannon entropy for pa...
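The abstract above describes encoding the Tsallis entropy at several q indices into a feature vector. A minimal sketch of that idea (not the paper's actual pipeline; the histogram and q values below are illustrative assumptions):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum(p_i**q)) / (q - 1).

    Recovers the Boltzmann-Gibbs-Shannon entropy in the limit q -> 1.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # terms with p_i = 0 contribute nothing
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))  # Shannon limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

# Feature vector over several q indices for one (hypothetical) image histogram:
hist = np.array([0.5, 0.25, 0.25])
features = [tsallis_entropy(hist, q) for q in (0.5, 1.0, 2.0)]
```

Such a vector could then be fed to any standard classifier; varying q changes how strongly rare versus common intensities are weighted.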
We consider the problem of finite sample corrections for entropy estimation. New estimates of the Shannon entropy are proposed and their systematic error (the bias) is computed analytically. We find that our results cover correction formulas of current entropy estimates recently discussed in literature. The trade-off between bias reduction and the increase of the corresponding statistical error...
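The abstract above concerns finite-sample bias corrections for Shannon entropy estimates. As a point of reference (this is the classic Miller-Madow correction, not the new estimates the paper proposes), a plug-in estimator and its first-order bias fix look like:

```python
from collections import Counter
import math

def shannon_plugin(samples):
    """Naive (plug-in) Shannon entropy estimate, in nats."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def shannon_miller_madow(samples):
    """Plug-in estimate plus the (m - 1) / (2N) bias correction,
    where m is the number of occupied bins and N the sample size.
    The plug-in estimator is biased downward; this cancels the
    leading term of that bias."""
    n = len(samples)
    m = len(set(samples))
    return shannon_plugin(samples) + (m - 1) / (2 * n)
```

Stronger corrections reduce the bias further but, as the abstract notes, at the cost of increased statistical (variance) error.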
Most assertions involving Shannon entropy have their Kolmogorov complexity counterparts. A general theorem of Romashchenko [4] states that every information inequality that is valid in Shannon's theory is also valid in Kolmogorov's theory, and vice versa. In this paper we prove that this is no longer true for ∀∃-assertions, exhibiting the first example where the formal analogy between Shannon e...
Abstract— Mammogram analysis usually refers to processing of mammograms with the goal of finding abnormalities present in the mammogram. Mammogram segmentation is one of the most critical tasks in automatic mammogram image analysis. The main purpose of mammogram segmentation is to segment suspicious regions by means of an adaptive threshold. In image processing, one of the most efficient techniques...
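The abstract above mentions adaptive thresholding for segmentation. A generic sketch of the entropy-based thresholding family it alludes to (a Kapur-style maximum-entropy threshold; this is a standard textbook method, not necessarily the paper's specific algorithm, and the 8-bin histogram below is an illustrative stand-in for a 256-bin image histogram):

```python
import numpy as np

def max_entropy_threshold(hist):
    """Choose the threshold t that maximizes the sum of the Shannon
    entropies of the background (bins < t) and foreground (bins >= t)
    normalized histograms."""
    p = np.asarray(hist, dtype=float)
    p = p / p.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 <= 0 or w1 <= 0:
            continue  # one side is empty; entropy undefined
        p0, p1 = p[:t] / w0, p[t:] / w1
        h = -sum(x * np.log(x) for x in p0 if x > 0)
        h -= sum(x * np.log(x) for x in p1 if x > 0)
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Bimodal histogram: the chosen threshold falls in the empty middle band.
t = max_entropy_threshold([10, 10, 0, 0, 0, 0, 10, 10])
```

Pixels at or above the returned bin index would be marked as the suspicious (foreground) region.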
The uniqueness theorem for the Tsallis entropy is proven by introducing the generalized Faddeev axiom. Our result improves the recent result, the uniqueness theorem for the Tsallis entropy via the generalized Shannon-Khinchin axiom in [7], in the sense that our axiom is simpler than that one, just as Faddeev's axiom is simpler than Shannon-Khinchin's.
Shannon information entropy is a natural measure of probability (de)localization and thus (un)predictability in various procedures of data analysis for model systems. We pay particular attention to links between the Shannon entropy and the related Fisher information notion, which jointly account for the shape and extension of continuous probability distributions. Classical, dynamical and random...
We examine the entropy of stationary nonequilibrium measures of boundary driven symmetric simple exclusion processes. In contrast with the Gibbs–Shannon entropy [1, 10], the entropy of nonequilibrium stationary states differs from the entropy of local equilibrium states.
Kullback-Leibler relative-entropy or KL-entropy of P with respect to R, defined as ∫_X ln(dP/dR) dP, where P and R are probability measures on a measurable space (X, M), plays a basic role in the definitions of classical information measures. It overcomes a shortcoming of Shannon entropy, whose discrete-case definition cannot be extended naturally to the nondiscrete case. Further, entropy and oth...
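For finite sample spaces, the integral ∫_X ln(dP/dR) dP in the abstract above reduces to a sum over atoms. A minimal sketch of that discrete case (the distributions below are illustrative assumptions):

```python
import math

def kl_divergence(p, r):
    """Discrete KL-entropy D(P || R) = sum_i p_i * log(p_i / r_i), in nats.

    Assumes absolute continuity: wherever p_i > 0, r_i > 0 as well;
    terms with p_i = 0 are skipped (0 * log 0 := 0).
    """
    return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)

d = kl_divergence([0.5, 0.5], [0.25, 0.75])
```

Note that D(P || R) >= 0 with equality iff P = R, and that it is asymmetric in its arguments, so it is a relative entropy rather than a metric.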