Search results for: weighted shannon entropy
Number of results: 171,775
Shannon entropy H and related measures are increasingly used in molecular ecology and population genetics because (1) unlike measures based on heterozygosity or allele number, these measures weigh alleles in proportion to their population fraction, thus capturing a previously-ignored aspect of allele frequency distributions that may be important in many applications; (2) these measures connect ...
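The weighting described in this abstract is built into the definition of Shannon entropy itself: each allele contributes in proportion to its population fraction p_i. A minimal sketch (function name and input format are illustrative, not from the paper):

```python
from collections import Counter
import math

def shannon_entropy(alleles):
    """Shannon entropy H = -sum(p_i * ln(p_i)), where each allele is
    weighted by its population fraction p_i = count_i / n."""
    counts = Counter(alleles)
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Four equally frequent alleles give the maximum H = ln(4) ~ 1.3863
print(shannon_entropy(["A", "B", "C", "D"]))
```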
Image segmentation plays an important role in the medical field, enabling professionals to detect their patients' problems and helping them reach a proper diagnosis. In this article, an entropy-based approach to image segmentation is discussed to highlight tumors in MRI images. Magnetic Resonance Imaging (MRI) is an imaging technique in which pixel values are based on radiation absorption. In the proposed app...
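One common way to turn entropy into a segmentation rule is Kapur-style entropic thresholding, which selects the gray level that maximizes the combined Shannon entropies of the background and foreground histograms. This is a generic sketch of that family of methods, not necessarily the article's exact approach:

```python
import math

def entropy_threshold(hist):
    """Return the threshold t maximizing the sum of Shannon entropies of
    the background (levels < t) and foreground (levels >= t) histograms
    (Kapur-style entropic thresholding)."""
    total = sum(hist)
    probs = [h / total for h in hist]

    def part_entropy(ps):
        # Entropy of one partition, renormalized by its total mass w.
        w = sum(ps)
        if w == 0:
            return 0.0
        return -sum((p / w) * math.log(p / w) for p in ps if p > 0)

    best_t, best_h = 0, float("-inf")
    for t in range(1, len(hist)):
        h = part_entropy(probs[:t]) + part_entropy(probs[t:])
        if h > best_h:
            best_t, best_h = t, h
    return best_t
```

On a bimodal histogram the chosen threshold falls between the two modes, which is why this criterion is popular for separating tumor from background tissue.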
Hyperspectral band selection is a fundamental problem in hyperspectral remote sensing data processing. Recognizing its importance, several information-based band selection methods have been proposed, which apply Shannon entropy to measure image information. However, Shannon entropy is not accurate in measuring image information, since it neglects the spatial distribution of pixels and i...
A relation between Shannon entropy and Kerridge inaccuracy, known as the Shannon inequality, is well known in information theory. In this communication, we first generalize the Shannon inequality and then give an application of the generalization in coding theory.
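The Shannon inequality referenced here states that H(P) <= H(P; Q), where H(P; Q) = -sum(p_i * log(q_i)) is the Kerridge inaccuracy, with equality iff P = Q. A quick numerical check (the distributions are chosen arbitrarily):

```python
import math

def inaccuracy(p, q):
    """Kerridge inaccuracy H(P; Q) = -sum(p_i * log(q_i))."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
h = inaccuracy(p, p)  # Shannon entropy H(P) = H(P; P)
print(h <= inaccuracy(p, q))  # True; the gap is the KL divergence D(P || Q)
```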
High-throughput in vitro screening experiments can be used to generate concentration-response data for large chemical libraries. It is often desirable to estimate the concentration needed to achieve a particular effect, or potency, for each chemical tested in an assay. Potency estimates can be used to directly compare chemical profiles and prioritize compounds for confirmation studies, or emplo...
It was mentioned by Kolmogorov (1968, IEEE Trans. Inform. Theory 14, 662-664) that the properties of algorithmic complexity and Shannon entropy are similar. We investigate one aspect of this similarity. Namely, we are interested in linear inequalities that are valid for Shannon entropy and for Kolmogorov complexity. It turns out that (1) all linear inequalities that are valid for Kolmogorov com...
Methods for efficiently estimating the Shannon entropy of data streams have important applications in learning, data mining, and network anomaly detection (e.g., DDoS attacks). For nonnegative data streams, the method of Compressed Counting (CC) [11, 13], based on maximally-skewed stable random projections, can provide accurate estimates of the Shannon entropy using small storage. However, CC is...
Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years. These information distance measures differ from the entropy metric, as the former are based on Kolmogorov complexity and the latter on Shannon entropy. However, for any computable probability distributions, up to a constant, the expected val...
This paper studies the use of the Tsallis entropy versus the classic Boltzmann-Gibbs-Shannon entropy for classifying image patterns. Given a database of 40 pattern classes, the goal is to determine the class of a given image sample. Our experiments show that the Tsallis entropy, encoded in a feature vector for different q indices, has a great advantage over the Boltzmann-Gibbs-Shannon entropy for pa...
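For reference, the Tsallis entropy is S_q = (1 - sum(p_i^q)) / (q - 1), and it recovers the Boltzmann-Gibbs-Shannon entropy in the limit q -> 1; the extra parameter q is what the paper varies to build its feature vector. A small sketch:

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum(p_i^q)) / (q - 1); approaches the
    Boltzmann-Gibbs-Shannon entropy -sum(p_i * ln(p_i)) as q -> 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
shannon = -sum(pi * math.log(pi) for pi in p)
print(tsallis_entropy(p, 1.001), shannon)  # nearly equal for q close to 1
```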
We consider the problem of finite-sample corrections for entropy estimation. New estimates of the Shannon entropy are proposed and their systematic error (the bias) is computed analytically. We find that our results cover correction formulas for current entropy estimates recently discussed in the literature. The trade-off between bias reduction and the increase of the corresponding statistical error...
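One well-known correction in this family is the Miller-Madow estimator, which adds (K - 1) / (2N) to the plug-in estimate to offset its negative bias (K observed symbols, N samples). It is shown here only as an illustration of a finite-sample correction, not as the paper's proposed estimator:

```python
import math
from collections import Counter

def miller_madow_entropy(samples):
    """Plug-in Shannon entropy plus the Miller-Madow first-order bias
    correction (K - 1) / (2N)."""
    counts = Counter(samples)
    n = sum(counts.values())
    h_naive = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h_naive + (len(counts) - 1) / (2 * n)
```

The plug-in estimate underestimates entropy on small samples, so the correction is always nonnegative and shrinks as 1/N.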