Search results for: shannon entropy

Number of results: 72543

Journal: Open Syst. Inform. Dynam., 2017
Fabio Benatti, Samad Khabbazi Oskouei, Ahmad Shafiei Deh Abad

We study the relations between the recently proposed machine-independent quantum complexity of P. Gacs [1] and the entropy of classical and quantum systems. On one hand, by restricting Gacs complexity to ergodic classical dynamical systems, we retrieve the equality between the Kolmogorov complexity rate and the Shannon entropy rate derived by A.A. Brudno [2]. On the other hand, using the quantu...
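
Not code from the paper, but a minimal numerical illustration of the Brudno-type relation the abstract refers to: for an i.i.d. binary source, the compressed size per symbol (a crude stand-in for the Kolmogorov complexity rate, here via zlib) tracks the Shannon entropy rate.

```python
# Sketch (not from the paper): compare a compression-based proxy for the
# Kolmogorov complexity rate with the Shannon entropy rate of a biased
# i.i.d. binary source, in the spirit of Brudno's equality.
import zlib
import numpy as np

p = 0.2                                   # probability of symbol 1
n = 200_000
rng = np.random.default_rng(0)
x = (rng.random(n) < p).astype(np.uint8)  # i.i.d. Bernoulli(p) sequence

# Shannon entropy rate of the source, in bits per symbol.
h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

# Compression-based estimate: compressed size in bits per source symbol.
packed = np.packbits(x).tobytes()
rate = 8 * len(zlib.compress(packed, 9)) / n

print(f"entropy rate     : {h:.3f} bits/symbol")
print(f"compression rate : {rate:.3f} bits/symbol")
```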

Journal: Chaos, 2007
H. Rabarimanantsoa, L. Achour, C. Letellier, A. Cuvelier, J.-F. Muir

Recurrence plots were introduced to quantify the recurrence properties of chaotic dynamics. Subsequently, recurrence quantification analysis was introduced to turn graphical interpretation into statistical analysis. In this spirit, a new definition of the Shannon entropy was recently introduced in order to obtain a measure correlated with the largest Lyapunov exponent. Recurrence plots an...
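
A minimal sketch of the RQA-style Shannon entropy the abstract alludes to, assuming the common definition as the entropy of the recurrence-plot diagonal line-length distribution; the threshold eps, the minimum line length, and the logistic-map example are illustrative choices, not taken from the paper.

```python
# Sketch (assumptions: scalar series, fixed threshold eps, minimum line
# length 2): Shannon entropy of the diagonal line-length distribution of a
# recurrence plot, one of the standard RQA measures.
import numpy as np

def rqa_entropy(x, eps, lmin=2):
    # Recurrence matrix: R[i, j] = 1 when |x[i] - x[j]| < eps.
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

    # Collect lengths of diagonal lines (off the main diagonal).
    n = len(x)
    lengths = []
    for k in range(1, n):
        run = 0
        for v in list(np.diagonal(R, offset=k)) + [0]:  # sentinel closes runs
            if v:
                run += 1
            else:
                if run >= lmin:
                    lengths.append(run)
                run = 0

    # Shannon entropy of the normalized line-length histogram.
    counts = np.bincount(lengths)[lmin:]
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log(p)).sum()

# Example: logistic map in the chaotic regime.
x = np.empty(1000); x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
print(rqa_entropy(x, eps=0.1))
```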

Journal: CoRR, 2010
Ping Li

The long-standing problem of Shannon entropy estimation in data streams (assuming the strict Turnstile model) becomes an easy task using the technique proposed in this paper. Essentially, in order to estimate the Shannon entropy with a guaranteed ν-additive accuracy, it suffices to estimate the αth frequency moment, where α = 1 − Δ, with a guaranteed ε-multiplicative accuracy,...
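
A small sanity check of the relation such estimators exploit, computed here on exact probabilities rather than on a streaming sketch: the Rényi entropy formed from the αth frequency moment approaches the Shannon entropy as α = 1 − Δ → 1.

```python
# Sketch (exact probabilities, not a streaming algorithm): the Renyi entropy
# built from the alpha-th frequency moment converges to the Shannon entropy
# as alpha -> 1.
import numpy as np

rng = np.random.default_rng(1)
f = rng.integers(1, 1000, size=50).astype(float)   # item frequencies
p = f / f.sum()

shannon = -(p * np.log(p)).sum()

for delta in (0.1, 0.01, 0.001):
    alpha = 1.0 - delta
    F_alpha = (f ** alpha).sum()                   # alpha-th frequency moment
    renyi = (np.log(F_alpha) - alpha * np.log(f.sum())) / (1.0 - alpha)
    print(f"alpha = {alpha:.3f}: Renyi = {renyi:.4f}  (Shannon = {shannon:.4f})")
```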

Journal: Neuro Endocrinology Letters, 2013
Taiki Takahashi

Connections between information theory and decision under uncertainty have been attracting attention in econophysics, neuroeconomics and quantum decision theory. This paper proposes a psychophysical theory of Shannon entropy based on a mathematical equivalence of delay and uncertainty in decision-making, and psychophysics of the perception of waiting time in probabilistic choices. Furthermore, ...

2013
Edin Mulalić, Miomir Stanković, Radomir Stanković

The Tsallis entropy was proposed as a possible generalization of the standard Boltzmann-Gibbs-Shannon (BGS) entropy as a concept aimed at efficient characterisation of non-extensive complex systems. Ever since its introduction [1], it has been successfully applied in various fields [2]. In parallel, there have been numerous attempts to provide its formal derivation from an axiomatic foundation,...
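
For reference, a minimal sketch of the Tsallis entropy S_q = (1 − Σ_i p_i^q)/(q − 1) and its reduction to the Boltzmann-Gibbs-Shannon entropy in the limit q → 1; the distribution used is arbitrary.

```python
# Sketch: Tsallis entropy and its q -> 1 limit (the BGS / Shannon entropy).
import numpy as np

def tsallis(p, q):
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        return -(p * np.log(p)).sum()      # BGS entropy as the q -> 1 limit
    return (1.0 - (p ** q).sum()) / (q - 1.0)

p = np.array([0.5, 0.3, 0.2])
for q in (0.5, 0.99, 1.0, 1.01, 2.0):
    print(f"q = {q:4}: S_q = {tsallis(p, q):.4f}")
```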

Journal: CoRR, 2016
Nithin Nagaraj, Karthi Balasubramanian

Shannon entropy has been extensively used for characterizing complexity of time series arising from chaotic dynamical systems and stochastic processes such as Markov chains. However, for short and noisy time series, Shannon entropy performs poorly. Complexity measures which are based on lossless compression algorithms are a good substitute in such scenarios. We evaluate the performance of two s...
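
A toy comparison in the spirit of the abstract, with zlib standing in for the lossless compressors it evaluates (they are not named in the excerpt): the plug-in Shannon entropy versus a compression-based complexity for a short, noisy, binarized time series.

```python
# Sketch (assumed binary symbolization; zlib as a stand-in compressor):
# Shannon entropy vs. a compression-based complexity estimate.
import zlib
import numpy as np

def shannon_entropy(symbols):
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()                 # bits per symbol

def compression_complexity(symbols):
    raw = bytes(symbols.tolist())
    return len(zlib.compress(raw, 9)) / len(raw)   # compressed bytes per symbol

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 20, 200)) + 0.3 * rng.standard_normal(200)
s = (x > np.median(x)).astype(np.uint8)            # binary symbolization

print("Shannon entropy        :", shannon_entropy(s))
print("compression complexity :", compression_complexity(s))
```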

2009
Ping Li

Compressed Counting (CC) [22] was recently proposed for estimating the αth frequency moments of data streams, where 0 < α ≤ 2. CC can be used for estimating Shannon entropy, which can be approximated by certain functions of the αth frequency moments as α → 1. Monitoring Shannon entropy for anomaly detection (e.g., DDoS attacks) in large networks is an important task. This paper presents a new a...
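
A schematic illustration (synthetic data, not the paper's algorithm) of why entropy monitoring helps with DDoS-style anomalies: when traffic concentrates on a few destinations, the Shannon entropy of the destination distribution drops sharply. The threshold below is an arbitrary choice for the example.

```python
# Sketch: flag a DDoS-like event by monitoring the Shannon entropy of the
# destination-address distribution over sliding windows; a sudden entropy
# drop signals traffic concentrating on a few targets.
import numpy as np

def window_entropy(items):
    _, counts = np.unique(items, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(0)
normal = rng.integers(0, 1000, size=20_000)   # diverse destinations
attack = np.full(5_000, 42)                   # one target hammered
stream = np.concatenate([normal, attack])

window = 1_000
for start in range(0, len(stream), window):
    h = window_entropy(stream[start:start + window])
    flag = "  <-- anomaly" if h < 5.0 else ""  # illustrative threshold
    print(f"window {start // window:2d}: H = {h:5.2f} bits{flag}")
```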

2007
Danielle Sent, Linda C. van der Gaag

In diagnostic decision-support systems, test selection amounts to selecting, in a sequential manner, a test that is expected to yield the largest decrease in the uncertainty about a patient’s diagnosis. For capturing this uncertainty, often an information measure is used. In this paper, we study the Shannon entropy, the Gini index, and the misclassification error for this purpose. We argue that...
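
A minimal sketch of the three uncertainty measures compared in the abstract, evaluated on hypothetical diagnosis distributions; in test selection one would score each candidate test by its expected reduction in the chosen measure.

```python
# Sketch: Shannon entropy, Gini index, and misclassification error of a
# distribution over diagnoses (hypothetical example distributions).
import numpy as np

def shannon(p):           return -(p[p > 0] * np.log2(p[p > 0])).sum()
def gini(p):              return 1.0 - (p ** 2).sum()
def misclassification(p): return 1.0 - p.max()

for p in (np.array([0.25, 0.25, 0.25, 0.25]),   # maximal uncertainty
          np.array([0.70, 0.20, 0.05, 0.05]),   # one diagnosis dominates
          np.array([1.00, 0.00, 0.00, 0.00])):  # diagnosis certain
    print(f"p = {p}:  H = {shannon(p):.3f}  Gini = {gini(p):.3f}  "
          f"err = {misclassification(p):.3f}")
```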

2009
Marcelo R. Ubriaco

We propose an entropy function based on fractional calculus. We show that this new entropy has the same properties as the Shannon entropy except additivity, therefore making this entropy non-extensive. We show that this entropy function satisfies the Lesche and thermodynamic stability criteria.
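
A sketch under the assumption that the fractional-calculus entropy has the form S_q = Σ_i p_i(−ln p_i)^q, which reduces to the Shannon entropy at q = 1; the exact definition should be checked against the paper.

```python
# Sketch (assumed form S_q = sum_i p_i * (-ln p_i)**q; coincides with the
# Shannon entropy at q = 1).
import numpy as np

def fractional_entropy(p, q):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return (p * (-np.log(p)) ** q).sum()

p = np.array([0.5, 0.3, 0.2])
for q in (0.5, 1.0, 1.5):
    print(f"q = {q}: S_q = {fractional_entropy(p, q):.4f}")
print("Shannon:", -(p * np.log(p)).sum())   # matches the q = 1 case
```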

2003
B. H. Lavenda

The Tsallis entropy is shown to be an additive entropy of degree q that information scientists have been using for almost forty years. It is also not a unique solution to the nonadditive functional equation from which random entropies are derived. Notions of additivity, extensivity and homogeneity are clarified. The relation between mean code lengths in coding theory and various expressions for ...
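
A quick numerical check of the degree-q composition rule the abstract refers to: for independent subsystems, S_q(A, B) = S_q(A) + S_q(B) + (1 − q)·S_q(A)·S_q(B).

```python
# Sketch: verify the degree-q (pseudo-additive) composition rule of the
# Tsallis entropy for independent subsystems A and B.
import numpy as np

def S_q(p, q):
    return (1.0 - (np.asarray(p, dtype=float) ** q).sum()) / (q - 1.0)

q = 1.7
pA = np.array([0.6, 0.4])
pB = np.array([0.2, 0.5, 0.3])
pAB = np.outer(pA, pB).ravel()               # joint law of independent A, B

lhs = S_q(pAB, q)
rhs = S_q(pA, q) + S_q(pB, q) + (1 - q) * S_q(pA, q) * S_q(pB, q)
print(lhs, rhs)                              # agree up to rounding
```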

[Chart: number of search results per year of publication]