Search results for: shannon entropy numerical simulation

Number of results: 876,050

Journal: Entropy, 2015
Jie Zhu, Jean-Jacques Bellanger, Huazhong Shu, Régine Le Bouquin-Jeannès

This paper deals with the estimation of transfer entropy based on the k-nearest neighbors (k-NN) method. To this end, we first investigate the estimation of Shannon entropy involving a rectangular neighboring region, as suggested in already existing literature, and develop two kinds of entropy estimators. Then, applying the widely-used error cancellation approach to these entropy estimators, we...
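The k-NN approach described above can be illustrated in one dimension. Below is a minimal sketch of a Kozachenko–Leonenko-style k-th-nearest-neighbor entropy estimator — not the paper's rectangular-region variant, and the function names are mine:

```python
import math
import random

def digamma_int(n):
    """Digamma at a positive integer: psi(n) = -gamma + sum_{j=1}^{n-1} 1/j."""
    euler_gamma = 0.5772156649015329
    return -euler_gamma + sum(1.0 / j for j in range(1, n))

def knn_entropy_1d(sample, k=1):
    """Kozachenko-Leonenko k-NN estimate of differential Shannon entropy
    (in nats) for a 1-D sample; neighbor search uses the sorted order."""
    xs = sorted(sample)
    n = len(xs)
    log_eps = 0.0
    for i, x in enumerate(xs):
        # in sorted order, the k nearest neighbors lie within k positions of i
        window = [abs(xs[j] - x)
                  for j in range(max(0, i - k), min(n, i + k + 1)) if j != i]
        window.sort()
        log_eps += math.log(window[k - 1])
    # H ~= psi(n) - psi(k) + log(c_d) + (d/n) * sum_i log(eps_i), with d = 1, c_1 = 2
    return digamma_int(n) - digamma_int(k) + math.log(2.0) + log_eps / n
```

For a uniform sample on [0, 1) the true differential entropy is 0, and the estimate approaches it as the sample grows.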

2004
Shigeru Furuichi

The uniqueness theorem for the Tsallis entropy is proven by introducing the generalized Faddeev axiom. Our result improves the recent result, the uniqueness theorem for the Tsallis entropy via the generalized Shannon–Khinchin axiom in [7], in the sense that our axiom is simpler than that one, just as Faddeev's axiom is simpler than the Shannon–Khinchin axiom.

Journal: مدیریت صنعتی (Industrial Management)
Ali Mohammadi, Shiraz University; Nabi Molaei, Shiraz University

Rapid technological and economic growth over the last several decades has changed human lives and confronted modern society with complex decision-making problems. These kinds of problems are characterized by incommensurate and conflicting criteria or objectives such as cost, reliability, performance, safety and productivity. Multi-criteria decision making is an approach that can be used to deal with compl...

2008
Piotr Garbaczewski

Shannon information entropy is a natural measure of probability (de)localization and thus (un)predictability in various procedures of data analysis for model systems. We pay particular attention to links between the Shannon entropy and the related Fisher information notion, which jointly account for the shape and extension of continuous probability distributions. Classical, dynamical and random...
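As a toy illustration of entropy as a (de)localization measure (my own example, not the paper's): a uniform distribution maximizes Shannon entropy, while a sharply peaked one drives it toward zero.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in nats) of a discrete probability distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # fully delocalized
peaked = [0.97, 0.01, 0.01, 0.01]    # strongly localized
# the uniform distribution attains the maximum log(4); the peaked one is far below it
```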

2011
Cédric Bernardin, Claudio Landim

We examine the entropy of stationary nonequilibrium measures of boundary driven symmetric simple exclusion processes. In contrast with the Gibbs–Shannon entropy [1, 10], the entropy of nonequilibrium stationary states differs from the entropy of local equilibrium states.

2006
Ambedkar Dukkipati

Kullback–Leibler relative entropy, or KL-entropy, of P with respect to R, defined as ∫_X ln(dP/dR) dP, where P and R are probability measures on a measurable space (X, M), plays a basic role in the definitions of classical information measures. It overcomes a shortcoming of Shannon entropy, whose discrete-case definition cannot be extended to the nondiscrete case naturally. Further, entropy and oth...
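In the discrete case the integral above reduces to a sum. A minimal sketch (function name mine), assuming P is absolutely continuous with respect to R:

```python
import math

def kl_divergence(p, r):
    """Discrete KL-entropy D(P || R) = sum_i p_i * ln(p_i / r_i), in nats.
    Assumes r_i > 0 wherever p_i > 0 (absolute continuity of P w.r.t. R)."""
    return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)
```

By Gibbs' inequality, D(P || R) >= 0, with equality exactly when P = R.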

2005
Roman Frigg

Chaos is often explained in terms of random behaviour; and having positive Kolmogorov–Sinai entropy (KSE) is taken to be indicative of randomness. Although seemingly plausible, the association of positive KSE with random behaviour needs justification, since the definition of the KSE does not make reference to any notion connected to randomness. A common way of justifying this use of the KSE...

Journal: IEEE Trans. Information Theory, 1988
Anselm Blumer, Robert J. McEliece

If optimality is measured by average codeword length, Huffman's algorithm gives optimal codes, and the redundancy can be measured as the difference between the average codeword length and Shannon's entropy. If the objective function is replaced by an exponentially weighted average, then a simple modification of Huffman's algorithm gives optimal codes. The redundancy can now be measured as the d...

Journal: Physical Chemistry Chemical Physics (PCCP), 2010
Siamak Noorizadeh, Ehsan Shakerzadeh

Based on the local Shannon entropy concept in information theory, a new measure of aromaticity is introduced. This index, which describes the probability of electronic charge distribution between atoms in a given ring, is called Shannon aromaticity (SA). Using B3LYP method and different basis sets (6-31G**, 6-31+G** and 6-311++G**), the SA values of some five-membered heterocycles, C(4)H(4)X, a...

Journal: CoRR, 2015
Boris Ryabko

Random and pseudorandom number generators (RNG and PRNG) are used for many purposes, including cryptographic, modeling and simulation applications. For such applications a generated bit sequence should mimic a truly random one, i.e., by definition, a sequence that could be interpreted as the result of flips of a fair coin with sides labeled 0 and 1. It is known that the Shannon entropy of ...
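One simple empirical check along these lines (my own sketch, not the paper's test): estimate the Shannon entropy of fixed-length blocks of generated bits. A fair-coin sequence should approach the maximum of one bit per bit, i.e. 8 bits per 8-bit block, while a biased generator falls well short.

```python
import math
import random

def block_entropy(bits, block=8):
    """Empirical Shannon entropy (bits per block) over non-overlapping blocks."""
    counts = {}
    for i in range(0, len(bits) - block + 1, block):
        word = tuple(bits[i:i + block])
        counts[word] = counts.get(word, 0) + 1
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

random.seed(42)
fair = [random.getrandbits(1) for _ in range(80000)]
biased = [1 if random.random() < 0.9 else 0 for _ in range(80000)]
```

Note that the empirical estimate is biased slightly downward for finite samples, so even a perfect generator will score a little below 8 bits per block.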

[Chart: number of search results per year]