Search results for: Shannon entropy, Numerical simulation

Number of results: 876,050

Shannon entropy is increasingly used in many applications. In this article, an estimator of the entropy of a continuous random variable is proposed. Consistency and scale invariance of the variance and mean squared error of the proposed estimator are proved, and comparisons are then made with the entropy estimators of Vasicek (1976), van Es (1992), Ebrahimi et al. (1994) and Correa (1995). A simulation st...

Journal: International Journal of Mathematical Modelling and Computations
M. Khodabin
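The spacing-based construction behind Vasicek's (1976) estimator, which this abstract takes as its main point of comparison, can be sketched as follows. The window choice m ≈ √n is a common heuristic assumed here, not something specified in the abstract:

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek (1976) spacing estimator of Shannon entropy:
    H_V = (1/n) * sum_i log( n/(2m) * (X_(i+m) - X_(i-m)) ),
    with order statistics clamped to X_(1) and X_(n) at the ends."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(np.sqrt(n) + 0.5))  # assumed heuristic window, m ~ sqrt(n)
    i = np.arange(n)
    # m-spacings of the order statistics, with boundary indices clamped
    spacings = x[np.minimum(i + m, n - 1)] - x[np.maximum(i - m, 0)]
    return float(np.mean(np.log(n / (2.0 * m) * spacings)))

# Standard normal sample: the true entropy is 0.5*ln(2*pi*e) ~ 1.4189 nats.
rng = np.random.default_rng(0)
sample = rng.standard_normal(10_000)
print(vasicek_entropy(sample))
```

Scale invariance of the error, the property proved in the abstract, follows because rescaling the data shifts every log-spacing term by the same constant.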

In this paper, the notion of ambiguity for trajectories of finite-state irreducible Markov chains is recalled, and it is derived for a two-state Markov chain. We give an applied example of this concept in a presidential election.

M. Moazamnia, S. Sadeghfam, Y. Hassanzadeh

The velocity distribution of pressurized flow in bends is of practical interest, and its consequences for professional engineering design are investigated in the current study. This paper shows that the velocity distribution in bends can be analyzed in terms of probability distributions. The probability-based concept of entropy is a new, applied approach to obtaining velocity pro...

Journal: IACR Cryptology ePrint Archive, 2014
Maciej Skorski

We provide a new result that links two crucial entropy notions: Shannon entropy H1 and collision entropy H2. Our formula gives the worst possible amount of collision entropy in a probability distribution when its Shannon entropy is fixed. Our results, and the techniques used in the proof, immediately imply many quantitatively tight separations between Shannon and smooth Rényi entropy, which were pre...
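For a discrete distribution, the two entropy notions the abstract links can be computed directly. This minimal sketch only illustrates the definitions and the general inequality H2 ≤ H1, not the paper's worst-case formula:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H1(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def collision_entropy(p):
    """Collision entropy H2(p) = -log2( sum_i p_i^2 ), i.e. Renyi order 2."""
    return -math.log2(sum(pi * pi for pi in p))

# A skewed distribution: H2 never exceeds H1, and the gap grows as
# probability mass concentrates on a single outcome.
p = [0.7, 0.1, 0.1, 0.05, 0.05]
print(shannon_entropy(p), collision_entropy(p))
```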

M. Abbasnejad, M. Tavakoli, N. R. Arghami

In this paper, we introduce a goodness-of-fit test for exponentiality based on the Lin-Wong divergence measure. In order to estimate the divergence, we use a method similar to Vasicek's method for estimating the Shannon entropy. The critical values and the powers of the test are computed by Monte Carlo simulation. It is shown that the proposed test is competitive with other tests of exponentia...

Journal: Entropy, 2013
Marco Bee

In this paper we propose an approach to the estimation and simulation of loss distributions based on Maximum Entropy (ME), a non-parametric technique that maximizes the Shannon entropy of the data under moment constraints. Special cases of the ME density correspond to standard distributions; therefore, this methodology is very general as it nests most classical parametric approaches. Sampling t...
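As a concrete special case of the Maximum Entropy principle this abstract builds on: for a positive variable with only its mean constrained, the ME density is the exponential distribution, which can be sampled by the inverse CDF. This is an illustrative sketch of that one special case, not the authors' general estimation procedure:

```python
import math
import random

# Under the single moment constraint E[X] = mean on (0, inf), the Maximum
# Entropy density is the exponential distribution -- the simplest member
# of the ME family the abstract describes.
def sample_me_exponential(mean, n, seed=0):
    """Draw n samples via the inverse CDF, x = -mean * ln(1 - u)."""
    rng = random.Random(seed)
    return [-mean * math.log(1.0 - rng.random()) for _ in range(n)]

draws = sample_me_exponential(mean=2.0, n=100_000)
print(sum(draws) / len(draws))  # sample mean, close to the constraint 2.0
```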

2012
T. Holden, G. Tremberger, E. Cheung, R. Subramaniam, R. Sullivan, N. Gadura, P. Schneider, P. Marchese, A. Flamholz, T. Cheung, D. Lieberman

A nucleotide sequence can be expressed as a numerical sequence when each nucleotide is assigned its proton number. A resulting gene numerical sequence can be investigated for its fractal dimension in terms of evolution and chemical properties for comparative studies. We have investigated such nucleotide fluctuation in the 16S rRNA gene of archaea thermophiles. The studied archaea thermophiles w...

2003
Xueli Xu

This paper demonstrates the performance of two possible CAT selection strategies for cognitive diagnosis. One is based on Shannon entropy and the other is based on Kullback-Leibler information. The performances of these two test construction methods are compared with random item selection. The cognitive diagnosis model used in this study is a simplified version of the Fusion model. Item banks a...

Journal: The Journal of Chemical Physics, 2007
Shubin Liu

An analytical relationship between the densities of the Shannon entropy and Fisher information for atomic and molecular systems has been established in this work. Two equivalent forms of the Fisher information density are introduced as well. It is found that for electron densities of atoms and molecules the Shannon entropy density is intrinsically related to the electron density and the two for...

The Rényi entropy is a generalization of Shannon entropy to a one-parameter family of entropies. Tsallis entropy, too, is a generalization of Shannon entropy; its measure is non-logarithmic. After the introduction of Shannon entropy, the conditional Shannon entropy was derived and its properties became known. Also, for Tsallis entropy, the conditional entropy was introduced a...
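The three entropies mentioned here, and the limit q → 1 in which both one-parameter families recover Shannon entropy, can be checked numerically with a minimal sketch:

```python
import math

def shannon(p):
    """Shannon entropy in nats: -sum_i p_i * ln(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, q):
    """Renyi entropy H_q = ln( sum_i p_i^q ) / (1 - q), for q != 1."""
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1): non-logarithmic."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
# As q -> 1, both generalizations approach the Shannon entropy of p.
for q in (1.01, 1.001):
    print(q, renyi(p, q), tsallis(p, q))
print(shannon(p))
```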
