Search results for: and 054 disregarding shannon entropy
Number of results: 16,840,017
This manual describes the Fortran 90 implementation of maximum-entropy basis functions. The main ingredients of the theory are presented, and then the numerical implementation is touched upon. Instructions on the installation and execution of the code, as well as on writing an interface to the library are presented. Each program module and the most important functions in each module are discuss...
Abstract The long-standing problem of Shannon entropy estimation in data streams (assuming the strict Turnstile model) is now an easy task by using the technique proposed in this paper. Essentially speaking, in order to estimate the Shannon entropy with a guaranteed ν-additive accuracy, it suffices to estimate the αth frequency moment, where α = 1 − Δ, with a guaranteed ε-multiplicative accuracy,...
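The reduction described above can be sketched numerically: writing the Rényi entropy of order α in terms of the αth frequency moment F_α = Σᵢ fᵢ^α gives H_α = log(F_α / F₁^α) / (1 − α), which converges to the empirical Shannon entropy as α → 1. A minimal sketch (the toy stream, function names, and choices of Δ are illustrative, not from the paper):

```python
import math

# Toy stream summarized as item frequencies f_i (what the Turnstile model tracks).
freqs = [5, 3, 2, 2, 1, 1, 1]

def shannon(freqs):
    """Exact empirical Shannon entropy (nats)."""
    total = sum(freqs)
    return -sum(f / total * math.log(f / total) for f in freqs)

def renyi_from_moment(freqs, alpha):
    """Renyi entropy via the alpha-th frequency moment F_alpha = sum_i f_i^alpha:
    H_alpha = log(F_alpha / F1^alpha) / (1 - alpha)."""
    F1 = sum(freqs)
    F_alpha = sum(f ** alpha for f in freqs)
    return math.log(F_alpha / F1 ** alpha) / (1 - alpha)

exact = shannon(freqs)
for delta in (0.1, 0.01, 0.001):
    approx = renyi_from_moment(freqs, 1 - delta)
    print(f"alpha = 1 - {delta}: |H_alpha - H| = {abs(approx - exact):.6f}")
```

The printed gap shrinks roughly linearly in Δ, which is why an accurate estimate of a single frequency moment near α = 1 suffices for an additive entropy guarantee.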
We consider the problem of approximating the empirical Shannon entropy of a high-frequency data stream under the relaxed strict-turnstile model, when space limitations make exact computation infeasible. An equivalent measure of entropy is the Rényi entropy that depends on a constant α. This quantity can be estimated efficiently and unbiasedly from a low-dimensional synopsis called an α-stable da...
Compressed Counting (CC) [22] was recently proposed for estimating the αth frequency moments of data streams, where 0 < α ≤ 2. CC can be used for estimating Shannon entropy, which can be approximated by certain functions of the αth frequency moments as α → 1. Monitoring Shannon entropy for anomaly detection (e.g., DDoS attacks) in large networks is an important task. This paper presents a new a...
By combining the explicit formula of the Shannon informational entropy H(X, Y) for two random variables X and Y with the entropy H(f(X)) of f(X), where f(·) is a real-valued differentiable function, we have shown that the density of the amount of information in the Shannon sense involved in a non-random differentiable function is defined by the logarithm of the absolute value of its de...
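The role of log |f′| described above can be checked in closed form for a linear map, via the standard change-of-variable identity for differential entropy, h(f(X)) = h(X) + E[log |f′(X)|]. A minimal sketch assuming a Gaussian X and f(x) = ax (both choices are illustrative, not taken from the paper):

```python
import math

def gaussian_entropy(sigma):
    """Differential entropy of N(0, sigma^2) in nats: 0.5 * log(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

# Linear map f(x) = a*x has |f'(x)| = |a| everywhere, and maps
# X ~ N(0, sigma^2) to f(X) ~ N(0, (a*sigma)^2). The identity
# h(f(X)) = h(X) + E[log|f'(X)|] therefore reduces to h(aX) = h(X) + log|a|.
sigma, a = 2.0, 3.0
lhs = gaussian_entropy(abs(a) * sigma)            # h(f(X)) computed directly
rhs = gaussian_entropy(sigma) + math.log(abs(a))  # h(X) + E[log|f'(X)|]
print(lhs, rhs)  # the two sides agree
```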
The weak law of large numbers implies that, under mild assumptions on the source, the Rényi entropy per produced symbol converges (in probability) towards the Shannon entropy rate. This paper quantifies the speed of this convergence for sources with independent (but not iid) outputs, generalizing and improving the result of Holenstein and Renner (IEEE Trans. Inform. Theory, 2011). (a) we charac...
The distribution functions of codon usage probabilities, computed over all the available GenBank data for 40 eukaryotic biological species and five chloroplasts, are best fitted by the sum of a constant, an exponential, and a linear function in the rank of usage. For mitochondria the analysis is not conclusive. These functions are characterized by parameters that strongly depend on the total gu...
Beyond the local constraints imposed by grammar, words concatenated in long sequences carrying a complex message show statistical regularities that may reflect their linguistic role in the message. In this paper, we perform a systematic statistical analysis of the use of words in literary English corpora. We show that there is a quantitative relation between the role of content words in literar...
Shannon entropy is the most crucial foundation of Information Theory and has proven effective in many fields such as communications. Rényi entropy and Chernoff information are two other popular measures of information with wide applications. Mutual information is effective for measuring channel information because it reflects the relation between output variables an...
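The way mutual information ties channel output to input can be illustrated through the identity I(X;Y) = H(X) + H(Y) − H(X,Y), evaluated here on a binary symmetric channel (the channel and its crossover probability are illustrative, not from the paper):

```python
import math

def H(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Binary symmetric channel: uniform input X, output Y flipped with prob eps.
eps = 0.1
joint = {(0, 0): 0.5 * (1 - eps), (0, 1): 0.5 * eps,
         (1, 0): 0.5 * eps,       (1, 1): 0.5 * (1 - eps)}

px = [0.5, 0.5]                                   # input marginal
py = [joint[(0, 0)] + joint[(1, 0)],              # output marginal
      joint[(0, 1)] + joint[(1, 1)]]

I = H(px) + H(py) - H(list(joint.values()))
print(f"I(X;Y) = {I:.4f} bits")  # 1 - H2(0.1) ~ 0.5310, the BSC capacity
```

With a uniform input this mutual information equals the channel capacity 1 − H₂(ε), so the identity recovers a textbook result exactly.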
We consider the problem of approximating the empirical Shannon entropy of a high-frequency data stream when space limitations make exact computation infeasible. It is known that α-dependent quantities such as the Rényi and Tsallis entropies can be estimated efficiently and unbiasedly from low-dimensional α-stable data sketches. An approximation to the Shannon entropy can be obtained from either ...
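The α-dependence mentioned above is easy to see directly: both the Rényi and Tsallis entropies reduce to the Shannon entropy in the limit α → 1, which is what makes either one a usable proxy. A small sketch on a toy distribution (all values illustrative):

```python
import math

p = [0.4, 0.3, 0.2, 0.1]  # toy empirical distribution

def shannon(p):
    return -sum(pi * math.log(pi) for pi in p)

def renyi(p, alpha):
    """Renyi entropy: log(sum p_i^alpha) / (1 - alpha)."""
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis(p, alpha):
    """Tsallis entropy: (1 - sum p_i^alpha) / (alpha - 1)."""
    return (1 - sum(pi ** alpha for pi in p)) / (alpha - 1)

for alpha in (0.9, 0.99, 0.999):
    print(alpha, renyi(p, alpha), tsallis(p, alpha), shannon(p))
```

Both columns approach the Shannon value as α → 1, but at different rates, which is the trade-off the abstract alludes to when choosing between the two estimators.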