Search results for: shannon entropy

Number of results: 72543

2015
P. K. Bhatia, Surender Singh

After the generalization of Shannon's entropy measure by Rényi in 1961, many generalized versions of the Shannon measure were proposed by different authors. The Shannon measure can be obtained from these generalized measures asymptotically. A natural question arises in the parametric generalization of Shannon's entropy measure: what is the role of the parameter(s) from an application point of view? In the pre...
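For reference (standard material, not part of the abstract), the Rényi family and the asymptotic recovery of the Shannon measure mentioned above are:

```latex
H(p) = -\sum_i p_i \log p_i,
\qquad
H_\alpha(p) = \frac{1}{1-\alpha}\,\log\sum_i p_i^{\alpha},
\quad \alpha > 0,\ \alpha \neq 1,
\qquad
\lim_{\alpha \to 1} H_\alpha(p) = H(p).
```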

Journal: CoRR 2016
Anna Choromanska, Krzysztof Choromanski, Mariusz Bojarski

We analyze the performance of LOMtree, a top-down multiclass classification algorithm for decision tree learning recently proposed by Choromanska and Langford (2014) for efficiently solving classification problems with a very large number of classes. The algorithm optimizes online an objective function that simultaneously controls the depth of the tree and its statistica...

2014
N. Sukumar

This manual describes the Fortran 90 implementation of maximum-entropy basis functions. The main ingredients of the theory are presented, and the numerical implementation is then touched upon. Instructions on installing and executing the code, as well as on writing an interface to the library, are presented. Each program module and the most important functions in each module are discuss...
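The manual itself is not reproduced here, but the construction such a library implements can be sketched. The following is a minimal 1D illustration only, not the library's Fortran 90 interface: the function name, the restriction to one dimension, the uniform prior, and the Newton solver are all assumptions of this sketch. Maximum-entropy basis functions take the form φ_i(x) ∝ exp(−λ(x_i − x)), with λ chosen so the basis reproduces linear functions.

```python
import numpy as np

def maxent_basis_1d(x, nodes, tol=1e-12, max_iter=50):
    """Illustrative max-entropy basis functions at point x for 1D nodes
    (hypothetical helper, not the library's API).

    phi_i(x) = exp(-lam * (x_i - x)) / Z, with lam found by Newton's method
    on the convex dual so that sum_i phi_i(x) * x_i = x (linear reproduction).
    """
    d = nodes - x                 # shifted node coordinates x_i - x
    lam = 0.0
    for _ in range(max_iter):
        w = np.exp(-lam * d)
        phi = w / w.sum()
        g = -phi @ d                    # f'(lam) for f(lam) = log sum exp(-lam d_j)
        h = phi @ d**2 - (phi @ d)**2   # f''(lam) = variance of d under phi
        if abs(g) < tol:
            break
        lam -= g / h                    # Newton step; f is convex in lam
    return phi

nodes = np.array([0.0, 0.5, 1.0])
phi = maxent_basis_1d(0.3, nodes)
print(phi, phi.sum(), phi @ nodes)      # partition of unity; phi @ nodes ≈ 0.3
```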

Introduction: The English language teaching curriculum is very important for effective teaching and learning. Given the importance of teaching English as one of the most important communication tools, it is necessary to develop a curriculum that can accommodate all necessary English language teaching needs. Therefore, the purpose of this study is to analyze t...

2013
Peter Clifford, Ioana Cosma

We consider the problem of approximating the empirical Shannon entropy of a high-frequency data stream under the relaxed strict-turnstile model, when space limitations make exact computation infeasible. An equivalent measure of entropy is the Rényi entropy, which depends on a constant α. This quantity can be estimated efficiently and unbiasedly from a low-dimensional synopsis called an α-stable da...
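The α-stable sketch itself is not reconstructed here. As a minimal illustration of the equivalence this abstract relies on (the distribution and α values below are arbitrary choices), the Rényi entropy H_α approaches the Shannon entropy as α → 1:

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum p_i log p_i (natural log), ignoring zero entries."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def renyi_entropy(p, alpha):
    """H_alpha(p) = log(sum p_i^alpha) / (1 - alpha), for alpha > 0, alpha != 1."""
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

p = np.array([0.5, 0.25, 0.125, 0.125])
for alpha in (2.0, 1.5, 1.1, 1.01, 1.001):
    print(f"H_{alpha}: {renyi_entropy(p, alpha):.6f}")
print(f"Shannon: {shannon_entropy(p):.6f}")   # the H_alpha values approach this
```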

2015
Guy Jumarie

By combining the explicit formula of the Shannon informational entropy H(X, Y) for two random variables X and Y with the entropy H(f(X)) of f(X), where f(·) is a real-valued differentiable function, we have shown that the density of the amount of information in the Shannon sense involved in a non-random differentiable function is defined by the logarithm of the absolute value of its de...
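The identity being invoked appears to be the change-of-variables rule for differential entropy; a hedged reconstruction, stated here for monotone differentiable f (the abstract is truncated, so the paper's exact form may differ):

```latex
% If Y = f(X) with f differentiable and monotone, then
% p_Y(y) = p_X(x)/|f'(x)| at y = f(x), and therefore
H(f(X)) = H(X) + \mathbb{E}\bigl[\log \lvert f'(X) \rvert\bigr],
```

so the pointwise contribution of f to the information balance is log|f'(x)|, matching the abstract's description.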

Journal: CoRR 2017
Maciej Skorski

The weak law of large numbers implies that, under mild assumptions on the source, the Rényi entropy per produced symbol converges (in probability) towards the Shannon entropy rate. This paper quantifies the speed of this convergence for sources with independent (but not iid) outputs, generalizing and improving the result of Holenstein and Renner (IEEE Trans. Inform. Theory, 2011). (a) We charac...
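The first sentence compresses a standard argument, sketched here for the i.i.d. special case (the paper itself treats independent but non-identically distributed outputs): by the weak law of large numbers, the sample entropy rate converges in probability,

```latex
-\frac{1}{n}\log P(X_1,\dots,X_n)
  = \frac{1}{n}\sum_{i=1}^{n}\bigl(-\log P(X_i)\bigr)
  \;\xrightarrow{\;\mathbb{P}\;}\;
  \mathbb{E}\bigl[-\log P(X_1)\bigr] = H(X_1).
```

Relating this sample quantity to the Rényi entropy per symbol, and quantifying the speed of convergence, is the subject of the paper.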

Journal: Advances in Complex Systems 2002
Marcelo A. Montemurro, Damián H. Zanette

Beyond the local constraints imposed by grammar, words concatenated in long sequences carrying a complex message show statistical regularities that may reflect their linguistic role in the message. In this paper, we perform a systematic statistical analysis of the use of words in literary English corpora. We show that there is a quantitative relation between the role of content words in literar...
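As a hedged illustration of the kind of word-level statistic such an analysis rests on (this is not the paper's own measure; the tokenization and the sample text are placeholders), one can compute the Shannon entropy of an empirical word-frequency distribution:

```python
import math
from collections import Counter

def word_entropy(text):
    """Shannon entropy (bits per word) of the empirical word-frequency distribution."""
    words = text.lower().split()
    counts = Counter(words)
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

sample = "the quick brown fox jumps over the lazy dog the fox"
print(word_entropy(sample))
```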

Journal: CoRR 2017
Shanyun Liu, Rui She, Jiaxun Lu, Pingyi Fan

Shannon entropy is the most fundamental concept of information theory and has been proven effective in many fields, such as communications. Rényi entropy and Chernoff information are two other popular measures of information with wide applications. Mutual information is effective for measuring channel information because it reflects the relation between output variables an...
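For reference, the measures named in this abstract have standard definitions (discrete case; these are textbook forms, not taken from the truncated text):

```latex
H(X) = -\sum_x p(x)\log p(x),
\qquad
H_\alpha(X) = \frac{1}{1-\alpha}\log\sum_x p(x)^{\alpha},
\\
I(X;Y) = H(X) + H(Y) - H(X,Y),
\qquad
C(P,Q) = \max_{0\le\lambda\le 1}\Bigl(-\log\sum_x P(x)^{\lambda}\,Q(x)^{1-\lambda}\Bigr).
```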

2015
Cafer Caferov, Baris Kaya, Ryan O'Donnell, A. C. Cem Say

Let p be an unknown probability distribution on [n] := {1, 2, ..., n} that we can access via two kinds of queries: a SAMP query takes no input and returns x ∈ [n] with probability p[x]; a PMF query takes as input x ∈ [n] and returns the value p[x]. We consider the task of estimating the entropy of p to within ±∆ (with high probability). For the usual Shannon entropy H(p), we show that Ω(log n/...
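The paper's own estimators and lower bounds are not reproduced here. The following minimal sketch only shows why the query model is natural for entropy estimation: sampling x via SAMP and evaluating −log p[x] via PMF averages an unbiased estimate of H(p), since E[−log p[X]] = H(p). The oracle construction from a known p is purely for testing.

```python
import math
import random

def make_oracles(p):
    """Build SAMP and PMF oracles from a known distribution p (testing only;
    in the paper's model p is unknown and only these queries are allowed)."""
    xs = list(range(len(p)))
    def samp():
        return random.choices(xs, weights=p, k=1)[0]
    def pmf(x):
        return p[x]
    return samp, pmf

def estimate_entropy(samp, pmf, m):
    """Average -log2 p[x] over m SAMP draws; unbiased for H(p) in bits."""
    return sum(-math.log(pmf(samp()), 2) for _ in range(m)) / m

p = [0.5, 0.25, 0.125, 0.125]
samp, pmf = make_oracles(p)
print(estimate_entropy(samp, pmf, m=100_000))   # ≈ 1.75 bits
```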

[Chart: number of search results per year]