Search results for: shannon

Number of results: 9904

2007
Hussnain Ali, Talha J. Ahmad, Shoab A. Khan

Various segmentation algorithms have been proposed for better classification of the highly nonstationary heart sounds. This paper proposes an improved segmentation technique based on Shannon Energy calculation of the phonocardiogram using adaptive windows. The major focus of the research has been on a simple yet comprehensive signal representation as well as on extracting most information from ...
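For orientation, the core quantity here is the average Shannon energy of the normalized signal, E = -(1/N) Σ x_i² log(x_i²). A minimal sketch is below; it uses a fixed-length sliding window with arbitrary example parameters (win_len, hop, and the function name are our assumptions), whereas the paper's contribution is the adaptive window.

```python
import numpy as np

def shannon_energy_envelope(x, win_len=512, hop=256, eps=1e-12):
    """Average Shannon energy of a heart-sound (PCG) signal over sliding windows.

    Illustrative only: fixed-length windows stand in for the paper's
    adaptive windows; win_len and hop are arbitrary example values."""
    x = np.asarray(x, dtype=float)
    x = x / (np.max(np.abs(x)) + eps)                 # normalize amplitude to [-1, 1]
    envelope = []
    for start in range(0, len(x) - win_len + 1, hop):
        w = x[start:start + win_len]
        envelope.append(-np.mean(w**2 * np.log(w**2 + eps)))  # average Shannon energy
    return np.array(envelope)
```

The Shannon energy envelope attenuates very low- and very high-amplitude samples relative to medium-intensity ones, which is what makes it a common basis for heart-sound segmentation.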

Journal: IEEE Trans. Information Theory, 1994
Tamás Linder, Ram Zamir

New results are proved on the convergence of the Shannon lower bound to the rate distortion function as the distortion decreases to zero. The key convergence result is proved using a fundamental property of informational divergence. As a corollary, it is shown that the Shannon lower bound is asymptotically tight for norm-based distortions, when the source vector has a finite differential entrop...
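For reference, the Shannon lower bound for a difference distortion measure ρ, its familiar scalar squared-error special case, and the convergence statement the abstract refers to are usually written as follows (standard textbook forms, not quoted from the paper):

```latex
\begin{align*}
  R(D) \;&\ge\; R_{\mathrm{SLB}}(D) \;=\; h(X) \;-\; \max_{\mathbb{E}[\rho(Z)] \le D} h(Z), \\
  R(D) \;&\ge\; h(X) - \tfrac{1}{2}\log\!\left(2\pi e D\right)
        \qquad \text{(scalar squared-error distortion)}, \\
  &\lim_{D \to 0}\bigl(R(D) - R_{\mathrm{SLB}}(D)\bigr) \;=\; 0
        \qquad \text{(asymptotic tightness as the distortion vanishes).}
\end{align*}
```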

Journal: Des. Codes Cryptography, 2017
K. Ashik Mathew, Patric R. J. Östergård

The Shannon capacity of a graph G is defined as c(G) = sup_{d≥1} (α(G^d))^{1/d}, where α(G) is the independence number of G. The Shannon capacity of the cycle C_5 on 5 vertices was determined by Lovász in 1979, but the Shannon capacity of a cycle C_p for general odd p remains one of the most notorious open problems in information theory. By prescribing stabilizers for the independent sets in C_p^d and usi...
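Since the strong-product construction behind this definition is easy to get wrong, here is a small self-check in code: it verifies the classical size-5 independent set {(i, 2i mod 5)} in the strong product C_5 ⊠ C_5, which already gives the lower bound c(C_5) ≥ √5 that Lovász matched from above. The function names are ours; the independent set itself is standard.

```python
from itertools import combinations

def adj_c5(a, b):
    """Adjacency in the 5-cycle C5 (vertices 0..4)."""
    return (a - b) % 5 in (1, 4)

def adj_strong(p, q):
    """Adjacency in the strong product C5 x C5: distinct vertices whose
    coordinates are pairwise equal or adjacent in C5."""
    if p == q:
        return False
    return all(a == b or adj_c5(a, b) for a, b in zip(p, q))

# Classical independent set of size 5 in the strong product, giving c(C5) >= 5**0.5.
S = [(i, (2 * i) % 5) for i in range(5)]
assert all(not adj_strong(p, q) for p, q in combinations(S, 2))
print(len(S), "pairwise non-adjacent vertices -> c(C5) >= 5**0.5")
```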

Journal: CoRR, 2017
Shanyun Liu, Rui She, Jiaxun Lu, Pingyi Fan

Shannon entropy is the most crucial foundation of Information Theory and has proven effective in many fields, such as communications. Rényi entropy and Chernoff information are two other popular measures of information with wide applications. Mutual information is effective for measuring channel information because it reflects the relation between output variables an...
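For concreteness, the measures named here have simple closed forms. A short sketch follows (function names are ours, and Chernoff information is found by a plain grid search over the mixing exponent rather than a proper optimizer):

```python
import numpy as np

def shannon_entropy(p):
    """H(P) = -sum p_i log2 p_i, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1); tends to Shannon entropy as alpha -> 1."""
    p = np.asarray(p, dtype=float)
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def chernoff_information(p, q, grid=1001):
    """Chernoff information C(P, Q) = -min over lambda in [0, 1] of log2 sum p^lambda q^(1-lambda)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    lams = np.linspace(0.0, 1.0, grid)
    return -min(np.log2(np.sum(p**lam * q**(1 - lam))) for lam in lams)

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p), renyi_entropy(p, 2), chernoff_information(p, [1/3, 1/3, 1/3]))
```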

Journal: American Journal of Physiology. Renal Physiology, 2004
James A Schafer

This essay looks at the historical significance of four APS classic papers that are freely available online: Jolliffe N, Shannon JA, and Smith HW. The excretion of urine in the dog. III. The use of non-metabolized sugars in the measurement of the glomerular filtrate. Am J Physiol 100: 301-312, 1932 (http://ajplegacy.physiology.org/cgi/reprint/100/2/301). Shannon JA. The excretion of inulin by t...

Journal: CoRR, 2017
Vasile Patrascu

Shannon entropy was defined for probability distributions, and its use was later extended to measuring the uncertainty of knowledge in systems with complete information. In this article, it is proposed to extend the use of Shannon entropy to under-defined or over-defined information systems. To be able to use Shannon entropy, the information is normalized by an affine transformation. The const...
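The snippet does not spell out the affine transformation, so the sketch below only illustrates the general idea: map a weight vector whose entries need not sum to one (or even be non-negative) to a probability vector and then apply Shannon entropy. The particular transform is our assumption, not the article's construction.

```python
import numpy as np

def normalized_shannon_entropy(w, eps=1e-12):
    """Shannon entropy after an illustrative affine normalization.

    Assumption, not the article's method: shift entries to be non-negative,
    then rescale so they sum to one, and evaluate Shannon entropy in bits."""
    w = np.asarray(w, dtype=float)
    shift = max(0.0, -np.min(w))                              # affine shift to non-negativity
    p = (w + shift) / (np.sum(w) + len(w) * shift + eps)      # rescale to a probability vector
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(normalized_shannon_entropy([0.6, 0.6, 0.3]))   # over-defined: weights sum to 1.5
```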

2003
Robert L. Benedetto

It is well known that the Haar and Shannon wavelets in L^2(R) are at opposite extremes, in the sense that the Haar wavelet is localized in time but not in frequency, whereas the Shannon wavelet is localized in frequency but not in time. We present a rich setting where the Haar and Shannon wavelets coincide and are localized both in time and in frequency. More generally, if R is replaced by a grou...
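For reference, the two extremes contrasted here are usually written as follows (standard definitions, stated only to make the time/frequency trade-off concrete):

```latex
% Standard definitions of the Haar and Shannon wavelets.
\[
  \psi_{\mathrm{Haar}}(t) =
  \begin{cases}
     1, & 0 \le t < \tfrac12,\\
    -1, & \tfrac12 \le t < 1,\\
     0, & \text{otherwise},
  \end{cases}
  \qquad
  \psi_{\mathrm{Shannon}}(t) = 2\,\mathrm{sinc}(2t) - \mathrm{sinc}(t),
  \quad \mathrm{sinc}(t) = \frac{\sin(\pi t)}{\pi t}.
\]
% The Haar wavelet is supported on [0, 1] in time, while the Fourier transform of the
% Shannon wavelet is supported on \pi \le |\xi| \le 2\pi in frequency.
```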

Journal: CoRR, 2007
Brooke Shrader, Anthony Ephremides

We study and compare the Shannon capacity region and the stable throughput region for a random access system in which source nodes multicast their messages to multiple destination nodes. Under an erasure channel model which accounts for interference and allows for multipacket reception, we first characterize the Shannon capacity region. We then consider a queueing-theoretic formulation and char...
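As a simple point of reference for the erasure model (not the paper's capacity region): a point-to-point packet erasure channel with erasure probability ε has capacity 1 − ε packets per slot, and the rate of a common message multicast to K receivers can never exceed the worst single-user capacity.

```latex
\[
  C_{\mathrm{p2p}} = 1 - \epsilon \ \ \text{packets per slot},
  \qquad
  C_{\mathrm{multicast}} \;\le\; \min_{1 \le i \le K} \bigl(1 - \epsilon_i\bigr).
\]
```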

2013
Ian Scott MacKenzie

The three most common variations of Fitts’ index of difficulty are the Fitts formulation, the Welford formulation, and the Shannon formulation. A recent paper by Hoffmann [1] critiqued the three and concluded that the Fitts and Welford formulations are valid and that the Shannon formulation is invalid. In this paper, we challenge Hoffmann’s position regarding the Shannon formulation. It is argu...
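The three formulations at issue are one-liners, which makes the dispute easy to state precisely. A small sketch (variable names are ours; A is the movement amplitude and W the target width):

```python
import math

def id_fitts(a, w):
    """Fitts (1954) index of difficulty: log2(2A / W)."""
    return math.log2(2 * a / w)

def id_welford(a, w):
    """Welford formulation: log2(A / W + 0.5)."""
    return math.log2(a / w + 0.5)

def id_shannon(a, w):
    """Shannon formulation (MacKenzie): log2(A / W + 1), non-negative for all A, W > 0."""
    return math.log2(a / w + 1)

# Example: amplitude A = 256 px, target width W = 32 px.
print(id_fitts(256, 32), id_welford(256, 32), id_shannon(256, 32))
```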

2009
Travis Gagie, Yakov Nekrich

A common complaint about adaptive prefix coding is that it is much slower than static prefix coding. Karpinski and Nekrich recently took an important step towards resolving this: they gave an adaptive Shannon coding algorithm that encodes each character in O(1) amortized time and decodes it in O(log H) amortized time, where H is the empirical entropy of the input string s. For comparison, Gagie'...
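As background, the static (non-adaptive) Shannon code that such algorithms build on can be sketched in a few lines: the i-th symbol, in order of decreasing probability, receives the first ⌈log2(1/p_i)⌉ bits of the binary expansion of the cumulative probability of the preceding symbols. This is only the classical construction, not the amortized-O(1) adaptive scheme discussed in the abstract.

```python
import math

def shannon_code(probs):
    """Static Shannon code: sort symbols by decreasing probability; each symbol gets
    the first ceil(log2(1/p)) bits of the binary expansion of the cumulative
    probability of the preceding symbols.  Classical non-adaptive construction."""
    items = sorted(probs.items(), key=lambda kv: -kv[1])
    code, cum = {}, 0.0
    for sym, p in items:
        length = math.ceil(-math.log2(p))
        bits, frac = [], cum
        for _ in range(length):
            frac *= 2
            if frac >= 1:
                bits.append('1')
                frac -= 1
            else:
                bits.append('0')
        code[sym] = ''.join(bits)
        cum += p
    return code

# Codeword lengths are ceil(log2(1/p)), so the code is within 1 bit/symbol of the entropy.
print(shannon_code({'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}))
```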
