Search results for: shannon capacity

Number of results: 286802

Journal: :CoRR 2013
M. A. Sorokina S. K. Turitsyn

We compute the Shannon capacity of nonlinear channels with regenerative elements. Conditions are found under which the capacity of such nonlinear channels is higher than the Shannon capacity of the classical linear additive white Gaussian noise channel. We develop a general scheme for designing the proposed channels and apply it to a particular nonlinear sine-mapping. The upper bound for regeneration...
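As a rough sketch of the noise-suppression idea behind a sine-mapping regenerator (the grid spacing, 4-PAM alphabet and noise level below are illustrative assumptions, not the authors' exact design): a transfer function that subtracts a scaled sine of the received value pulls samples back toward the symbol grid and shrinks the effective noise variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumed, not the authors' exact design): 4-PAM symbols
# on a unit grid, disturbed by additive Gaussian noise.
delta = 1.0                                  # symbol grid spacing
symbols = delta * np.arange(4)
x = rng.choice(symbols, size=100_000)
y = x + rng.normal(0.0, 0.1 * delta, size=x.shape)   # received, pre-regeneration

def sine_regenerator(y, delta):
    """Pull received values toward multiples of `delta`.

    For y = k*delta + eps with small eps, sin(2*pi*y/delta) ~ 2*pi*eps/delta,
    so the sine correction cancels the noise eps to first order.
    """
    return y - (delta / (2.0 * np.pi)) * np.sin(2.0 * np.pi * y / delta)

z = sine_regenerator(y, delta)
print("noise variance before regeneration:", np.var(y - x))
print("noise variance after regeneration :", np.var(z - x))
```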

Journal: :Electr. J. Comb. 2013
Ashik Mathew Kizhakkepallathu Patric R. J. Östergård Alexandru Popa

The Shannon capacity of a graph G is c(G) = sup_{d≥1} α(G^d)^{1/d}, where α(G) is the independence number of G. The Shannon capacity of the Kneser graph KG_{n,r} was determined by Lovász in 1979, but little is known about the Shannon capacity of the complement of that graph when r does not divide n. The complement of the Kneser graph KG_{n,2} is also called the triangular graph T_n. The graph T_n has th...
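A concrete way to read this definition is to compute the lower bound α(G^d)^{1/d} ≤ c(G) by brute force on a small graph. The sketch below (graph choice, d = 2 and all helper names are my own illustration, not from the paper) builds the strong power of the 5-cycle and recovers α(C5^2) = 5, hence the lower bound √5.

```python
import itertools

def strong_power(n, edges, d):
    """Strong power G^d: vertices are d-tuples over range(n); two distinct
    tuples are adjacent iff every coordinate pair is equal or adjacent in G."""
    adj = [[False] * n for _ in range(n)]
    for u, v in edges:
        adj[u][v] = adj[v][u] = True
    verts = list(itertools.product(range(n), repeat=d))
    pedges = [(a, b) for a, b in itertools.combinations(verts, 2)
              if all(x == y or adj[x][y] for x, y in zip(a, b))]
    return verts, pedges

def independence_number(verts, edges):
    """Exact independence number by exhaustive search (small graphs only)."""
    nbrs = {v: set() for v in verts}
    for a, b in edges:
        nbrs[a].add(b)
        nbrs[b].add(a)
    best = 0
    def extend(chosen, candidates):
        nonlocal best
        best = max(best, len(chosen))
        for i, v in enumerate(candidates):
            if not (nbrs[v] & chosen):
                extend(chosen | {v}, candidates[i + 1:])
    extend(set(), verts)
    return best

# Pentagon C5: alpha(C5^2)^(1/2) = sqrt(5) is a lower bound on c(C5),
# and Lovász (1979) showed it is in fact the exact capacity.
c5_edges = [(i, (i + 1) % 5) for i in range(5)]
verts, pedges = strong_power(5, c5_edges, 2)
alpha = independence_number(verts, pedges)
print(alpha, alpha ** 0.5)   # -> 5 2.236...
```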

Journal: :Nature communications 2014
M A Sorokina S K Turitsyn

Since Shannon derived the seminal formula for the capacity of the additive linear white Gaussian noise channel, it has commonly been interpreted as the ultimate limit of error-free information transmission rate. However, a capacity above the corresponding linear channel limit can be achieved when noise is suppressed using nonlinear elements; that is, the regenerative function not available in...

Journal: :IEEE Trans. Information Theory 1979
László Lovász

It is proved that the Shannon zero-error capacity of the pentagon is √5. The method is then generalized to obtain upper bounds on the capacity of an arbitrary graph. A well-characterized, and in a sense easily computable, function is introduced which bounds the capacity from above and equals the capacity in a large number of cases. Several results are obtained on the capacity of special g...
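The function referred to is now known as the Lovász theta number ϑ(G). For odd cycles it has a closed form, ϑ(C_n) = n·cos(π/n)/(1 + cos(π/n)); the snippet below simply evaluates that formula (an illustration of the bound, not the paper's semidefinite machinery) and checks that ϑ(C_5) = √5, the pentagon's capacity.

```python
import math

def lovasz_theta_odd_cycle(n):
    """Closed-form Lovász theta number of the odd cycle C_n."""
    c = math.cos(math.pi / n)
    return n * c / (1.0 + c)

for n in (5, 7, 9):
    print(f"theta(C_{n}) = {lovasz_theta_odd_cycle(n):.6f}")

# For the pentagon the bound is tight: theta(C_5) = sqrt(5).
print(math.isclose(lovasz_theta_odd_cycle(5), math.sqrt(5)))
```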

2008
Adam B. Barrett M.C.W. van Rossum

arXiv:0803.1923v1 [q-bio.NC] 13 Mar 2008. Shannon Information Capacity of Discrete Synapses. Adam B. Barrett and M.C.W. van Rossum, Institute for Adaptive and Neural Computation, University of Edinburgh, 5 Forrest Hill, Edinburgh EH1 2QL, UK. There is evidence that biological synapses have only a fixed number of discrete weight states. Memory storage with such synapses behaves quite differe...
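For a discrete memoryless channel, such as an idealized synapse with a small number of readable weight states, the Shannon capacity can be estimated numerically with the standard Blahut–Arimoto iteration. The sketch below uses a made-up three-state transition matrix, not the synapse model of the paper.

```python
import numpy as np

def blahut_arimoto(P, iters=500):
    """Capacity (in bits per channel use) of a discrete memoryless channel
    with transition matrix P[x, y] = Pr(output y | input x)."""
    def kl_rows(P, q):
        # D(P[x, :] || q) for every input x, in bits; zero entries contribute 0.
        ratio = np.where(P > 0, P / q, 1.0)
        return np.sum(np.where(P > 0, P * np.log2(ratio), 0.0), axis=1)

    p = np.full(P.shape[0], 1.0 / P.shape[0])   # input distribution, start uniform
    for _ in range(iters):
        d = kl_rows(P, p @ P)                   # p @ P is the output distribution
        w = p * 2.0 ** d                        # Blahut-Arimoto reweighting
        p = w / w.sum()
    return float(np.sum(p * kl_rows(P, p @ P)))

# Toy "3-state synapse" read-out channel (assumed numbers, not from the paper):
# each stored level is read back correctly 80% of the time, else as a neighbour.
P = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])
print("capacity ~", round(blahut_arimoto(P), 3), "bits per use")
```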

Journal: :Combinatorica 1998
Noga Alon

For an undirected graph G = (V,E), let G^n denote the graph whose vertex set is V^n, in which two distinct vertices (u_1, u_2, ..., u_n) and (v_1, v_2, ..., v_n) are adjacent iff for all i between 1 and n either u_i = v_i or u_iv_i ∈ E. The Shannon capacity c(G) of G is the limit lim_{n→∞} (α(G^n))^{1/n}, where α(G^n) is the maximum size of an independent set of vertices in G^n. We show that there are graphs G and H ...

2004
G. J. Foschini J. M. Kahn A. L. Moustakas L. Balents S. H. Simon

Using the fact that the Shannon capacity C of a Rayleigh model of wireless channels is a linear statistic of the channel matrix, we calculate its variance var[C]. We find that the expected value ⟨C⟩ of the Shannon capacity is typical in the model considered, that is, the coefficient of variation √var[C]/⟨C⟩ is small. The efficiency of a wireless channel is determined by its Shannon capacity, C = log_2...
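As a rough illustration of this statistic, the Monte Carlo sketch below (antenna counts, SNR and trial count are arbitrary choices of mine) samples i.i.d. complex Gaussian channel matrices H, evaluates C = log2 det(I + (ρ/n_t)·H·Hᴴ), and compares the spread of C to its mean.

```python
import numpy as np

rng = np.random.default_rng(1)

def rayleigh_mimo_capacity_samples(n_t=4, n_r=4, snr=10.0, trials=20_000):
    """Monte Carlo samples of C = log2 det(I + (snr/n_t) H H^H), in bit/s/Hz,
    for an i.i.d. Rayleigh (circularly symmetric complex Gaussian) channel H."""
    caps = np.empty(trials)
    for i in range(trials):
        H = (rng.standard_normal((n_r, n_t)) +
             1j * rng.standard_normal((n_r, n_t))) / np.sqrt(2.0)
        M = np.eye(n_r) + (snr / n_t) * (H @ H.conj().T)
        _, logdet = np.linalg.slogdet(M)    # M is Hermitian positive definite
        caps[i] = logdet / np.log(2.0)
    return caps

caps = rayleigh_mimo_capacity_samples()
mean, std = caps.mean(), caps.std()
print(f"mean capacity ~ {mean:.2f} bit/s/Hz")
print(f"std / mean    ~ {std / mean:.3f}  (small, so the mean is 'typical')")
```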

Journal: :Periodica Mathematica Hungarica 2021

A symmetric variant of the Shannon capacity of graphs is defined and computed.

Journal: :CoRR 2015
Jon Montalban Jon Barrueco Pablo Angueira Jerrold D. Prothero

The Shannon upper bound places a limit on the error-free information transmission rate (capacity) of a noisy channel. It has stood for over sixty years, and underlies both theoretical and practical work in the telecommunications industry. This upper bound arises from the Shannon-Hartley law, which has two parameters: the available bandwidth and the signal-to-noise power ratio. However, aside fr...
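For a concrete reading of those two parameters, the snippet below evaluates the Shannon–Hartley law C = B·log2(1 + S/N) for a couple of illustrative bandwidth/SNR pairs (the numbers are examples of mine, not from the paper).

```python
import math

def shannon_hartley(bandwidth_hz, snr_db):
    """Capacity in bit/s from bandwidth (Hz) and signal-to-noise ratio (dB)."""
    return bandwidth_hz * math.log2(1.0 + 10.0 ** (snr_db / 10.0))

# Illustrative pairs: a telephone-grade channel and a 20 MHz wireless channel.
for bw, snr_db in [(3.1e3, 30.0), (20e6, 20.0)]:
    print(f"B = {bw:.3g} Hz, SNR = {snr_db:.0f} dB -> C ~ {shannon_hartley(bw, snr_db):.3g} bit/s")
```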
