Search results for: including shannon

Number of results: 981379

Journal: CoRR 2013
Kazuho Watanabe

Direct evaluation of the rate-distortion function has rarely been achieved when it is strictly greater than its Shannon lower bound. In this paper, we consider the rate-distortion function for the distortion measure defined by an ε-insensitive loss function. We first present the Shannon lower bound applicable to any source distribution with finite differential entropy. Then, focusing on the Lap...
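For context, the Shannon lower bound referred to in this abstract is usually stated as follows for a difference distortion measure d(x, x̂) = ρ(x − x̂); this is the standard textbook form, not a formula taken from the paper itself:

```latex
R(D) \;\ge\; h(X) \;-\; \max_{Z:\ \mathbb{E}[\rho(Z)] \le D} h(Z),
```

where h(·) denotes differential entropy and the maximum is over auxiliary random variables Z whose expected distortion does not exceed D.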

Journal: CoRR 2008
Shigeru Furuichi

Shannon entropy [1] is one of the fundamental quantities in classical information theory and is uniquely determined by the Shannon-Khinchin axioms or the Faddeev axiom. One-parameter extensions of the Shannon entropy have been studied by many researchers [2]; the Rényi entropy [3] and the Tsallis entropy [4] are famous examples. In the paper [5], the uniqueness theorem for the Tsallis entropy was proven. See also ...

2016
Yuta Sakai Ken-ichi Iwata

The paper examines relationships between the Shannon entropy and the ℓα-norm for n-ary probability vectors, n ≥ 2. More precisely, we investigate the tight bounds of the ℓα-norm with a fixed Shannon entropy, and vice versa. As applications of the results, we derive the tight bounds between the Shannon entropy and several information measures which are determined by the ℓα-norm. Moreover, we app...

2007
John Watrous

1.2 REMARK ON INTERPRETATIONS OF THE SHANNON ENTROPY
There are standard ways to interpret the Shannon entropy. For instance, the quantity H(p) can be viewed as a measure of the amount of uncertainty in a random experiment described by the probability mass function p, or as a measure of the amount of information one gains by learning the value of such an experiment. Indeed, it is possible to sta...
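The uncertainty interpretation described above can be made concrete with a minimal sketch of H(p); the function name below is ours, not from the notes:

```python
import math

def shannon_entropy(p):
    # H(p) = -sum_i p_i * log2(p_i), with the convention 0 * log 0 = 0,
    # measured in bits.
    return -sum(x * math.log2(x) for x in p if x > 0)

# A uniform distribution over 4 outcomes has maximal uncertainty: 2 bits.
print(shannon_entropy([0.25] * 4))  # -> 2.0
```

A deterministic outcome, e.g. `shannon_entropy([1.0])`, gives 0 bits: learning the result conveys no information.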

Journal: :Combinatorica 1998
Noga Alon

For an undirected graph G = (V, E), let G^n denote the graph whose vertex set is V^n, in which two distinct vertices (u1, u2, ..., un) and (v1, v2, ..., vn) are adjacent iff for all i between 1 and n either ui = vi or ui vi ∈ E. The Shannon capacity c(G) of G is the limit lim_{n→∞} (α(G^n))^{1/n}, where α(G^n) is the maximum size of an independent set of vertices in G^n. We show that there are graphs G and H ...
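The definitions in this abstract can be checked on the classic example G = C5 (the 5-cycle): α(C5) = 2, yet the product graph C5^2 contains an independent set of size 5 > 2^2, so c(G) can exceed α(G). A brute-force sketch (helper names are ours):

```python
from itertools import combinations

def c5_adjacent(u, v):
    # Adjacency in the 5-cycle C5 on vertices 0..4.
    return (u - v) % 5 in (1, 4)

def strong_adjacent(a, b):
    # Adjacency in G^n: distinct tuples whose coordinates are
    # pairwise equal or adjacent in C5.
    if a == b:
        return False
    return all(x == y or c5_adjacent(x, y) for x, y in zip(a, b))

def is_independent(vertices):
    # No two vertices of the set may be adjacent.
    return not any(strong_adjacent(a, b) for a, b in combinations(vertices, 2))

# alpha(C5) = 2, found by brute force over all vertex subsets.
alpha_c5 = max(k for k in range(1, 6)
               if any(is_independent([(v,) for v in subset])
                      for subset in combinations(range(5), k)))

# A well-known independent set of size 5 in C5^2.
big_set = [(i, (2 * i) % 5) for i in range(5)]
print(alpha_c5, is_independent(big_set))  # -> 2 True
```

Since α(C5^2) ≥ 5, the definition gives c(C5) ≥ 5^{1/2} = √5 ≈ 2.236, which Lovász famously showed is tight.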

2008
Piotr Garbaczewski

Shannon information entropy is a natural measure of probability (de)localization and thus (un)predictability in various procedures of data analysis for model systems. We pay particular attention to links between the Shannon entropy and the related Fisher information notion, which jointly account for the shape and extension of continuous probability distributions. Classical, dynamical and random...

2003
Masanori Ohya

It is von Neumann who opened the window for today's Information epoch. He defined quantum entropy, including Shannon's information, more than 20 years ahead of Shannon, and he introduced a concept of what computation means mathematically. In this paper I will report two works that we have recently done, one of which is on a quantum algorithm in a generalized sense solving the SAT problem (one of the NP-com...

Journal: Journal of Medical Signals and Sensors 0
Saeedeh Mobasheri Hamid Behnam Parisa Rangraz Jahan Tavakkoli

High-intensity focused ultrasound (HIFU) is a novel treatment modality used by scientists and clinicians in recent decades. This modality has had great and significant success as a noninvasive surgery technique applicable in tissue ablation therapy and cancer treatment. In this study, radio frequency (RF) ultrasound signals were acquired and registered in three stages: before, during, a...

Journal: بوم شناسی کشاورزی 0
سعید پاک طینت سئیج حسین صادقی نامقی مجتبی حسینی سعید هاتفی

The predatory mites of the suborder Prostigmata are important natural enemies of spider mites. In this study, the abundance and biodiversity of predatory mites of the superfamilies Raphignathoidea, Bdelloidea and Erythraeoidea in pome-fruit orchards of Mashhad (educational orchard of the Agricultural College of Ferdowsi University of Mashhad, agricultural research centre of Torogh and Laeen, Torghabe and Sha...

2011
Yihong Wu

Compressed sensing is a signal processing technique to encode analog sources by real numbers rather than bits, dealing with efficient recovery of a real vector from the information provided by linear measurements. By leveraging the prior knowledge of the signal structure (e.g., sparsity) and designing efficient non-linear reconstruction algorithms, effective compression is achieved by taking a ...
