Search results for: information entropy

Number of results: 1,203,337

Journal: IEEE Trans. Information Theory, 1993
Ram Zamir, Meir Feder

We prove the following generalization of the Entropy Power Inequality: h(Ax) ≥ h(Ax̃), where h(·) denotes (joint) differential entropy, x = (x_1, ..., x_n) is a random vector with independent components, x̃ = (x̃_1, ..., x̃_n) is a Gaussian vector with independent components such that h(x̃_i) = h(x_i), i = 1, ..., n, and A is any matrix. This generalization of the entropy-power inequality is appli...

Journal: Int. J. General Systems, 2006
J. Liang, Z. Shi, D. Li, Mark J. Wierman

Key Laboratory of Ministry of Education for Computation Intelligence and Chinese Information Processing, School of Computer and Information Technology, Shanxi University, Taiyuan 030006, People's Republic of China; Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, The Chinese Academy of Sciences, Beijing 100080, People's Republic of China; Creighton Unive...

1996
Roger Fawcett

Entropy coding is defined to be the compression of a stream of symbols taken from a known symbol set, where the probability of occurrence of any symbol from the set at any given point in the stream is constant and independent of any known occurrences of any other symbols. Shannon and Fano showed that the information of such a sequence could be calculated. When measured in bits, the information re...
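The per-symbol information measure this abstract refers to can be sketched in a few lines of Python. This is purely illustrative (the two-symbol source and its probabilities are invented, not taken from the paper): each symbol s contributes -log2 P(s) bits, and the total is the sum over the stream.

```python
import math

def stream_information_bits(stream, probs):
    """Total Shannon information (in bits) of a symbol stream, assuming
    symbols occur i.i.d. with the given fixed probabilities."""
    return sum(-math.log2(probs[s]) for s in stream)

# Hypothetical two-symbol source: P(a) = 0.9, P(b) = 0.1
probs = {"a": 0.9, "b": 0.1}
stream = "aaaaaaaaab"  # nine 'a's and one 'b'
info = stream_information_bits(stream, probs)  # ≈ 4.69 bits total
```

Note that the total (≈ 4.69 bits) is well under the 10 bits a naive one-bit-per-symbol encoding would use, which is exactly the compression opportunity entropy coding exploits.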

2016
Yuta Sakai, Ken-ichi Iwata

The paper examines relationships between the conditional Shannon entropy and the expectation of the ℓα-norm for joint probability distributions. More precisely, we investigate the tight bounds of the expectation of the ℓα-norm with a fixed conditional Shannon entropy, and vice versa. As applications of the results, we derive the tight bounds between the conditional Shannon entropy and several informati...
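The two quantities this abstract relates can be computed directly for a small joint distribution. The sketch below is illustrative only (the distribution is invented, and this is not code from the paper): H(X|Y) = -Σ p(x,y) log2 p(x|y), and the expected ℓα-norm is E_Y[(Σ_x p(x|y)^α)^(1/α)].

```python
import math

# Hypothetical joint distribution P(X, Y) over two binary variables
# (purely illustrative; not data from the paper).
joint = {("x1", "y1"): 0.25, ("x2", "y1"): 0.25,
         ("x1", "y2"): 0.40, ("x2", "y2"): 0.10}

def _marginal_y(joint):
    py = {}
    for (_, y), p in joint.items():
        py[y] = py.get(y, 0.0) + p
    return py

def conditional_entropy_bits(joint):
    """H(X|Y) = -sum_{x,y} p(x,y) log2 p(x|y)."""
    py = _marginal_y(joint)
    return -sum(p * math.log2(p / py[y]) for (_, y), p in joint.items() if p > 0)

def expected_alpha_norm(joint, alpha):
    """E_Y[ ||P(X|Y)||_alpha ] = sum_y p(y) (sum_x p(x|y)^alpha)^(1/alpha)."""
    py = _marginal_y(joint)
    return sum(pyv * sum((p / pyv) ** alpha
                         for (_, y), p in joint.items() if y == yv) ** (1.0 / alpha)
               for yv, pyv in py.items())

H = conditional_entropy_bits(joint)        # ≈ 0.861 bits
N = expected_alpha_norm(joint, alpha=2.0)  # ≈ 0.766
```

Fixing one of these quantities and asking how large or small the other can be is, per the abstract, the question the paper answers with tight bounds.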

2014
Andreas Holzinger, Matthias Hörtenhuber, Christopher C. Mayer, Martin Bachler, Siegfried Wassertheurer, Armando J. Pinho, David Koslicki

In the real world, we are confronted not only with complex and high-dimensional data sets, but usually with noisy, incomplete and uncertain data, where the application of traditional methods of knowledge discovery and data mining always entails the danger of modeling artifacts. Originally, information entropy was introduced by Shannon (1949), as a measure of uncertainty in the data. But up to th...

A hybrid censoring scheme is a mixture of type I and type II censoring schemes. When $n$ items are placed on a life test, the experiment terminates under the type I or type II hybrid censoring scheme when either a pre-fixed censoring time T or the r-th failure (1 ≤ r ≤ n, with r fixed) is observed first or last, respectively. In this paper, we investigate the decomposition of entropy in both hybrid cen...

Journal: CoRR, 2013
Jean-François Bercher

In this communication, we describe some interrelations between generalized q-entropies and a generalized version of Fisher information. In information theory, the de Bruijn identity links the Fisher information and the derivative of the entropy. We show that this identity can be extended to generalized versions of entropy and Fisher information. More precisely, a generalized Fisher information ...

Journal: Information, 2012
Christopher D. Fiorillo

It has been proposed that the general function of the brain is inference, which corresponds quantitatively to the minimization of uncertainty (or the maximization of information). However, there has been a lack of clarity about exactly what this means. Efforts to quantify information have been in agreement that it depends on probabilities (through Shannon entropy), but there has long been a dis...

Journal: IEEE Trans. Information Theory, 1998
Zhen Zhang, Raymond W. Yeung

Given n discrete random variables X = {X_1, ..., X_n}, associated with any subset α of {1, 2, ..., n} there is a joint entropy H(X_α), where X_α = {X_i : i ∈ α}. This can be viewed as a function defined on the power set of {1, 2, ..., n} taking values in [0, +∞). We call this function the entropy function of X. The nonnegativity of the joint entropies implies that this function is nonnegative; the nonnegativity of the conditional join...
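For a small example, the entropy function described in this abstract can be tabulated exhaustively. The sketch below is an illustration, not code from the paper; the joint distribution (two independent fair bits plus their XOR) is a standard toy example chosen here because it is small enough to enumerate every subset α.

```python
import math
from itertools import combinations, product

def entropy_bits(dist):
    """Shannon entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, idx):
    """Marginal distribution of the variables at the positions in idx."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

# Hypothetical example: X1, X2 independent fair bits, X3 = X1 XOR X2.
joint = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product([0, 1], repeat=2)}

# Entropy function: H(X_alpha) for every subset alpha of {0, 1, 2},
# with H(empty set) = 0 by convention.
n = 3
H = {alpha: (entropy_bits(marginal(joint, alpha)) if alpha else 0.0)
     for r in range(n + 1) for alpha in combinations(range(n), r)}
```

Every value in H is nonnegative, and H is monotone under subset inclusion, which are instances of the basic inequalities the abstract goes on to discuss.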

Chart of the number of search results per year

Click on the chart to filter the results by publication year