Entropy Versus Pairwise Independence
Author

Abstract
We give lower bounds on the joint entropy of n pairwise independent random variables. We show that if the variables have no dominant value (their min-entropies are bounded away from zero) then this joint entropy grows as Ω(log n). This rate of growth is known to be best possible. If k-wise independence is assumed, we obtain an optimal Ω(k log n) lower bound for not too large k. We also show that the joint entropy of an arbitrary family of pairwise independent random variables grows as Ω(min(L, √(log(2 + L)))), where L is the sum of the entropies of the variables in the family. We expect that the √log in this expression can be improved to log. We also prove a tight Ω(log log n) lower bound on the joint entropy of n balanced Bernoulli trials with pairwise correlation bounded away from 1.
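The Ω(log n) lower bound is tight because of the classical XOR construction: from k independent fair bits one obtains 2^k − 1 pairwise independent, individually uniform bits whose joint entropy is only k = log₂(n + 1). The sketch below (an illustration, not code from the paper; the function name is ours) verifies pairwise independence exhaustively and exhibits the logarithmic joint entropy:

```python
from itertools import product, combinations
from math import log2

def derived_bits(seed):
    """Given a tuple of k independent fair bits, return the 2^k - 1 bits
    X_S = XOR of the seed bits indexed by each nonempty subset S of
    {0, ..., k-1}. These are pairwise independent and uniform."""
    k = len(seed)
    subsets = [s for m in range(1, k + 1) for s in combinations(range(k), m)]
    return tuple(sum(seed[i] for i in S) % 2 for S in subsets)

k = 3
outcomes = [derived_bits(seed) for seed in product([0, 1], repeat=k)]
n = len(outcomes[0])  # n = 2^k - 1 = 7 variables

# Each variable is uniform over the seed distribution...
for i in range(n):
    assert sum(o[i] for o in outcomes) * 2 == len(outcomes)
# ...and every pair of variables is independent: each of the four
# value combinations (a, b) occurs with probability exactly 1/4.
for i, j in combinations(range(n), 2):
    for a, b in product([0, 1], repeat=2):
        count = sum(1 for o in outcomes if o[i] == a and o[j] == b)
        assert count * 4 == len(outcomes)

# The map seed -> outcome is injective (the singleton subsets recover
# the seed), so the joint distribution is uniform over 2^k tuples and
# the joint entropy is k bits, versus n = 2^k - 1 bits for the sum of
# the individual entropies.
joint_entropy = log2(len(set(outcomes)))
print(n, joint_entropy)  # 7 variables, joint entropy 3.0 = log2(n + 1)
```

Since every X_S is a nonzero linear function of the seed over GF(2), and the XOR of any two distinct X_S, X_T is again such a function, pairwise independence follows; the construction thus matches the Ω(log n) bound up to constants.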
Similar Articles
Determination of weight vector by using a pairwise comparison matrix based on DEA and Shannon entropy
The relation between the analytic hierarchy process (AHP) and data envelopment analysis (DEA) is a topic of interest to researchers in this branch of applied mathematics. In this paper, we propose a linear programming model that generates a weight (priority) vector from a pairwise comparison matrix. In this method, which is referred to as the E-DEAHP method, we consider each row of the pairwise...
Inaccessible Entropy and its Applications
We summarize the constructions of PRGs from OWFs discussed so far and introduce the notion of inaccessible entropy [HILL99, HRVW09]. Remember that we are trying to construct objects that look random (PRGs) from an assumption about hardness of computation (OWFs). So far we have seen that it is possible to construct PRGs from OWFs if the OWF has some nice structural property. One-way Permutations...
Independent Component Analysis Over Galois Fields
We consider the framework of Independent Component Analysis (ICA) for the case where the independent sources and their linear mixtures all reside in a Galois field of prime order P. Similarities and differences from the classical ICA framework (over the Real field) are explored. We show that a necessary and sufficient identifiability condition is that none of the sources should have a Uniform ...
Conditional independence and natural conditional functions
The concept of conditional independence (CI) within the framework of natural conditional functions (NCFs) is studied. An NCF is a function ascribing natural numbers to possible states of the world; it is the central concept of Spohn's theory of deterministic epistemology. Basic properties of CI within this framework are recalled, and further results analogous to the results concerning probabili...
Causality Discovery with Additive Disturbances: An Information-Theoretical Perspective
We consider causally sufficient acyclic causal models in which the relationship among the variables is nonlinear while disturbances have linear effects, and show that three principles, namely, the causal Markov condition (together with the independence between each disturbance and the corresponding parents), minimum disturbance entropy, and mutual independence of the disturbances, are equivalen...
Published: 2013