Multivariate Dependence beyond Shannon Information

Authors

  • Ryan G. James
  • James P. Crutchfield
Abstract

Accurately determining dependency structure is critical to discovering a system’s causal organization. We recently showed that the transfer entropy fails in a key aspect of this—measuring information flow—due to its conflation of dyadic and polyadic relationships. We extend this observation to demonstrate that this is true of all such Shannon information measures when used to analyze multivariate dependencies. This has broad implications, particularly when employing information to express the organization and mechanisms embedded in complex systems, including the burgeoning efforts to combine complex network theory with information theory. Here, we do not suggest that any aspect of information theory is wrong. Rather, the vast majority of its informational measures are simply inadequate for determining the meaningful dependency structure within joint probability distributions. Therefore, such information measures are inadequate for discovering intrinsic causal relations. We close by demonstrating that such distributions exist across an arbitrary set of variables.
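To make the conflation concrete, here is a minimal sketch in the spirit of the paper's central dyadic/triadic example (the construction below is a reconstruction under standard assumptions, not code from the paper): two joint distributions over three four-valued variables, one built purely from pairwise-shared bits and one built from an XOR constraint plus a three-way shared bit, whose Shannon entropy profiles, and hence all Shannon information measures derived from them, coincide exactly.

```python
# Two joint distributions with identical Shannon entropy profiles but
# different dependency structure (dyadic vs. triadic). Units are bits.
from itertools import product
from math import log2

def H(dist, axes):
    """Shannon entropy (bits) of the marginal over the given variable indices."""
    marg = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in axes)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

# Dyadic: three fair bits a, b, c, each shared by exactly one *pair* of variables.
dyadic = {}
for a, b, c in product((0, 1), repeat=3):
    dyadic[(2 * a + b, 2 * b + c, 2 * c + a)] = 1 / 8

# Triadic: an XOR constraint x ^ y ^ z = 0 plus one bit w shared by all three.
triadic = {}
for x, y, w in product((0, 1), repeat=3):
    z = x ^ y
    triadic[(2 * x + w, 2 * y + w, 2 * z + w)] = 1 / 8

for name, d in (("dyadic", dyadic), ("triadic", triadic)):
    profile = [H(d, ax) for ax in ([0], [1], [2], [0, 1], [0, 2], [1, 2], [0, 1, 2])]
    print(name, profile)
# Both print [2.0, 2.0, 2.0, 3.0, 3.0, 3.0, 3.0]; every marginal entropy,
# and therefore every Shannon measure built from them, is identical, yet
# only the dyadic distribution consists purely of pairwise relationships.
```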

Similar Articles

Shannon Entropy and Mutual Information for Multivariate Skew-Elliptical Distributions

The entropy and mutual information index are important concepts developed by Shannon in the context of information theory. They have been widely studied in the case of the multivariate normal distribution. We first extend these tools to the full symmetric class of multivariate elliptical distributions and then to the more flexible families of multivariate skew-elliptical distributions. We study...
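For reference, the multivariate normal baseline that these extensions generalize has closed forms: H(X) = (1/2) log((2 pi e)^k |Sigma|) and, for jointly Gaussian blocks, I(X;Y) = H(X) + H(Y) - H(X,Y). A minimal numpy sketch of those standard formulas (illustrative, not code from the cited paper):

```python
# Closed-form entropy and mutual information for a multivariate normal,
# the baseline case the skew-elliptical extensions build on. Units: nats.
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy of a Gaussian with covariance matrix cov."""
    k = cov.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** k * np.linalg.det(cov))

def gaussian_mi(cov, dx):
    """I(X;Y) where X is the first dx coordinates and Y is the rest."""
    return (gaussian_entropy(cov[:dx, :dx]) + gaussian_entropy(cov[dx:, dx:])
            - gaussian_entropy(cov))

rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])
print(gaussian_mi(cov, 1))          # matches the bivariate closed form:
print(-0.5 * np.log(1 - rho**2))    # -0.5 * log(1 - rho^2)
```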

The copula approach to characterizing dependence structure in neural populations.

The question as to the role that correlated activity plays in the coding of information in the brain continues to be one of the most important in neuroscience. One approach to understanding this role is to formally model the ensemble responses as multivariate probability distributions. We have previously introduced alternatives to linear assumptions of multivariate Gaussian dependence for spike...
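The core copula idea can be sketched in a few lines: a copula (here a Gaussian one) carries the dependence, while the marginals are chosen freely. The Poisson spike-count marginals and all parameters below are hypothetical illustrations, not the cited paper's model:

```python
# Sketch of the copula approach: dependence modeled separately from marginals.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rho = 0.5
cov = [[1.0, rho], [rho, 1.0]]

# 1. Draw from the copula: correlated Gaussians pushed through the standard
#    normal CDF become correlated Uniform(0, 1) variables.
z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)
u = stats.norm.cdf(z)

# 2. Apply arbitrary marginal inverse CDFs, here Poisson spike counts with
#    different firing rates; the dependence survives the transformation.
counts_a = stats.poisson.ppf(u[:, 0], mu=3.0)
counts_b = stats.poisson.ppf(u[:, 1], mu=8.0)
print(np.corrcoef(counts_a, counts_b)[0, 1])  # positive, inherited from rho
```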

Dynamic Bayesian Information Measures

This paper introduces measures of information for Bayesian analysis when the support of the data distribution is truncated progressively. The focus is on lifetime distributions where the support is truncated at the current age t ≥ 0. Notions of uncertainty and information are presented and operationalized by Shannon entropy, Kullback-Leibler information, and mutual information. Dynamic updatings...
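A sketch of the kind of dynamic measure described, assuming the standard residual-entropy definition H(f; t) = -∫_t^∞ (f(x)/S(t)) log(f(x)/S(t)) dx; the exponential example, where memorylessness makes the measure constant in t, is our own illustration:

```python
# Numeric check of the dynamic (residual) Shannon entropy of a lifetime
# distribution whose support is truncated at the current age t.
import numpy as np
from scipy import integrate, stats

def residual_entropy(dist, t):
    surv = dist.sf(t)  # S(t): probability of surviving past age t
    def integrand(x):
        g = dist.pdf(x) / surv  # density of the remaining lifetime
        return -g * np.log(g) if g > 0 else 0.0
    val, _ = integrate.quad(integrand, t, np.inf)
    return val

lam = 2.0
expo = stats.expon(scale=1 / lam)
for t in (0.0, 1.0, 5.0):
    # All values equal 1 - log(lam) ~ 0.307: the exponential's residual
    # entropy does not change as the support is truncated (memorylessness).
    print(t, residual_entropy(expo, t))
```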

Unique Additive Information Measures: Boltzmann-Gibbs-Shannon, Fisher and Beyond

It is proved that the only additive and isotropic information measure that can depend on the probability distribution and also on its first derivative is a linear combination of the Boltzmann-Gibbs-Shannon and Fisher information measures. Further possibilities are investigated, too.
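As a concrete instance, both ingredients of such a linear combination have Gaussian closed forms: H = (1/2) log(2 pi e sigma^2) for the Boltzmann-Gibbs-Shannon term and I_F = 1/sigma^2 for the Fisher term. A numeric sanity check of those standard formulas (illustrative sketch only):

```python
# The two ingredients of the additive measure, evaluated on a Gaussian:
#   BGS entropy   H[p]   = -∫ p(x) log p(x) dx
#   Fisher info   I_F[p] =  ∫ p'(x)^2 / p(x) dx
import numpy as np

sigma = 1.5
x = np.linspace(-12.0, 12.0, 200_001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

bgs = -(p * np.log(p)).sum() * dx      # Riemann sum for the BGS term (nats)
dp = np.gradient(p, x)                 # numerical derivative p'(x)
fisher = (dp**2 / p).sum() * dx        # Riemann sum for the Fisher term

print(bgs, 0.5 * np.log(2 * np.pi * np.e * sigma**2))  # both ~ 1.8244
print(fisher, 1 / sigma**2)                            # both ~ 0.4444
```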

Non-Gaussian interaction information: estimation, optimization and diagnostic application of triadic wave resonance

Non-Gaussian multivariate probability distributions, derived from climate and geofluid statistics, allow for nonlinear correlations between linearly uncorrelated components, due to joint Shannon negentropies. Triadic statistical dependence under pair-wise (total or partial) independence is thus possible. Synergy or interaction information among triads is estimated. We formulate an optimization ...
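Interaction information itself is straightforward to compute from a joint distribution via II(X;Y;Z) = I(X;Y|Z) - I(X;Y). The binary XOR triple below is the textbook example of triadic dependence under pairwise independence (a standalone illustration, not the paper's geofluid data):

```python
# Interaction information of the XOR triple: every pair of variables is
# independent, yet the triad carries one full bit of synergy.
from itertools import product
from math import log2

def H(dist, axes):
    """Shannon entropy (bits) of the marginal over the given variable indices."""
    marg = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in axes)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

# Uniform over the four (x, y, z) triples satisfying z = x XOR y.
xor = {(x, y, x ^ y): 1 / 4 for x, y in product((0, 1), repeat=2)}

I_xy = H(xor, [0]) + H(xor, [1]) - H(xor, [0, 1])          # 0 bits: independent
I_xy_given_z = (H(xor, [0, 2]) + H(xor, [1, 2])
                - H(xor, [0, 1, 2]) - H(xor, [2]))          # 1 bit, given z
print("I(X;Y) =", I_xy, " II(X;Y;Z) =", I_xy_given_z - I_xy)  # synergy = 1 bit
```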

Journal:
  • Entropy

Volume 19, Issue -

Pages -

Publication date: 2017