Search results for: mutual information
Number of results: 1182120
Mutual information I(X;Y) is a useful quantity in information theory for estimating how much information the random variable Y holds about X. One way to define mutual information is by comparing the joint distribution of X and Y with the product of their marginals through the Kullback-Leibler (KL) divergence. If the two distributions are close to each other, there will be almost no information leakage from X, since the variables are then nearly independent. In the discrete setting it has a nice interpret...
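For reference, the definition this abstract sketches can be written compactly using standard notation (this is the textbook form, not a detail taken from the paper itself): with joint distribution p(x,y) and marginals p(x), p(y),

I(X;Y) = D_{\mathrm{KL}}\!\left(P_{XY} \,\|\, P_X \otimes P_Y\right) = \sum_{x,y} p(x,y)\,\log \frac{p(x,y)}{p(x)\,p(y)},

which is zero exactly when X and Y are independent, matching the "almost no leakage" reading above.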
We derive a well-defined renormalized version of mutual information that allows us to estimate the dependence between continuous random variables in the important case when one is deterministically dependent on the other. This situation is relevant for feature extraction, where the goal is to produce a low-dimensional effective description of a high-dimensional system. Our approach enables the discovery of collective physical sy...
In the context of parameter estimation and model selection, it is only quite recently that a direct link between the Fisher information and information-theoretic quantities has been exhibited. We give an interpretation of this link within the standard framework of information theory. We show that in the context of population coding, the mutual information between the activity of a large array o...
We analyze the Stanford Natural Language Inference (SNLI) corpus in an investigation of bias and stereotyping in NLP data. The human-elicitation protocol employed in the construction of the SNLI makes it prone to amplifying bias and stereotypical associations, which we demonstrate statistically (using pointwise mutual information) and with qualitative examples.
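For context, the pointwise mutual information (PMI) statistic used in this kind of corpus analysis has a standard form (the general definition, not a detail of the SNLI study): for two items w_1 and w_2 with corpus-estimated probabilities,

\mathrm{PMI}(w_1, w_2) = \log \frac{p(w_1, w_2)}{p(w_1)\,p(w_2)},

i.e. the log ratio of how often the pair actually co-occurs to how often it would co-occur if the items were independent; mutual information is the expectation of PMI over the joint distribution.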
Distributional semantic models, deriving vector-based word representations from patterns of word usage in corpora, have many useful applications (Turney and Pantel 2010). Recently, there has been interest in compositional distributional models, which derive vectors for phrases from representations of their constituent words (Mitchell and Lapata 2010). Often, the values of distributional vectors...
Originality, a key aspect of creativity, is difficult to measure. We tested the relationship between originality and similarity in two semantic spaces: latent semantic analysis (LSA) and pointwise mutual information (PMI). Similarity in both spaces was negatively correlated with human judgments of originality of responses on a test of divergent thinking. PMI was correlated more strongly both wi...
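As a rough illustration of how PMI is estimated from raw corpus counts in studies like this one, here is a minimal Python sketch; the function name and the toy counts are hypothetical and are not taken from the paper.

import math

def pmi(pair_count, w1_count, w2_count, n_pairs, n_words):
    """Pointwise mutual information of two words from raw corpus counts.

    pair_count: number of times the two words co-occur (e.g. within a window)
    w1_count, w2_count: individual word token frequencies
    n_pairs: total number of co-occurrence pairs observed
    n_words: total number of word tokens
    """
    p_joint = pair_count / n_pairs   # estimate of p(w1, w2)
    p_w1 = w1_count / n_words        # estimate of p(w1)
    p_w2 = w2_count / n_words        # estimate of p(w2)
    return math.log(p_joint / (p_w1 * p_w2))

# Toy example with made-up counts, for illustration only.
print(pmi(pair_count=30, w1_count=500, w2_count=400, n_pairs=10_000, n_words=100_000))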
[Chart: number of search results per year]