Search results for: point wise mutual information

Number of results: 1650254

Journal: British Journal of Mathematics & Computer Science, 2014

2011
Pedro Latorre Carmona José Martínez Sotoca Filiberto Pla Frederick Kin Hing Phoa José M. Bioucas-Dias

This paper presents a supervised feature selection method applied to regression problems. The selection method uses a dissimilarity matrix originally developed for classification problems, whose applicability is extended here to regression; the matrix is built using the conditional mutual information between features with respect to a continuous relevant variable that represents the regression function. A...

2011
Nicola Di Mauro Floriana Esposito

This paper represents a further step in combining information-theoretic tools with relational learning. We show how mutual information can be used to find relevant relational features.

2015
Sergio Verdú

Rényi entropy and Rényi divergence evidence a long track record of usefulness in information theory and its applications. Alfred Rényi never got around to generalizing mutual information in a similar way. In fact, in the literature there are several possible ways to accomplish such generalization, most notably those suggested by Suguru Arimoto, Imre Csiszár, and Robin Sibson. We collect several...

2014
Yanis Linge Cécile Dumas Sophie Lambert-Lacroix

In the domain of side-channel attacks, various statistical tools have succeeded in retrieving a secret key, such as the Pearson coefficient or mutual information. In this paper we propose to study the Maximal Information Coefficient (MIC), a non-parametric method introduced by Reshef et al. [13] to compare two random variables. The MIC is based on the mutual information but it is easie...

2009
Benjamin Van Durme Ashwin Lall

Recent work has led to the ability to perform space efficient, approximate counting over large vocabularies in a streaming context. Motivated by the existence of data structures of this type, we explore the computation of associativity scores, otherwise known as pointwise mutual information (PMI), in a streaming context. We give theoretical bounds showing the impracticality of perfect online PM...
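The associativity score mentioned above can be illustrated with a small offline sketch. This is a toy, batch-mode computation (the toy corpus and the adjacent-token co-occurrence window are illustrative assumptions), not the streaming, approximate-counting method the paper studies:

```python
import math
from collections import Counter

# Hypothetical toy corpus; whitespace tokenization is an assumption.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count unigrams and adjacent-bigram co-occurrences.
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
n_uni = sum(unigrams.values())
n_bi = sum(bigrams.values())

def pmi(x, y):
    """PMI(x, y) = log2( p(x, y) / (p(x) * p(y)) )."""
    p_xy = bigrams[(x, y)] / n_bi
    p_x = unigrams[x] / n_uni
    p_y = unigrams[y] / n_uni
    return math.log2(p_xy / (p_x * p_y))

# Positive PMI: "the" and "cat" co-occur more often than chance predicts.
print(round(pmi("the", "cat"), 3))
```

A streaming variant would replace the exact `Counter`s with space-efficient approximate counters, which is exactly where the theoretical bounds discussed in the abstract come in.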

2018
Fatih Cakir Kun He Sarah Adel Bargal Stan Sclaroff

Binary vector embeddings enable fast nearest neighbor retrieval in large databases of high-dimensional objects, and play an important role in many practical applications, such as image and video retrieval. We study the problem of learning binary vector embeddings under a supervised setting, also known as hashing. We propose a novel supervised hashing method based on optimizing an information-th...

2005
Michael Berens Zheng Zhao

As a theoretical basis of mRMR feature selection, we consider a more general feature-selection criterion, maximum dependency (MaxDep). In this case, we select the feature set Sm = {f1, f2, ..., fm}, whose joint statistical distribution is maximally dependent on the distribution of the classification variable c. A convenient way to measure this statistical dependency is mutual information,
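In the usual formulation (a sketch consistent with the snippet, not quoted from the paper), the MaxDep criterion selects the size-m feature subset whose joint distribution carries the most information about the class variable c:

```latex
S_m^{*} = \arg\max_{\substack{S_m \subseteq F \\ |S_m| = m}} I(S_m; c),
\qquad
I(S_m; c) = H(c) - H(c \mid S_m)
```

Because estimating the joint distribution of many features is hard in practice, mRMR approximates this objective with pairwise mutual-information terms.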

2004
Ori Mosenzon

Consider two random variables, X and Y. The mutual relation between the variables can vary from complete independence to complete dependency, when one variable is a deterministic function of the other. The measure of mutual information I(X; Y) quantifies the amount of dependency between X and Y, but states nothing about its nature. In this work we try to capture this dependency by using ...
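For discrete X and Y, the quantity referred to above has the standard definition (a textbook formula, included here for context):

```latex
I(X; Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}
        = H(X) - H(X \mid Y)
```

It is zero exactly when X and Y are independent, and it reaches H(X) in the deterministic case X = f(Y), since then H(X | Y) = 0.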
