Search results for: point wise mutual information
Number of results: 1,650,254
This paper presents a supervised feature selection method for regression problems. The method uses a dissimilarity matrix originally developed for classification problems, whose applicability is extended here to regression; the matrix is built using the conditional mutual information between features with respect to a continuous relevant variable that represents the regression function. A...
This paper presents a further step in combining information-theoretic tools with relational learning. We show how mutual information can be used to find relevant relational features.
Rényi entropy and Rényi divergence have a long track record of usefulness in information theory and its applications. Alfréd Rényi never got around to generalizing mutual information in a similar way. In fact, the literature offers several possible ways to accomplish such a generalization, most notably those suggested by Suguru Arimoto, Imre Csiszár, and Robin Sibson. We collect several...
In the domain of side-channel attacks, various statistical tools have succeeded in retrieving a secret key, such as the Pearson correlation coefficient or mutual information. In this paper we propose to study the Maximal Information Coefficient (MIC), a non-parametric method introduced by Reshef et al. [13] to compare two random variables. The MIC is based on mutual information but is easie...
Recent work has led to the ability to perform space efficient, approximate counting over large vocabularies in a streaming context. Motivated by the existence of data structures of this type, we explore the computation of associativity scores, otherwise known as pointwise mutual information (PMI), in a streaming context. We give theoretical bounds showing the impracticality of perfect online PM...
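The associativity score mentioned in this abstract, pointwise mutual information, compares the joint probability of a pair with the product of its marginals. A minimal batch sketch over a toy corpus of co-occurrence pairs (the corpus and names here are illustrative; the streaming sketch-based counting the abstract describes is not reproduced):

```python
import math
from collections import Counter

# Toy (word, context) co-occurrence pairs; hypothetical data for illustration.
pairs = [("new", "york"), ("new", "york"), ("new", "car"),
         ("old", "car"), ("old", "york")]

pair_counts = Counter(pairs)
x_counts = Counter(x for x, _ in pairs)
y_counts = Counter(y for _, y in pairs)
n = len(pairs)

def pmi(x, y):
    """PMI(x, y) = log p(x, y) / (p(x) p(y)), from empirical counts."""
    p_xy = pair_counts[(x, y)] / n
    p_x = x_counts[x] / n
    p_y = y_counts[y] / n
    return math.log(p_xy / (p_x * p_y))

# Positive PMI: "new" and "york" co-occur more often than chance predicts.
print(round(pmi("new", "york"), 3))  # → 0.105
```

In a streaming setting the exact counters above would be replaced by approximate counting structures, which is where the error bounds discussed in the abstract come in.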
Binary vector embeddings enable fast nearest neighbor retrieval in large databases of high-dimensional objects, and play an important role in many practical applications, such as image and video retrieval. We study the problem of learning binary vector embeddings under a supervised setting, also known as hashing. We propose a novel supervised hashing method based on optimizing an information-th...
As a theoretical basis of mRMR feature selection, we consider a more general feature-selection criterion, maximum dependency (MaxDep). In this case, we select the feature set Sm = {f1, f2, ..., fm} whose joint statistical distribution is maximally dependent on the distribution of the classification variable c. A convenient way to measure this statistical dependency is mutual information,
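The sentence above trails off where the criterion would be stated; in the standard mRMR formulation (notation assumed from the abstract), MaxDep selects the feature set maximizing its mutual information with the class variable:

```latex
\max_{S_m} \; I(S_m; c), \qquad
I(S_m; c) = \iint p(S_m, c)\,\log \frac{p(S_m, c)}{p(S_m)\,p(c)}\, dS_m\, dc
```

Estimating this joint density for large m is what makes MaxDep impractical directly, motivating the pairwise mRMR approximation.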
Consider two random variables, X and Y. The relation between the variables can vary from complete independence to complete dependency, when one variable is a deterministic function of the other. The mutual information I(X; Y) quantifies the amount of dependency between X and Y, but states nothing about its nature. In this work we try to capture this dependency by using ...