Search results for: mutual information

Number of results: 1182120

2015
Sergio Verdú

Rényi entropy and Rényi divergence evidence a long track record of usefulness in information theory and its applications. Alfred Rényi never got around to generalizing mutual information in a similar way. In fact, in the literature there are several possible ways to accomplish such generalization, most notably those suggested by Suguru Arimoto, Imre Csiszár, and Robin Sibson. We collect several...
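For readers unfamiliar with the quantity being generalized, here is a minimal sketch (not from the paper) of the Rényi entropy of order α, which recovers the Shannon entropy in the limit α → 1:

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), in nats.
    For alpha == 1 we return the Shannon entropy, its limiting value."""
    if alpha == 1:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return math.log(sum(p ** alpha for p in probs)) / (1 - alpha)

# For a uniform distribution, every order gives the same value, log(n):
uniform = [0.25] * 4
print(renyi_entropy(uniform, 0.5), renyi_entropy(uniform, 2))
```

For non-uniform distributions the orders differ: small α weights rare outcomes more, large α weights common ones.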

2014
Yanis Linge, Cécile Dumas, Sophie Lambert-Lacroix

In the domain of side-channel attacks, various statistical tools have succeeded in retrieving a secret key, such as the Pearson coefficient or mutual information. In this paper we propose to study the Maximal Information Coefficient (MIC), a non-parametric method introduced by Reshef et al. [13] to compare two random variables. The MIC is based on the mutual information but it is easie...

2009
Benjamin Van Durme, Ashwin Lall

Recent work has led to the ability to perform space efficient, approximate counting over large vocabularies in a streaming context. Motivated by the existence of data structures of this type, we explore the computation of associativity scores, otherwise known as pointwise mutual information (PMI), in a streaming context. We give theoretical bounds showing the impracticality of perfect online PM...
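For contrast with the approximate streaming setting studied above, exact batch PMI can be computed directly from co-occurrence counts. A minimal sketch over a toy corpus of word pairs (the corpus and names are illustrative):

```python
import math
from collections import Counter

def pmi(xs_ys):
    """Return a dict mapping each (x, y) pair to its pointwise mutual
    information log( p(x, y) / (p(x) p(y)) ), computed from exact counts."""
    n = len(xs_ys)
    pair_c = Counter(xs_ys)
    x_c = Counter(x for x, _ in xs_ys)
    y_c = Counter(y for _, y in xs_ys)
    return {(x, y): math.log((c / n) / ((x_c[x] / n) * (y_c[y] / n)))
            for (x, y), c in pair_c.items()}

# Toy corpus: "new york" co-occurs often, so its PMI is positive.
pairs = [("new", "york")] * 3 + [("new", "car")] + [("old", "car")] * 2
scores = pmi(pairs)
print(scores[("new", "york")])
```

The streaming difficulty comes precisely from the three `Counter` tables here: over a large vocabulary they do not fit in memory, which motivates the approximate counting structures the paper builds on.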

2018
Fatih Cakir, Kun He, Sarah Adel Bargal, Stan Sclaroff

Binary vector embeddings enable fast nearest neighbor retrieval in large databases of high-dimensional objects, and play an important role in many practical applications, such as image and video retrieval. We study the problem of learning binary vector embeddings under a supervised setting, also known as hashing. We propose a novel supervised hashing method based on optimizing an information-th...

2005
Michael Berens, Zheng Zhao

As a theoretical basis of mRMR feature selection, we consider a more general feature-selection criterion, maximum dependency (MaxDep). In this case, we select the feature set S_m = {f_1, f_2, ..., f_m}, of which the joint statistical distribution is maximally dependent on the distribution of the classification variable c. A convenient way to measure this statistical dependency is mutual information,
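A minimal sketch (illustrative, not the authors' implementation) of the single-feature special case of this idea: ranking discrete features by their individual mutual information with the class variable c.

```python
import math
from collections import Counter

def mi(xs, ys):
    """Mutual information I(X; Y) between two discrete sequences, in nats."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * math.log((c / n) / (px[x] * py[y] / n ** 2))
               for (x, y), c in pxy.items())

# Toy data: f1 determines the class, f2 is weakly related noise.
c  = [0, 0, 1, 1, 0, 1]
f1 = [0, 0, 1, 1, 0, 1]   # identical to the class
f2 = [0, 1, 0, 1, 0, 1]   # mostly unrelated to the class
ranked = sorted({"f1": f1, "f2": f2}.items(),
                key=lambda kv: mi(kv[1], c), reverse=True)
print([name for name, _ in ranked])  # f1 ranks first
```

MaxDep proper maximizes the joint dependency of the whole set S_m on c, which this per-feature ranking only approximates; mRMR addresses the gap by penalizing redundancy among selected features.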

2004
Ori Mosenzon

Consider two random variables, X and Y. The mutual relation between the variables can vary from complete independence to complete dependency, where one variable is a deterministic function of the other. The measure of mutual information I(X; Y) quantifies the amount of dependency between X and Y, but states nothing about its nature. In this work we try to capture this dependency by using ...
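To make the two endpoints concrete, a minimal sketch (not from the work itself) computing I(X; Y) from discrete samples: it is zero under independence, and when Y is identical to X it equals the entropy H(X).

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X; Y) = sum_{x,y} p(x,y) log( p(x,y) / (p(x) p(y)) ), in nats."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * math.log((c / n) / (px[x] * py[y] / n ** 2))
               for (x, y), c in pxy.items())

x = [0, 0, 1, 1]
print(mutual_information(x, [0, 1, 0, 1]))  # independent: 0.0
print(mutual_information(x, x))             # identical: log 2 = H(X)
```

Any value between these extremes indicates partial dependency, but, as the abstract notes, the single number says nothing about the dependency's structure.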

M. Bilawal, M. Dilawar Khan, R. Yasir Hussain, U. Akmal

Mutual funds are among the best tools to mobilize savings and investments in an economy, and Pakistan is the pioneer in South Asia, but the industry is not as mature as its age would suggest. This paper examines the performance of closed-ended mutual funds in Pakistan using five different ranking measures over the period January 2009 to December 2013, and the sample consists of o...

2002
Huanxing Yang

We show that the free-riding problem in short-lived teams is not as severe as previously thought. Two critical conditions are: team members can observe each other’s effort periodically, which makes mutual monitoring possible; technology is convex (increasing marginal returns) or has a “completion benefit.” In principal-agent settings, mutual monitoring reduces the necessary wage payment to indu...

Chart of the number of search results per year
