Search results for: mutual information theory mi

Number of results: 1,876,105

2008
Wen Zhang, Taketoshi Yoshida, Xijin Tang

As a sequence of two or more consecutive words that carries the contextual semantics of its individual words, the multi-word attracts much attention in statistical linguistics and has extensive applications in text mining. In this paper, we carried out a series of studies on multi-word extraction from Chinese documents. Firstly, we proposed a new statistical method, augmented mutual information (...
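
Since the excerpt cuts off before the augmented mutual information measure is defined, the following is only a rough sketch of the underlying idea: scoring candidate two-word sequences by plain pointwise mutual information. The function name, the threshold, and whitespace tokenization are illustrative assumptions; the paper's augmented variant and Chinese segmentation are not reproduced.

```python
# Illustrative only: rank adjacent word pairs by pointwise mutual information (PMI).
# The paper's "augmented" mutual information is not defined in the excerpt above.
import math
from collections import Counter

def pmi_bigrams(tokens, min_count=2):
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n_uni = sum(unigrams.values())
    n_bi = max(sum(bigrams.values()), 1)
    scores = {}
    for (w1, w2), c in bigrams.items():
        if c < min_count:
            continue
        p_xy = c / n_bi
        p_x = unigrams[w1] / n_uni
        p_y = unigrams[w2] / n_uni
        scores[(w1, w2)] = math.log(p_xy / (p_x * p_y))
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Example: pmi_bigrams("text mining of text mining corpora".split(), min_count=1)
```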

Journal: CoRR, 2011
Paul M. B. Vitányi

We propose a compression-based version of the empirical entropy of a finite string over a finite alphabet. Whereas previously one considers the naked entropy of (possibly higher order) Markov processes, we consider the sum of the description of the random variable involved plus the entropy it induces. We assume only that the distribution involved is computable. To test the new notion we compare...
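
A loose illustration of the general idea (not the paper's definition): the compressed length of a string, in bits per symbol, gives a practical upper estimate of its empirical entropy and can be compared against plug-in Markov estimates.

```python
# Crude stand-in for a compression-based entropy estimate (bits per byte);
# zlib here is only an example compressor, not the construction in the paper.
import zlib

def compression_entropy_rate(s: str) -> float:
    data = s.encode("utf-8")
    return 8.0 * len(zlib.compress(data, 9)) / len(data)

# A highly repetitive string compresses far below 8 bits per byte:
# compression_entropy_rate("ab" * 10000)
```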

Journal: Physical Review E, 2004

2017
Chunyuan Li, Hao Liu, Changyou Chen, Yunchen Pu, Liqun Chen, Ricardo Henao, Lawrence Carin

Since our paper constrains the correlation of two random variables using information-theoretic measures, we first review the related concepts. For any probability measure π on the random variables x and z, we have the following additive and subtractive relationships among various information measures, including Mutual Information (MI), Variation of Information (VI) and the Conditional Entropy (CE). ...
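
For reference, the standard identities behind these additive and subtractive relationships, written for the entropies of x and z (notation assumed here, since the excerpt truncates before the paper's own equations), are:

```latex
% Standard relationships among MI, CE and VI for random variables x and z.
\begin{aligned}
I(x;z) &= H(x) - H(x\mid z) = H(z) - H(z\mid x) = H(x) + H(z) - H(x,z),\\
\mathrm{VI}(x,z) &= H(x\mid z) + H(z\mid x) = H(x,z) - I(x;z).
\end{aligned}
```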

2004
Thomas Eriksson, Samuel Kim, Hong-Goo Kang, Chungyong Lee

In this paper, we develop a theory for speaker recognition based on information theory. We show that the performance of a speaker recognition system is closely connected to the mutual information between the features and the speaker, and derive upper and lower bounds for the performance. We apply the theory to the case where the speech is coded and transmitted over a packet-based channel, in which packet...
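
The derivation is not included in the excerpt, but the central quantity, the mutual information between (discretized) features and the speaker label, can be estimated from empirical counts. A minimal sketch, assuming vector-quantized feature codes and scikit-learn being available:

```python
# Assumed setup, not the paper's system: empirical MI between quantized
# feature codes and speaker labels, estimated from their joint counts.
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)
speakers = rng.integers(0, 4, size=5000)               # toy speaker labels
codes = speakers * 8 + rng.integers(0, 8, size=5000)   # toy codes correlated with speaker

mi_bits = mutual_info_score(speakers, codes) / np.log(2)
print(f"I(features; speaker) ~ {mi_bits:.2f} bits")
```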

Journal: Physical Review, 2021

We study the relationship between mixed-state entanglement and thermal phase transitions. As a typical example, we compute the holographic entanglement entropy (HEE), mutual information (MI), and entanglement of purification (EoP) across the superconductivity transition. We find that HEE, MI, and EoP can all diagnose the superconducting transition. They are continuous at the critical point, but their first derivative with respect to temperature is discontinuous....
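
For context, the mutual information between two boundary subregions A and B used in such holographic studies is the standard combination of entanglement entropies below; the paper's specific gravitational background is not shown in the excerpt.

```latex
% Standard definition of the mutual information of two subregions A and B.
I(A:B) = S(A) + S(B) - S(A \cup B)
```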

Journal: IEEE Transactions on Information Theory, 2023

The asymptotic mutual information (MI) analysis for multiple-input multiple-output (MIMO) systems over double-scattering channels has achieved engaging results, but the convergence rates of the mean, variance, and distribution of the MI are not yet available in the literature. In this paper, by utilizing large random matrix theory (RMT), we give a central limit theorem (CLT) and derive a closed-form approximation for the mean and vari...
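
The double-scattering channel model and the CLT itself are not given in the excerpt; purely as an illustration of the quantity being analyzed, the sketch below estimates the ergodic MIMO mutual information by Monte Carlo for a simple i.i.d. Rayleigh channel (an assumption, not the paper's model).

```python
# Hedged illustration: Monte Carlo ergodic MI, log2 det(I + (snr/Nt) H H^H),
# for an i.i.d. Rayleigh channel; the double-scattering model is NOT reproduced.
import numpy as np

def ergodic_mi(nt=4, nr=4, snr=10.0, trials=2000, seed=0):
    """snr is linear (10.0 corresponds to 10 dB); returns bits/s/Hz."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(trials):
        H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        G = np.eye(nr) + (snr / nt) * (H @ H.conj().T)
        total += np.log2(np.linalg.det(G).real)
    return total / trials

# Example: ergodic_mi(nt=4, nr=4, snr=10.0)
```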

2012
ELkebir Sarhrouni, Ahmed Hammouch, Driss Aboutajdine

Remote sensing is a technology for acquiring data about distant objects, which is needed to construct knowledge models for applications such as classification. Recently, Hyperspectral Images (HSI) have become a powerful technical tool whose main goal is to classify the points of a region. An HSI consists of more than a hundred bidimensional measures, called bands (or simply images), of the same region, called the Ground Tru...
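
The excerpt breaks off before the selection criterion is stated; a minimal sketch of the general approach it points toward, ranking spectral bands by their mutual information with the ground-truth class map, might look as follows (array shapes, the binning, and the use of scikit-learn are assumptions).

```python
# Assumed sketch: rank hyperspectral bands by MI with the ground truth.
# cube: H x W x n_bands reflectance values, gt: H x W integer class labels.
import numpy as np
from sklearn.metrics import mutual_info_score

def rank_bands_by_mi(cube, gt, n_bins=32):
    labels = gt.ravel()
    scores = []
    for b in range(cube.shape[2]):
        band = cube[:, :, b].ravel()
        edges = np.histogram_bin_edges(band, bins=n_bins)
        scores.append((b, mutual_info_score(labels, np.digitize(band, edges))))
    return sorted(scores, key=lambda t: t[1], reverse=True)

# Example: selected = [b for b, _ in rank_bands_by_mi(cube, gt)[:20]]
```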

2006
Girish Gopalakrishnan, S. V. Bharath Kumar, Rakesh Mullick, Ajay Narayanan, Srikanth Suryanarayanan

In this paper, we present a framework that one could use to set optimized parameter values while performing image registration using mutual information as the metric to be maximized. Our experiment details these steps for the registration of X-ray Computed Tomography (CT) images with Positron Emission Tomography (PET) images. Selection of the different parameters that influence the mutual informatio...
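
The parameter-selection steps themselves are truncated, but the metric such a framework maximizes can be written compactly. Below is a minimal, hedged sketch of histogram-based mutual information between a fixed and a moving image, searched over integer x-translations; the function names and the brute-force search are illustrative, not the paper's procedure.

```python
# Illustrative sketch: joint-histogram mutual information between two images,
# maximized over a 1-D integer translation (not the paper's optimizer).
import numpy as np

def mutual_information(a, b, bins=32):
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def best_x_shift(fixed, moving, max_shift=10):
    shifts = range(-max_shift, max_shift + 1)
    return max(shifts, key=lambda s: mutual_information(fixed, np.roll(moving, s, axis=1)))
```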
