Search results for: mutual information theory mi

Number of results: 1,876,105

Journal: Entropy, 2013
Masashi Sugiyama

Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to solving various machine learning problems. Recently, an alternative to MI called squared-loss MI (SMI) was introduced. While ordinary MI is the Kullback–Leibler divergence from the joint distribution to the product of the marginal distributions, SMI is its P...
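The abstract contrasts ordinary MI (a Kullback–Leibler divergence between the joint distribution and the product of the marginals) with squared-loss MI (a Pearson divergence). As a minimal sketch, assuming a discrete joint probability table rather than the density-ratio estimators used in the SMI literature, both quantities can be computed directly:

```python
import numpy as np

def mi_and_smi(p_xy):
    """Compute ordinary MI (KL divergence from joint to product of marginals)
    and squared-loss MI (Pearson divergence) for a discrete joint table."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_xy = p_xy / p_xy.sum()                  # normalize to a probability table
    p_x = p_xy.sum(axis=1, keepdims=True)     # marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)     # marginal of Y
    prod = p_x * p_y                          # product of marginals
    ratio = np.divide(p_xy, prod, out=np.ones_like(p_xy), where=prod > 0)
    log_ratio = np.log(ratio, out=np.zeros_like(ratio), where=ratio > 0)
    mi = np.sum(p_xy * log_ratio)             # KL divergence
    smi = 0.5 * np.sum(prod * (ratio - 1.0) ** 2)  # half the Pearson chi-squared divergence
    return float(mi), float(smi)

# For independent variables the joint equals the product of marginals,
# so both divergences vanish.
joint_indep = np.outer([0.3, 0.7], [0.5, 0.5])
print(mi_and_smi(joint_indep))  # -> (0.0, 0.0)
```

Both measures are zero exactly when the variables are independent; they differ in which divergence penalizes the departure of the density ratio from one.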

Journal: Entropy, 2014
Henrique Tomaz Amaral-Silva, Lauro Wichert-Ana, Luiz Otávio Murta, Larissa Romualdo-Suzuki, Emerson Itikawa, Geraldo Filho Bussato, Paulo Mazzoncini de Azevedo Marques

Neuroimage registration has an important role in clinical (for both diagnostic and therapeutic purposes) and research applications. In this article we describe the applicability of Tsallis Entropy as a new cost function for neuroimage registration through a comparative analysis based on the performance of the traditional approaches (correlation based: Entropy Correlation Coefficient (ECC) and N...

Journal: Journal of Medical Signals and Sensors

Medical image registration methods which use mutual information as a similarity measure have been improved in recent decades. Mutual information is a basic concept of information theory which indicates the dependency of two random variables (or two images). In order to evaluate the mutual information of two images, their joint probability distribution is required. Several interpolation methods, su...
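The joint probability distribution the abstract mentions is, in practice, usually estimated from a joint intensity histogram of the two images. A minimal plug-in sketch (the bin count of 32 is an illustrative assumption, not taken from the paper):

```python
import numpy as np

def image_mutual_information(img_a, img_b, bins=32):
    """Estimate MI between two equally-sized images from their joint
    intensity histogram (a simple plug-in estimator)."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = joint / joint.sum()                # joint probability table
    p_a = p_ab.sum(axis=1, keepdims=True)     # marginal of image A intensities
    p_b = p_ab.sum(axis=0, keepdims=True)     # marginal of image B intensities
    mask = p_ab > 0                           # avoid log(0) on empty bins
    return float(np.sum(p_ab[mask] * np.log(p_ab[mask] / (p_a * p_b)[mask])))

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64)).astype(float)
shuffled = rng.permutation(img.ravel()).reshape(64, 64)
# A perfectly aligned pair shares far more information than a scrambled one.
print(image_mutual_information(img, img) > image_mutual_information(img, shuffled))
```

Registration methods maximize this quantity over candidate transformations; the interpolation schemes the abstract goes on to discuss affect how the joint histogram is filled for non-integer voxel positions.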

Journal: Proceedings of the National Academy of Sciences of the United States of America, 2016
Juan Zhao, Yiwei Zhou, Xiujun Zhang, Luonan Chen

Quantitatively identifying direct dependencies between variables is an important task in data analysis, in particular for reconstructing various types of networks and causal relations in science and engineering. One of the most widely used criteria is partial correlation, but it can only measure linear direct associations and misses nonlinear ones. However, based on conditional independe...
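The limitation the abstract points at can be demonstrated concretely. A minimal sketch, assuming the standard residual-based definition of partial correlation: regress each variable on the conditioning set and correlate the residuals. A purely nonlinear dependence then goes undetected:

```python
import numpy as np

def partial_corr(x, y, z):
    """Partial correlation of x and y given z: the Pearson correlation
    of the residuals after linearly regressing each on z."""
    Z = np.column_stack([np.ones_like(z), z])        # design matrix with intercept
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

rng = np.random.default_rng(1)
z = rng.normal(size=2000)
x = rng.normal(size=2000)
y = x ** 2 + 0.1 * rng.normal(size=2000)  # y depends on x, but nonlinearly
print(partial_corr(x, y, z))              # near 0 despite the strong dependence
```

Because Cov(x, x²) = 0 for a symmetric x, the partial correlation stays near zero even though y is almost a deterministic function of x; an MI-based or conditional-independence criterion would flag this association.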

Journal: Neuroscience Letters, 2007
Mandar S Jog, Dorian Aur, Christopher I Connolly

Learning is important for humans and can be disrupted by disease. However, the essence of how learning may be represented within a neuronal network is still elusive. Spike trains generated by neurons have been demonstrated to carry information which is relevant for learning. The present study uses well-established mutual information (MI) analysis techniques to better understand learning within ...

2005
Hongxing He, Huidong Jin, Jie Chen

For classification of health data, we propose in this paper a fast and accurate feature selection method, FIEBIT (Feature Inclusion and Exclusion Based on Information Theory). FIEBIT selects the most relevant and non-redundant features using Conditional Mutual Information (CMI) while excluding irrelevant and redundant features according to the comparison among Individual Symmetrical Uncertainty...
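FIEBIT's exact inclusion/exclusion procedure is not reproduced here; as a minimal sketch of the underlying quantities, the following plug-in estimators compute MI and conditional MI for discrete features, and illustrate why conditioning matters: in an XOR-like target, a feature can look irrelevant marginally yet be decisive given another feature.

```python
import numpy as np
from collections import Counter

def mi_discrete(x, y):
    """Plug-in MI estimate for two discrete sequences."""
    n = len(x)
    pxy, px, py = Counter(zip(x, y)), Counter(x), Counter(y)
    return sum(c / n * np.log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def cmi_discrete(x, y, z):
    """Conditional MI I(X;Y|Z) = sum over z of p(z) * I(X;Y | Z=z)."""
    n = len(z)
    total = 0.0
    for zv, cz in Counter(z).items():
        idx = [i for i in range(n) if z[i] == zv]
        total += cz / n * mi_discrete([x[i] for i in idx], [y[i] for i in idx])
    return total

rng = np.random.default_rng(2)
x1 = rng.integers(0, 2, 4000)
x2 = rng.integers(0, 2, 4000)
y = x1 ^ x2                      # target is XOR of the two features
# Marginally x2 carries no information about y, but given x1 it determines y.
print(mi_discrete(x2, y), cmi_discrete(list(x2), list(y), list(x1)))
```

This is the effect CMI-based selectors exploit: a greedy criterion that scores candidates by I(feature; class | already-selected features) can pick up x2 where a marginal-MI filter would discard it.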

Journal: J. Phonetics, 2012
Martijn Wieling, Eliza Margaretha, John Nerbonne

Structuralists famously observed that language is “un système où tout se tient” (Meillet, 1903, p. 407), insisting that the system of relations of linguistic units was more important than their concrete content. This study attempts to derive content from relations, in particular phonetic (acoustic) content from the distribution of alternative pronunciations used in different geographical variet...

Journal: Entropy, 2013
Víctor Serrano-Solís, Marco V. José

The hypothesis that Mutual Information (MI) dendrograms of influenza A viruses reflect informational groups generated during viral evolutionary processes is put forward. Phylogenetic reconstructions are used for guidance and validation of MI dendrograms. It is found that MI profiles display an oscillatory behavior for each of the eight RNA segments of influenza A. It is shown that dendrograms o...

2013
Uwe D. Reichel

Established phonological theories postulate uniform syllable constituent structures. From a traditional hierarchical point of view, syllables are right branching implying a close connection between the nucleus and the coda. Articulatory Phonology in contrast suggests a stronger cohesion between onsets and nuclei than between nuclei and codas. This claim is empirically supported by the c-center ...

Journal: Processes, 2022

Mutual information (MI) has been widely used for association mining in complex chemical processes, but how to precisely estimate MI between variables of different numerical types, discriminate their relationships with targets, and finally achieve compact, interpretable prediction has not been discussed in detail, which may limit more complicated industrial applications. Therefore, this paper first reviews th...

[Chart: number of search results per year]