Search results for: information entropy

Number of results: 1,203,337

Journal: Int. J. Semantic Computing 2013
David Ellerman

The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite ...
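
The quantity in this abstract has a direct finite formulation. The sketch below (the function name logical_entropy and the toy partition are ours, not the paper's) counts distinctions, ordered pairs of elements lying in distinct blocks, and normalizes by |U|^2, which agrees with the equivalent form 1 - sum((|B|/|U|)^2):

```python
from itertools import product

def logical_entropy(blocks):
    """Logical entropy of a partition: |dit(pi)| / |U|^2,
    where dit(pi) is the set of ordered pairs in distinct blocks."""
    universe = set().union(*blocks)
    n = len(universe)
    block_of = {x: i for i, block in enumerate(blocks) for x in block}
    distinctions = sum(1 for u, v in product(universe, repeat=2)
                       if block_of[u] != block_of[v])
    return distinctions / n ** 2

# Toy example: partition {a, b} | {c} of a three-element set.
blocks = [{"a", "b"}, {"c"}]
print(logical_entropy(blocks))                      # 4/9 ≈ 0.444
print(1 - sum((len(b) / 3) ** 2 for b in blocks))   # same value
```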

Journal: Trans. Rough Sets 2009
Daniela Bianucci, Gianpiero Cattaneo

Some approaches to covering information entropy and some definitions of orderings and quasi-orderings of coverings will be described, generalizing the case of partition entropy and ordering. The aim is to extend to coverings the general result of anti-tonicity (strictly decreasing monotonicity) of partition entropy. In particular, an entropy in the case of incomplete information systems i...
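
The anti-tonicity mentioned here is easiest to see in the plain partition case (the covering and incomplete-system generalizations of the paper are not reproduced): merging blocks, i.e. passing to a coarser partition, can only lower the Shannon partition entropy. A minimal sketch with our own helper name:

```python
import math

def partition_entropy(blocks, n):
    """Shannon entropy H(pi) = -sum (|B|/n) * log2(|B|/n) of a partition of an n-element set."""
    return -sum((len(b) / n) * math.log2(len(b) / n) for b in blocks)

U = set(range(6))
fine   = [{0, 1}, {2, 3}, {4, 5}]     # finer partition
coarse = [{0, 1}, {2, 3, 4, 5}]       # merge two blocks -> coarser partition

print(partition_entropy(fine, len(U)))     # log2(3) ≈ 1.585
print(partition_entropy(coarse, len(U)))   # ≈ 0.918, never larger than the finer one
```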

Journal: CoRR 2009
Fabio G. Guerrero

A simple method for finding the entropy and redundancy of a reasonably long sample of English text by direct computer processing and from first principles according to Shannon theory is presented. As an example, results on the entropy of the English language have been obtained based on a total of 20.3 million characters of written English, considering symbols from one to five hundred characters...
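
A per-character version of that first-principles computation fits in a few lines. The sketch below is a hedged illustration only (the sample string, block length, and function name are ours; it does not reproduce the paper's 20.3-million-character corpus or its long symbol blocks):

```python
import math
from collections import Counter

def block_entropy(text, block_len=1):
    """Shannon entropy per character (bits) estimated from blocks of block_len characters."""
    blocks = [text[i:i + block_len] for i in range(len(text) - block_len + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    h_block = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return h_block / block_len

sample = "the quick brown fox jumps over the lazy dog " * 50
h1 = block_entropy(sample, 1)
print(f"H1 ≈ {h1:.3f} bits/char")

# Redundancy relative to log2 of the alphabet size actually used in the sample.
alphabet = len(set(sample))
print(f"redundancy ≈ {1 - h1 / math.log2(alphabet):.2%}")
```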

Journal: CoRR 2013
David Ellerman

The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite ...

2000
M. S. Głowacki

In the contemporary world, information theory is more and more widely used, which is why its elements should be included in physics education at the higher level, especially as far as the connection between this theory and physics measurements is concerned. The following paper consists of methodological propositions for those elements. The information theory equations presented in the first...

Journal: CoRR 2008
Jian Ma, Zengqi Sun

In information theory, mutual information (MI) is treated as a concept distinct from entropy.[1] In this paper, we prove with copulas [2] that they are essentially the same: mutual information is also a kind of entropy, called copula entropy. Based on this insightful result, we propose a simple method for estimating mutual information. Copula theory concerns dependence and the measurement of association.[2] Skla...
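
The copula-based estimator itself is beyond a snippet, but the underlying claim that mutual information is expressible through entropies can be checked directly with the standard identity I(X;Y) = H(X) + H(Y) - H(X,Y). The sketch below is our own plug-in estimate on discrete samples, not the paper's method:

```python
import math
from collections import Counter

def entropy(samples):
    """Plug-in Shannon entropy (bits) of a sequence of hashable outcomes."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) estimated from paired discrete samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# X uniform on {0, 1}, Y identical to X (fully dependent): I(X;Y) = 1 bit.
xs = [0, 1] * 500
ys = xs[:]
print(mutual_information(xs, ys))   # ≈ 1.0
```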

Journal: Entropy 2011
John C. Baez, Tobias Fritz, Tom Leinster

There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the “information loss”, or change in entropy, associated with a measure-preser...
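
Concretely, for a map f between finite probability spaces that pushes a distribution p forward to f*p, the information loss discussed here is H(p) minus the entropy of that pushforward. A minimal sketch (function names and the toy distribution are ours):

```python
import math
from collections import defaultdict

def shannon_entropy(probs):
    """H(p) = -sum p_i * log2(p_i) for a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def information_loss(p, f):
    """Entropy lost under f, which maps indices of p to indices of the pushforward f*p."""
    pushforward = defaultdict(float)
    for i, pi in enumerate(p):
        pushforward[f(i)] += pi
    return shannon_entropy(p) - shannon_entropy(pushforward.values())

# Merging two equally likely outcomes loses exactly the entropy that told them apart.
p = [0.25, 0.25, 0.5]
f = lambda i: 0 if i < 2 else 1    # outcomes 0 and 1 become indistinguishable
print(information_loss(p, f))       # 0.5 bits
```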

Journal: Entropy 2013
Jianbo Gao, Feiyan Liu, Jianfang Zhang, Jing Hu, Yinhe Cao

What is information? What role does information entropy play in this age of exploding information, especially in understanding emergent behaviors of complex systems? To answer these questions, we discuss the origin of information entropy, the difference between information entropy and thermodynamic entropy, and the role of information entropy in complexity theories, including chaos theory and fractal ...

2012
Yong Wang, Huadeng Wang, Qiong Cao

The limitations of Shannon information theory are pointed out from new perspectives. These limitations mainly lie in the neglect of information reliability and completeness. The significance of information reliability for information measurement is further illustrated through example analysis. It is pointed out that such limitations originate from the neglect of multilevel informati...

1992

Neural encoding and decoding focus on the question: "What does the response of a neuron tell us about a stimulus?" In this chapter we consider a related but different question: "How much does the neural response tell us about a stimulus?" The techniques of information theory allow us to answer this question in a quantitative manner. Furthermore, we can use them to ask what forms of neural r...
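
The "how much" question is usually answered with the mutual information between stimulus and response, i.e. the reduction in stimulus uncertainty once the response is known. The sketch below uses a made-up joint table (all labels and probabilities are hypothetical, not from the chapter):

```python
import math

def H(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution P(stimulus, response) for 2 stimuli x 2 spike-count bins.
joint = {("left", "low"): 0.4, ("left", "high"): 0.1,
         ("right", "low"): 0.1, ("right", "high"): 0.4}

p_stim, p_resp = {}, {}
for (s, r), p in joint.items():
    p_stim[s] = p_stim.get(s, 0.0) + p
    p_resp[r] = p_resp.get(r, 0.0) + p

# I(S;R) = H(S) + H(R) - H(S,R): how many bits the response carries about the stimulus.
info = H(p_stim.values()) + H(p_resp.values()) - H(joint.values())
print(f"I(S;R) ≈ {info:.3f} bits")   # ≈ 0.278 bits out of 1 bit of stimulus entropy
```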

[Chart: number of search results per year]