Search results for: information entropy

Number of results: 1,203,337

In this paper, at first, a history of mathematical models is given. Next, some basic information about random variables, stochastic processes, and Markov chains is introduced. Then, the entropy of a discrete-time Markov process is presented. After that, the entropy of SIS stochastic models is computed, and it is proved that an epidemic will disappear after a long time.
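
As background for the entropy of a discrete-time Markov process: an ergodic chain with transition matrix P and stationary distribution π has entropy rate H = -Σ_i π_i Σ_j P_ij log P_ij. A minimal Python sketch follows; the two-state transition matrix is a made-up illustration, not the paper's SIS model:

```python
import numpy as np

def markov_entropy_rate(P):
    """Entropy rate H = -sum_i pi_i sum_j P_ij log2 P_ij of an ergodic chain."""
    # Stationary distribution pi: left eigenvector of P for eigenvalue 1.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    pi = pi / pi.sum()
    # 0 * log 0 is treated as 0, as usual in information theory.
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log2(P), 0.0)
    return float(-np.sum(pi[:, None] * P * logP))

# Made-up two-state chain (illustration only, not the paper's SIS model).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(markov_entropy_rate(P))  # bits per step; ~0.55 for this chain
```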

In the present study, an analytical investigation of entropy generation is carried out for viscoelastic fluid flow over a stretching sheet, involving an inclined magnetic field, non-linear thermal radiation, and a heat source/sink. The governing boundary-layer partial differential equations were converted, via appropriate similarity transformations, into non-line...
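
The abstract is truncated before the entropy generation expression appears. For orientation only, a textbook form of the local (volumetric) entropy generation for MHD boundary-layer flow of a Newtonian fluid is sketched below; the paper's actual expression for a viscoelastic fluid with non-linear radiation and a heat source/sink will contain additional terms:

```latex
% Textbook local entropy generation for MHD boundary-layer flow (Newtonian case):
% heat-transfer + fluid-friction + Joule-heating irreversibilities.
S'''_{\mathrm{gen}} =
  \frac{k}{T_\infty^{2}} \left(\frac{\partial T}{\partial y}\right)^{2}
  + \frac{\mu}{T_\infty} \left(\frac{\partial u}{\partial y}\right)^{2}
  + \frac{\sigma B_0^{2}}{T_\infty}\, u^{2} .
```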

The regional evaluation of monitoring stations for water resources is important for finding appropriate station locations, maximizing the collection of useful information, preventing the accumulation of redundant information, and ultimately reducing the cost of data collection. Based on the theory of discrete entropy, this study analyzes the density of rain gag...
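
Entropy-based network evaluation of this kind typically scores a station by the marginal entropy of its record and discounts redundancy between nearby stations via mutual information. A hedged Python sketch with synthetic data; the gamma-distributed series below is invented, not the study's rainfall record:

```python
import numpy as np

def entropy_bits(counts):
    """Shannon entropy (bits) of the distribution given by nonnegative counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=5):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from binned paired records."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    return (entropy_bits(joint.sum(axis=1)) + entropy_bits(joint.sum(axis=0))
            - entropy_bits(joint.ravel()))

# Invented rainfall-like records at two nearby stations (not the study's data).
rng = np.random.default_rng(0)
a = rng.gamma(2.0, 10.0, size=500)
b = 0.7 * a + rng.normal(0.0, 5.0, size=500)  # strongly related neighbour
print(mutual_information(a, b))  # large shared information -> redundant station
```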

Journal: Entropy 2017
Jung In Seo, Yongku Kim

Abstract: In this paper, we provide an entropy inference method based on an objective Bayesian approach for upper record values from a two-parameter logistic distribution. We derive the entropy based on the i-th upper record value and the joint entropy based on the upper record values. Moreover, we examine their properties. For objective Bayesian analysis, we obtain ob...
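
For reference, the differential entropy of the two-parameter logistic distribution mentioned above has a simple closed form; this is a standard result, independent of the paper's record-value analysis:

```latex
% Differential entropy of X ~ Logistic(mu, s): a standard closed form (in nats).
f(x) = \frac{e^{-(x-\mu)/s}}{s\left(1 + e^{-(x-\mu)/s}\right)^{2}},
\qquad
H(X) = -\int_{-\infty}^{\infty} f(x)\,\ln f(x)\,dx = \ln s + 2 .
```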

Journal: Inf. Sci. 2003
Luc Knockaert

Rényi entropies are compared to generalized log-Fisher information and variational entropies in the context of translation, scale and concentration invariance. It is proved that the Rényi entropies occupy a special place amongst these entropies. It is also shown that Shannon entropy is centrally positioned amidst the Rényi entropies.
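
For concreteness, the Rényi family compared above is defined, for a discrete distribution p and order α, as follows; Shannon entropy is its α → 1 limit, which is the sense in which it is centrally positioned:

```latex
% Renyi entropy of order alpha and its Shannon limit:
H_{\alpha}(p) = \frac{1}{1-\alpha}\,\ln\!\left(\sum_{i} p_i^{\alpha}\right),
\qquad \alpha > 0,\ \alpha \neq 1,
\qquad
\lim_{\alpha\to 1} H_{\alpha}(p) = -\sum_{i} p_i \ln p_i .
```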

2014
Yi-Ling Chung, Chung-Ching Wang, Hsueh-Chih Chen, Jon-Fan Hu

One of the obstacles to fully capturing the semantic content of words is how to recover the meaning of a word from the various probabilities with which it associates with other words. According to Shannon's (1948) information theory, entropy indicates the amount of information and the extent of uncertainty in a given variable, calculated from the probability distribution of event occurrences. Therefore, en...
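
The quantity invoked here is ordinary Shannon (1948) entropy over a word's association probabilities; a minimal Python sketch with invented probabilities:

```python
import math

def shannon_entropy(probs):
    """H(p) = -sum_i p_i log2 p_i in bits; zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Invented association probabilities of a cue word with four responses.
flat = [0.25, 0.25, 0.25, 0.25]    # meaning maximally uncertain
peaked = [0.85, 0.05, 0.05, 0.05]  # one dominant associate
print(shannon_entropy(flat))    # 2.0 bits
print(shannon_entropy(peaked))  # ~0.85 bits
```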

Journal: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics 2002
Velimir M. Ilic, Miomir S. Stankovic

The form invariance of pseudoadditivity is shown to determine the structure of nonextensive entropies. Nonextensive entropy is defined as the appropriate expectation value of nonextensive information content, similar to the definition of Shannon entropy. Information content in a nonextensive system is obtained uniquely from generalized axioms by replacing the usual additivity with pseudoadditiv...
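
The pseudoadditivity referred to is the Tsallis composition rule: with entropic index q (and k = 1), the nonextensive entropy of independent subsystems A and B composes as below, with ordinary additivity and Shannon entropy recovered as q → 1:

```latex
% Tsallis entropy of index q (k = 1) and pseudoadditivity for independent A, B:
S_q(p) = \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
S_q(A, B) = S_q(A) + S_q(B) + (1 - q)\,S_q(A)\,S_q(B).
```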

Journal: Entropy 2016
Dayi He, Jiaqiang Xu, Xiaoling Chen

Weight aggregation is the key process in solving a multiple-attribute group decision-making (MAGDM) problem. This paper proposes an approach to objectivize subjective information and to aggregate information from both the attribute values themselves and the decision-makers' judgments. An MAGDM problem without information about the decision-makers' and attributes' weights is considered. In order ...
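
One standard way to obtain objective weights from the attribute values themselves is the entropy weight method, in which attributes with more dispersed values (lower normalized entropy) receive larger weights. A hedged Python sketch; the decision matrix is invented, and the paper's actual aggregation may differ:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: more dispersed attributes receive larger weights."""
    m, _ = X.shape
    P = X / X.sum(axis=0)                    # column-normalize to probabilities
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logP).sum(axis=0) / np.log(m)  # normalized entropy per attribute
    d = 1.0 - e                              # degree of diversification
    return d / d.sum()

# Invented decision matrix: 4 alternatives x 3 benefit-type attributes.
X = np.array([[7.0, 0.60, 120.0],
              [9.0, 0.55, 110.0],
              [6.0, 0.80, 130.0],
              [8.0, 0.62, 125.0]])
w = entropy_weights(X)
print(w)      # objective attribute weights summing to 1
print(X @ w)  # simple weighted score of each alternative
```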

Journal: CoRR 2015
Giancarlo Pastor, Inmaculada Mora-Jiménez, Riku Jäntti, Antonio J. Caamaño

Sparsity and entropy are pillar notions of modern theories in signal processing and information theory. However, there is no clear consensus among scientists on the characterization of these notions. Previous efforts have contributed to understanding sparsity and entropy individually, from specific research interests. This paper proposes a mathematical formalism, a joint axiomatic characterization, ...

Journal: Entropy 2008
Imre Csiszár

Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, ..., N} representable by joint entropies of components of an N-dimensional rand...
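
For reference, the two central measures surveyed are, for discrete distributions P = (p_1, ..., p_N) and Q = (q_1, ..., q_N) on the same finite set:

```latex
% Shannon entropy and Kullback I-divergence (relative entropy):
H(P) = -\sum_{i=1}^{N} p_i \log p_i,
\qquad
D(P \,\|\, Q) = \sum_{i=1}^{N} p_i \log \frac{p_i}{q_i} \;\ge\; 0 .
```

The divergence vanishes precisely when P = Q, which is what makes it usable as a measure of discrimination information.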

[Chart: number of search results per year]