Search results for: shannon entropy numerical simulation

Number of results: 876,050

2003
Winston Wang

The flow fields in polymer processing exhibit complex behavior with chaotic characteristics, due in part to the non-linearity of the field equations describing them. In chaotic flows fluid elements are highly sensitive to their initial positions and velocities. A fundamental understanding of such characteristics is essential for optimization and design of equipment used for distributive mixing....

Selecting appropriate inputs for intelligent models is important because it reduces cost, saves time, and increases model accuracy and efficiency. The purpose of this study is to use Shannon entropy to select the optimum combination of input variables in time series modeling. Monthly time series of precipitation, temperature, and radiation over the period 1982-2010 were used from the Tabriz synoptic ...
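As a minimal sketch of the entropy-based input-selection idea (not the paper's implementation; the binning scheme and bin count are assumptions), one can discretize a candidate input series into equal-width bins and score it by the Shannon entropy of the resulting histogram:

```python
import math
from collections import Counter

def shannon_entropy(series, n_bins=10):
    """Shannon entropy (bits) of a series discretized into equal-width bins."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_bins or 1.0          # guard against a constant series
    bins = Counter(min(int((x - lo) / width), n_bins - 1) for x in series)
    n = len(series)
    return -sum((c / n) * math.log2(c / n) for c in bins.values())

# A constant series carries no information; a spread-out one carries more.
flat = [20.0] * 100
varied = [float(i % 10) for i in range(100)]
assert shannon_entropy(flat) < shannon_entropy(varied)
```

Candidate inputs (precipitation, temperature, radiation) could then be ranked by this score before being combined, though the paper's actual selection rule is not given in this excerpt.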

Journal: :CoRR 2007
Alexander I. Aptekarev Jesús Sánchez-Dehesa Andrei Martínez-Finkelshtein R. J. Yáñez

Let $p_n$, $n \in \mathbb{N}$, be the $n$th orthonormal polynomial on $\mathbb{R}$, whose zeros are $\lambda_j^{(n)}$, $j = 1, \ldots, n$. Then for each $j = 1, \ldots, n$,
$$\vec{\Psi}_j \stackrel{\text{def}}{=} \left( \Psi_{1j}, \ldots, \Psi_{nj} \right), \qquad \Psi_{ij} = p_{i-1}^2\big(\lambda_j^{(n)}\big) \left( \sum_{k=0}^{n-1} p_k^2\big(\lambda_j^{(n)}\big) \right)^{-1}, \quad i = 1, \ldots, n,$$
defines a discrete probability distribution. The Shannon entropy of the sequence $\{p_n\}$ is consequently defined as
$$S_{n,j} \stackrel{\text{def}}{=} -\sum_{i=1}^{n} \Psi_{ij} \log \Psi_{ij}.$$
In t...
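As an illustrative check of the definitions above (not code from the paper), here is a sketch that builds $\vec{\Psi}_j$ for the orthonormal Chebyshev polynomials of the first kind, whose zeros and normalization are known in closed form, and evaluates $S_{n,j}$:

```python
import math

def chebyshev_T(k, x):
    """Chebyshev polynomial T_k(x) via the three-term recurrence."""
    t0, t1 = 1.0, x
    if k == 0:
        return t0
    for _ in range(k - 1):
        t0, t1 = t1, 2 * x * t1 - t0
    return t1

def psi_distribution(n, j):
    """Discrete distribution (Psi_1j, ..., Psi_nj) at the j-th zero of T_n."""
    lam = math.cos((2 * j - 1) * math.pi / (2 * n))   # j-th zero of T_n
    # Orthonormal version: p_0 = 1, p_k = sqrt(2) * T_k for k >= 1,
    # so p_k^2 = 2 * T_k^2 for k >= 1.
    vals = [(1.0 if i == 0 else 2.0) * chebyshev_T(i, lam) ** 2 for i in range(n)]
    total = sum(vals)
    return [v / total for v in vals]

def entropy(dist):
    return -sum(p * math.log(p) for p in dist if p > 0)

n = 8
for j in range(1, n + 1):
    psi = psi_distribution(n, j)
    assert abs(sum(psi) - 1.0) < 1e-12      # it really is a probability distribution
    print(f"S_{n},{j} = {entropy(psi):.4f}")
```

The choice of Chebyshev polynomials is only for concreteness; the construction applies to any orthonormal family.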

Journal: :Open Syst. Inform. Dynam. 2008
Grzegorz Haranczyk Wojciech Slomczynski Tomasz Zastawniak

The notion of utility maximising entropy (u-entropy) of a probability density, which was introduced and studied in [SZ04], is extended in two directions. First, the relative u-entropy of two probability measures in arbitrary probability spaces is defined. Then, specialising to discrete probability spaces, we also introduce the absolute u-entropy of a probability measure. Both notions are based ...

Golestanian, Heidari, Homaei

A new method based on principal component analysis (PCA) and artificial neural networks (ANN) is proposed for fault diagnosis of gearboxes. First, six different base wavelets are considered, of which three are real-valued and the other three complex-valued. Two wavelet selection criteria, the Maximum Energy to Shannon Entropy ratio and the Maximum Relative Wavelet Energy, are used and compared...
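The Maximum Energy to Shannon Entropy criterion can be sketched in a few lines. This is a minimal illustration on a plain list of wavelet coefficients, not the paper's pipeline (the actual wavelet decomposition, e.g. via a wavelet library, is omitted):

```python
import math

def energy_entropy_ratio(coeffs):
    """Energy-to-Shannon-entropy ratio of one set of wavelet coefficients:
    total energy divided by the Shannon entropy of the normalized energy
    distribution. A higher ratio suggests a better-matching base wavelet."""
    energies = [c * c for c in coeffs]
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    entropy = -sum(p * math.log(p) for p in probs)
    return total / entropy if entropy > 0 else float("inf")

# Energy concentrated in few coefficients -> low entropy -> high ratio.
concentrated = [10.0, 0.1, 0.1, 0.1]
spread = [5.0, 5.0, 5.0, 5.0]
assert energy_entropy_ratio(concentrated) > energy_entropy_ratio(spread)
```

In practice the ratio would be computed per candidate wavelet on the same signal, and the wavelet with the largest value retained.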

Journal: :Entropy 2013
Rudolf Hanel Stefan Thurner

Complex systems are often inherently non-ergodic and non-Markovian and Shannon entropy loses its applicability. Accelerating, path-dependent and aging random walks offer an intuitive picture for non-ergodic and non-Markovian systems. It was shown that the entropy of non-ergodic systems can still be derived from three of the Shannon–Khinchin axioms and by violating the fourth, the so-called comp...

In this paper, we first derive a family of maximum Tsallis entropy distributions under optional side conditions on the mean income and the Gini index. Furthermore, corresponding to these distributions, a family of Lorenz curves compatible with the optional side conditions is generated. We also show that our results reduce to Shannon entropy as $\beta$ tends to one. Finally, by using ac...
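The limit claim can be checked numerically. A minimal sketch (writing the Tsallis parameter as q, corresponding to the abstract's $\beta$):

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), for q != 1."""
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

def shannon_entropy(probs):
    """Shannon entropy in nats, the q -> 1 limit of S_q."""
    return -sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.3, 0.2]
for q in (1.1, 1.01, 1.001):
    print(f"q = {q}: S_q = {tsallis_entropy(p, q):.6f}")
print(f"Shannon:  {shannon_entropy(p):.6f}")
```

As q approaches 1, S_q converges to the Shannon value, which is the reduction the abstract refers to.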

2007
Ambedkar Dukkipati Shalabh Bhatnagar M. Narasimha Murty

Shannon entropy of a probability measure $P$, defined as $-\int_X \frac{dP}{d\mu} \ln \frac{dP}{d\mu}\, d\mu$ on a measure space $(X, \mathfrak{M}, \mu)$, is not a natural extension from the discrete case. However, maximum entropy (ME) prescriptions of the Shannon entropy functional in the measure-theoretic case are consistent with those for the discrete case. It is also well known that Kullback-Leibler relative entropy can be extended natural...
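To make the "not a natural extension" point concrete (a sketch, not taken from the paper): with the counting measure the functional does reduce to the discrete Shannon entropy,
$$-\int_X \frac{dP}{d\mu}\ln\frac{dP}{d\mu}\,d\mu \;=\; -\sum_i p_i \ln p_i \qquad \left(\mu = \text{counting measure},\; \frac{dP}{d\mu}(i) = p_i\right),$$
but with the Lebesgue measure it becomes differential entropy, which can be negative. For example, for $P$ uniform on $[0, \tfrac{1}{2}]$ the density is $2$, and
$$-\int_0^{1/2} 2 \ln 2 \, dx = -\ln 2 < 0,$$
so the functional loses the nonnegativity that discrete Shannon entropy always has.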

Journal: :CoRR 2018
Chun-Wang Ma Yu-Gang Ma

The general idea of information entropy provided by C.E. Shannon "hangs over everything we do" and can be applied to a great variety of problems once the connection between a distribution and the quantities of interest is found. The Shannon information entropy essentially quantifies the information of a quantity with its specific distribution, for which information-entropy-based methods have ...

Journal: :CoRR 2011
John Scoville

A new approach to data compression is developed and applied to multimedia content. This method separates messages into components suitable for both lossless coding and 'lossy' or statistical coding techniques, compressing complex objects by separately encoding signals and noise. This is demonstrated by compressing the most significant bits of data exactly, since they are typically redundant and...
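The signal/noise separation can be illustrated with a toy bit-plane split (an assumption for illustration; the paper's exact decomposition is not given in this excerpt). The k most significant bits of each byte go to the exact, lossless path, while the noise-like low bits could be handed to a statistical coder:

```python
def split_bitplanes(data, k=4):
    """Split each byte into its k most significant bits (structured, suited
    to lossless coding) and the remaining low bits (noise-like, suited to
    statistical coding)."""
    shift = 8 - k
    msb = bytes(b >> shift for b in data)
    lsb = bytes(b & ((1 << shift) - 1) for b in data)
    return msb, lsb

def merge_bitplanes(msb, lsb, k=4):
    """Recombine the two planes into the original bytes."""
    shift = 8 - k
    return bytes((m << shift) | l for m, l in zip(msb, lsb))

sample = bytes([0x00, 0x7F, 0x80, 0xFF])
msb, lsb = split_bitplanes(sample)
assert merge_bitplanes(msb, lsb) == sample   # the split itself is lossless
```

The split itself loses nothing; compression gains would come from applying different coders to the two streams.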

[Chart: number of search results per year]
