Search results for: tsallis entropy
Number of results: 1401
We consider a small closed physical system. It has fixed energy, and if the energy is a power of its generalized coordinates, then the distribution of any single coordinate will follow a distribution that maximizes Tsallis entropy. Many aspects of entropy can be discussed for such small systems without going to the thermodynamic limit. By letting the number of degrees of freedom tend to infinit...
In this paper, we propose a new discriminative model named the nonextensive information theoretical machine (NITM), based on a nonextensive generalization of Shannon information theory. In NITM, weight parameters are treated as random variables. Tsallis divergence is used to regularize the distribution of the weight parameters, and the maximum unnormalized Tsallis entropy distribution is used to evaluate fitti...
The two-dimensional (2-D) maximum Tsallis entropy method often yields good segmentation results, because it not only exploits spatial-neighbourhood information through the 2-D histogram of the image, but also offers some flexibility through a tunable parameter. However, its time-consuming computation is often an obstacle in real-time applications. In this paper, a fast image thresholding me...
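As a rough illustration of the idea behind maximum Tsallis entropy thresholding, a 1-D simplification can be sketched as follows (the method in the abstract operates on the 2-D spatial-neighbourhood histogram; the function name, the choice q = 0.8, and the toy histogram here are all illustrative assumptions, not the paper's implementation):

```python
def tsallis_threshold(hist, q=0.8):
    """1-D maximum Tsallis entropy threshold selection (simplified sketch).

    For each candidate threshold t, compute the Tsallis entropies of the
    background (bins < t) and foreground (bins >= t) class distributions,
    combine them via the Tsallis pseudo-additivity rule, and keep the t
    that maximizes the combined entropy.
    """
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_s = 0, float("-inf")
    for t in range(1, len(p)):
        pa = sum(p[:t])
        pb = 1.0 - pa
        if pa <= 0 or pb <= 0:
            continue
        # Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) of each class
        sa = (1.0 - sum((pi / pa) ** q for pi in p[:t] if pi > 0)) / (q - 1.0)
        sb = (1.0 - sum((pi / pb) ** q for pi in p[t:] if pi > 0)) / (q - 1.0)
        # pseudo-additivity: S(A+B) = S(A) + S(B) + (1-q) S(A) S(B)
        s = sa + sb + (1.0 - q) * sa * sb
        if s > best_s:
            best_s, best_t = s, t
    return best_t

hist = [10, 20, 10, 0, 0, 0, 10, 20, 10]  # toy bimodal histogram
t = tsallis_threshold(hist)
```

For the toy bimodal histogram, the selected threshold falls in the empty valley between the two modes, as expected.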
The charged particle transverse momentum (pT) spectra measured by the ATLAS and CMS collaborations for proton–proton collisions at √s = 0.9 and 7 TeV have been studied using Tsallis thermodynamics. A thermodynamically consistent form of the Tsallis distribution is used for fitting the transverse momentum spectra at mid-rapidity. It is found that the fits based on the proposed distribution are...
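A thermodynamically consistent Tsallis spectrum is commonly written as dN/dpT ∝ pT mT [1 + (q−1) mT/T]^(−q/(q−1)), with mT = √(pT² + m²). A minimal sketch of this functional form, assuming illustrative parameter values (the normalization, temperature T, and q below are placeholders, not the fitted values from the abstract):

```python
import math

def tsallis_spectrum(pT, mass, T, q, norm=1.0):
    """Thermodynamically consistent Tsallis form (sketch):
       dN/dpT ∝ pT * mT * [1 + (q - 1) * mT / T]^(-q / (q - 1))
    where mT = sqrt(pT^2 + mass^2) is the transverse mass."""
    mT = math.sqrt(pT ** 2 + mass ** 2)
    return norm * pT * mT * (1.0 + (q - 1.0) * mT / T) ** (-q / (q - 1.0))

# Illustrative parameters: pion mass ~0.14 GeV, T ~ 0.07 GeV, q ~ 1.1
low = tsallis_spectrum(0.5, 0.14, 0.07, 1.1)
high = tsallis_spectrum(2.0, 0.14, 0.07, 1.1)
```

The power-law tail (rather than an exponential one) is what lets this form follow measured pT spectra out to high pT.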
...when two systems are under consideration. In classical information theory, one employs the Kullback–Leibler relative entropy for this purpose, which also has its quantum version. These are also additive measures, and the Tsallis counterparts of these have been put forward and employed in the quantum context as well [10, 11]. There is promise in future work using the Tsallis approach to problems...
Shannon entropy of a probability measure P, defined as −∫_X (dP/dμ) ln(dP/dμ) dμ on a measure space (X, M, μ), is not a natural extension from the discrete case. However, maximum entropy (ME) prescriptions of the Shannon entropy functional in the measure-theoretic case are consistent with those for the discrete case. Also, it is well known that Kullback-Leibler relative entropy can be extended natural...
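One way to see why the measure-theoretic formula is "not a natural extension" of the discrete case is that, unlike discrete Shannon entropy, it can be negative. A small numerical sketch (the Riemann-sum helper below is illustrative, taking μ to be Lebesgue measure on an interval):

```python
import math

def differential_entropy(f, a, b, n=100000):
    """Riemann-sum approximation of -∫_[a,b] f(x) ln f(x) dx,
    where f is a probability density with respect to Lebesgue measure."""
    dx = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * dx  # midpoint rule
        fx = f(x)
        if fx > 0:
            total -= fx * math.log(fx) * dx
    return total

h_wide = differential_entropy(lambda x: 0.5, 0.0, 2.0)    # uniform on [0, 2]
h_narrow = differential_entropy(lambda x: 2.0, 0.0, 0.5)  # uniform on [0, 1/2]
```

The uniform density on [0, 2] gives entropy ln 2 > 0, while the uniform density on [0, 1/2] gives −ln 2 < 0, something impossible for the discrete Shannon entropy.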
We have calculated the Tsallis entropy and Fisher information matrix (entropy) of spatially correlated nonextensive systems, by using an analytic non-Gaussian distribution obtained by the maximum entropy method. The effects of the correlated variability on the Fisher information matrix are shown to be different from those on the Tsallis entropy. The Fisher information is increased (decreased) b...
Convexity is a key concept in information theory, namely via the many implications of Jensen’s inequality, such as the non-negativity of the Kullback-Leibler divergence (KLD). Jensen’s inequality also underlies the concept of Jensen-Shannon divergence (JSD), which is a symmetrized and smoothed version of the KLD. This paper introduces new JSD-type divergences, by extending its two building bloc...
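The two building blocks mentioned above, the KLD and its symmetrized, smoothed JSD form, can be sketched for discrete distributions as follows (function names are illustrative; this is the standard classical construction, not the paper's new divergences):

```python
import math

def kld(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions,
    in nats. Non-negative by Jensen's inequality."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    """Jensen-Shannon divergence: average the KLDs of p and q against
    their midpoint m = (p + q) / 2. Symmetric and bounded by ln 2."""
    m = [(pi + qi) / 2.0 for pi, qi in zip(p, q)]
    return 0.5 * kld(p, m) + 0.5 * kld(q, m)

p = [0.5, 0.5]
r = [0.9, 0.1]
d = jsd(p, r)
```

Smoothing against the midpoint m is what keeps the JSD finite even when p and q have disjoint supports, a case where the KLD itself diverges.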
We used magnetohydrodynamic (MHD) simulations of interstellar turbulence to study the probability distribution functions (PDFs) of increments of density, velocity, and magnetic field. We found that the PDFs are well described by a Tsallis distribution, following the same general trends found in solar wind and Electron MHD studies. We found that the PDFs of density are different in subsonic and ...
Shannon entropy [1] is one of the fundamental quantities in classical information theory and is uniquely determined by the Shannon-Khinchin axioms or the Faddeev axiom. One-parameter extensions of Shannon entropy have been studied by many researchers; the Rényi entropy [2] and the Tsallis entropy [3] are well known. In the paper [4], the uniqueness theorem for the Tsallis entropy was proved. Also, in our...