Search results for: tsallis relative operator entropy

Number of results: 540977  

2008
A. Plastino O. A. Rosso

when two systems are under consideration. In classical information theory, one employs the Kullback–Leibler relative entropy for this purpose, which also has its quantum version. These are also additive measures and the Tsallis counterparts of these have been put forward and employed in the quantum context as well [10, 11]. There is promise in future work using the Tsallis approach to problems...
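The Tsallis counterpart of the Kullback–Leibler relative entropy mentioned in this snippet can be sketched numerically. The definitions below are the standard discrete forms (an illustrative sketch, not code from the cited works); the Tsallis relative entropy recovers the KL divergence in the limit q → 1:

```python
import numpy as np

def kl_divergence(p, q):
    """Classical Kullback-Leibler relative entropy D(p||q), in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

def tsallis_relative_entropy(p, q, a):
    """Discrete Tsallis relative entropy: (sum_i p_i^a q_i^(1-a) - 1)/(a - 1)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float((np.sum(p**a * q**(1.0 - a)) - 1.0) / (a - 1.0))

p = [0.6, 0.3, 0.1]
q = [0.4, 0.4, 0.2]
# Nonnegative for a > 0, and it approaches the KL divergence as a -> 1
assert tsallis_relative_entropy(p, q, 2.0) >= 0.0
assert abs(tsallis_relative_entropy(p, q, 1.0001) - kl_divergence(p, q)) < 1e-3
```

The distributions here are arbitrary examples; both quantities vanish exactly when p = q.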

2004
Jan Naudts

The present paper studies continuity of generalized entropy functions and relative entropies defined using the notion of a deformed logarithmic function. In particular, two distinct definitions of relative entropy are discussed. As an application, all considered entropies are shown to satisfy Lesche’s stability condition. The entropies of Tsallis’ nonextensive thermostatistics are taken as exam...
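The deformed logarithm underlying these generalized entropies can be sketched with the q-logarithm of Tsallis statistics (assumed here to be the deformation the snippet refers to); the Tsallis entropy is then the average deformed surprise, S_q = Σᵢ pᵢ ln_q(1/pᵢ):

```python
import numpy as np

def ln_q(x, q):
    """Deformed (q-)logarithm: (x**(1-q) - 1)/(1-q); reduces to ln as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.log(x)
    return (x**(1.0 - q) - 1.0) / (1.0 - q)

def tsallis_entropy(p, q):
    """Tsallis entropy via the deformed log: S_q(p) = sum_i p_i * ln_q(1/p_i)."""
    p = np.asarray(p, float)
    return float(np.sum(p * ln_q(1.0 / p, q)))

p = [0.5, 0.25, 0.25]
shannon = -sum(pi * np.log(pi) for pi in p)
assert abs(tsallis_entropy(p, 1.0) - shannon) < 1e-12  # q = 1 recovers Shannon
```

Like the ordinary logarithm, ln_q(1) = 0 for every q, which is what makes S_q vanish on deterministic distributions.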

Journal: :Entropy 2010
Amir Hossein Darooneh Ghassem Naeimi Ali Mehri Parvin Sadeghi

Non-extensive statistical mechanics appears as a powerful way to describe complex systems. Tsallis entropy, the main core of this theory, has remained an unproven assumption. Many people have tried to derive the Tsallis entropy axiomatically. Here we follow the work of Wang (EPJB, 2002) and use incomplete information theory to retrieve the Tsallis entropy. We change the incomplete in...

Journal: :Axioms 2017
Sonja Jäckle Karsten Keller

The Tsallis entropy given for a positive parameter α can be considered as a generalization of the classical Shannon entropy. For the latter, corresponding to α = 1, there exist many axiomatic characterizations. One of them based on the well-known Khinchin-Shannon axioms has been simplified several times and adapted to Tsallis entropy, where the axiom of (generalized) Shannon additivity is playi...
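The generalized Shannon additivity mentioned here can be illustrated by the pseudo-additivity of Tsallis entropy on independent systems, S_q(A, B) = S_q(A) + S_q(B) + (1 − q) S_q(A) S_q(B) — a standard property of the entropy, checked numerically below rather than taken from the paper:

```python
import numpy as np

def tsallis(p, q):
    """Discrete Tsallis entropy S_q(p) = (1 - sum_i p_i^q)/(q - 1)."""
    p = np.asarray(p, float)
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))  # Shannon limit
    return float((1.0 - np.sum(p**q)) / (q - 1.0))

pa = np.array([0.7, 0.3])
pb = np.array([0.4, 0.4, 0.2])
joint = np.outer(pa, pb).ravel()  # joint distribution of independent A, B
q = 2.0
lhs = tsallis(joint, q)
rhs = tsallis(pa, q) + tsallis(pb, q) + (1 - q) * tsallis(pa, q) * tsallis(pb, q)
assert abs(lhs - rhs) < 1e-12  # pseudo-additivity holds exactly
```

At q = 1 the cross term vanishes and ordinary additivity of Shannon entropy is recovered.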

By utilizing different scalar equalities obtained via Hermite's interpolating polynomial, we will obtain lower and upper bounds for the difference in Ando's inequality and in the Edmundson–Lah–Ribarič inequality for solidarities that hold for a class of $n$-convex functions. As an application, the main results are applied to some operator means and relative operator entropy.

Journal: :CoRR 2006
Ambedkar Dukkipati M. Narasimha Murty Shalabh Bhatnagar

$\int_X \frac{dP}{d\mu} \ln \frac{dP}{d\mu}\, d\mu$ on a measure space $(X, \mathcal{M}, \mu)$, does not qualify itself as an information measure (it is not a natural extension of the discrete case), maximum entropy (ME) prescriptions in the measure-theoretic case are consistent with those of the discrete case. In this paper, we study the measure-theoretic definitions of generalized information measures and discuss the ME prescriptions. We prese...

2014
Miguel A. Ré Rajeev K. Azad

Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measures and its interpretability in different domains including statistical physics, information theo...
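The Jensen-Shannon divergence described in this snippet is the symmetrized, smoothed KL divergence against the midpoint mixture; a minimal sketch of the standard definition (with base-2 logarithms, so the value is bounded by 1) is:

```python
import numpy as np

def kl(p, q):
    """KL divergence in bits, skipping zero-probability terms of p."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def jsd(p, q):
    """Jensen-Shannon divergence: average KL of p and q to their midpoint m."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]
assert abs(jsd(p, q) - 0.5) < 1e-12          # half-overlapping supports
assert abs(jsd(p, q) - jsd(q, p)) < 1e-12    # symmetric, unlike plain KL
```

The smoothing through the mixture m is what keeps the JSD finite even when p and q have disjoint support, where KL itself diverges.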

2015
Leila Golshani

In this paper, we show that to obtain the Tsallis entropy rate for stochastic processes, we can use the limit of conditional entropy, as was done for the Shannon and Renyi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Renyi, Shannon and Tsallis entropy rates for stationary Gaussian proc...

2015
Siu-Wai Ho Sergio Verdú

Entropy is well known to be Schur concave on finite alphabets. Recently, the authors have strengthened the result by showing that for any pair of probability distributions P and Q with Q majorized by P, the entropy of Q is larger than the entropy of P by the amount of relative entropy D(P||Q). This result applies to P and Q defined on countable alphabets. This paper shows the counterpart of t...
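The strengthened Schur-concavity bound described here, H(Q) − H(P) ≥ D(P||Q) whenever Q is majorized by P, can be checked numerically; the distributions below are chosen purely for illustration:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in nats, ignoring zero-probability outcomes."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def kl(p, q):
    """Relative entropy D(p||q) in nats (supports assumed to match)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

def majorizes(p, q):
    """True if p majorizes q: sorted-descending partial sums of p dominate."""
    ps = np.cumsum(np.sort(p)[::-1])
    qs = np.cumsum(np.sort(q)[::-1])
    return bool(np.all(qs <= ps + 1e-12))

P = [0.7, 0.2, 0.1]
Q = [0.5, 0.3, 0.2]
assert majorizes(P, Q)
# Entropy gap dominates the relative entropy: H(Q) - H(P) >= D(P||Q)
assert entropy(Q) - entropy(P) >= kl(P, Q)
```

Plain Schur concavity only gives H(Q) ≥ H(P); the point of the strengthening is that the gap is quantified by D(P||Q).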

2018
Zhiwei Ye Juan Yang Mingwei Wang Xinlu Zong Lingyu Yan Wei Liu

Image segmentation is a significant step in image analysis and computer vision. Many entropy based approaches have been presented in this topic; among them, Tsallis entropy is one of the best performing methods. However, 1D Tsallis entropy does not make use of the spatial correlation information within the neighborhood, so results might be ruined by noise. Therefore, 2D Tsallis entropy is ...
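The 1D Tsallis thresholding that this snippet builds on can be sketched as follows: pick the gray level that maximizes the pseudo-additive sum of the Tsallis entropies of the background and foreground histograms. This is a minimal sketch of the 1D baseline only (parameter q = 0.8 and the synthetic bimodal image are illustrative assumptions), not the paper's 2D method:

```python
import numpy as np

def tsallis_threshold(image, q=0.8, levels=256):
    """1D Tsallis-entropy thresholding: maximize S_q(bg) + S_q(fg)
    + (1-q)*S_q(bg)*S_q(fg) over candidate thresholds t."""
    hist = np.bincount(image.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()
    best_t, best_score = 0, -np.inf
    for t in range(1, levels - 1):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 <= 0 or w1 <= 0:
            continue  # one class empty: skip
        s0 = (1.0 - np.sum((p[:t] / w0) ** q)) / (q - 1.0)
        s1 = (1.0 - np.sum((p[t:] / w1) ** q)) / (q - 1.0)
        score = s0 + s1 + (1.0 - q) * s0 * s1
        if score > best_score:
            best_t, best_score = t, score
    return best_t

# Synthetic bimodal "image": dark background near 50, bright object near 200
rng = np.random.default_rng(0)
img = np.clip(np.concatenate([
    rng.normal(50, 10, 5000), rng.normal(200, 10, 5000)
]), 0, 255).astype(np.uint8)
t = tsallis_threshold(img)
assert 80 < t < 170  # threshold lands in the valley between the two modes
```

The 2D variant the paper pursues replaces the gray-level histogram with a joint histogram of pixel value and local-neighborhood average, which is what restores the spatial correlation information lost here.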
