Search results for: the upper divergence

Number of results: 16,077,010

2016
Po-Ning Chen Fady Alajaji Hsin Chu

Expressions for ε-entropy rate, ε-mutual information rate, and ε-divergence rate are introduced. These quantities, which consist of the quantiles of the asymptotic information spectra, generalize the inf/sup-entropy/information/divergence rates of Han and Verdú. The algebraic properties of these information measures are rigorously analyzed, and examples illustrating their use in the computation...

2017
Po-Ning Chen Fady Alajaji

General formulas for entropy, mutual information, and divergence are established. It is revealed that these quantities are actually determined by three decisive sequences of random variables, which are, respectively, the normalized source information density, the normalized channel information density, and the normalized log-likelihood ratio. In terms of the ultimate cumulative distribution f...

Journal: :Quantum 2021

We introduce a new quantum Rényi divergence $D^{\#}_{\alpha}$ for $\alpha \in (1,\infty)$ defined in terms of a convex optimization program. This divergence has several desirable computational and operational properties, such as an efficient semidefinite programming representation for states and channels, and a chain rule property. An important property of this divergence is that its regularization is equal to the sandwiched (also known...

Journal: :Math. Comput. 2009
Long Chen Michael J. Holst Jinchao Xu

The convergence and optimality of adaptive mixed finite element methods for the Poisson equation are established in this paper. The main difficulty for mixed finite element methods is the lack of minimization principle and thus the failure of orthogonality. A quasi-orthogonality property is proved using the fact that the error is orthogonal to the divergence free subspace, while the part of the...

2010
Yury Polyanskiy Sergio Verdú

Arimoto [1] proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code on a given discrete memoryless channel. In this paper we present a simple derivation of the Arimoto converse based on the data-processing inequality for Rényi divergence. The method has two benefits. First, it generalizes to codes with feedback and gives the simplest proof of the stro...
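The data-processing inequality invoked in the abstract above says that passing two distributions through the same channel cannot increase the Rényi divergence between them. A minimal numerical sketch (function names are our own; we use the standard definition $D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log\sum_x p(x)^\alpha q(x)^{1-\alpha}$):

```python
import math

def renyi_divergence(p, q, alpha):
    # D_alpha(P||Q) = log(sum_x p(x)^alpha * q(x)^(1-alpha)) / (alpha - 1), alpha != 1
    s = sum(px ** alpha * qx ** (1 - alpha) for px, qx in zip(p, q))
    return math.log(s) / (alpha - 1)

def push_through_channel(p, W):
    # Output distribution (PW)(y) = sum_x p(x) * W[x][y], W row-stochastic
    return [sum(p[x] * W[x][y] for x in range(len(p)))
            for y in range(len(W[0]))]

p = [0.7, 0.3]
q = [0.4, 0.6]
W = [[0.9, 0.1],   # a binary channel, rows sum to 1
     [0.2, 0.8]]

alpha = 2.0
d_in = renyi_divergence(p, q, alpha)
d_out = renyi_divergence(push_through_channel(p, W),
                         push_through_channel(q, W), alpha)
# Data-processing inequality: D_alpha(PW || QW) <= D_alpha(P || Q)
assert d_out <= d_in
```

Here the inequality is strict because the channel is noisy; a deterministic lossless channel would preserve the divergence exactly.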

Journal: :Physical review letters 1994
Lavrentovich Pergamenshchik

To verify the status of the divergence K13 term in the elastic theory of liquid crystals we study submicron films placed onto an isotropic fluid substrate in the Langmuir trough (Langmuir liquid crystal, LLC). The upper and lower surfaces favor normal and tangential molecular orientation, respectively. The periodic domain phase is observed in a nematic LLC. The dependence of the periodicity L of...

2015
Junpei Komiyama Junya Honda Hisashi Kashima Hiroshi Nakagawa

We study the K-armed dueling bandit problem, a variation of the standard stochastic bandit problem where the feedback is limited to relative comparisons of a pair of arms. We introduce a tight asymptotic regret lower bound that is based on the information divergence. An algorithm that is inspired by the Deterministic Minimum Empirical Divergence algorithm (Honda and Takemura, 2010) is proposed,...

2003
A. Ben Hamza Hamid Krim

Information theoretic measures provide quantitative entropic divergences between two probability distributions or data sets. In this paper, we analyze the theoretical properties of the Jensen-Rényi divergence which is defined between any arbitrary number of probability distributions. Using the theory of majorization, we derive its maximum value, and also some performance upper bounds in terms o...
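The Jensen-Rényi divergence described above is the Rényi entropy of a weighted mixture minus the weighted Rényi entropies of its components. A minimal sketch under our own naming, using the standard Rényi entropy with $\alpha \in (0,1)$ (where the entropy is concave, so the divergence is nonnegative):

```python
import math

def renyi_entropy(p, alpha):
    # H_alpha(P) = log(sum_x p(x)^alpha) / (1 - alpha), alpha != 1
    return math.log(sum(px ** alpha for px in p if px > 0)) / (1 - alpha)

def jensen_renyi(dists, weights, alpha):
    # JR_alpha = H_alpha(sum_i w_i P_i) - sum_i w_i H_alpha(P_i)
    n = len(dists[0])
    mixture = [sum(w * d[j] for w, d in zip(weights, dists)) for j in range(n)]
    return renyi_entropy(mixture, alpha) - sum(
        w * renyi_entropy(d, alpha) for w, d in zip(weights, dists))

# Identical distributions: the divergence vanishes.
p = [0.5, 0.3, 0.2]
jr_same = jensen_renyi([p, p], [0.5, 0.5], alpha=0.5)

# Two disjoint point masses, equal weights: the divergence reaches log 2,
# consistent with the maximum-value result the abstract derives via majorization.
jr_disjoint = jensen_renyi([[1.0, 0.0], [0.0, 1.0]], [0.5, 0.5], alpha=0.5)
```

With n equally weighted, pairwise-disjoint point masses the same computation yields log n, the largest possible value.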

Journal: :CoRR 2017
Maksim E. Shirokov

We present a family of easily computable upper bounds for the Holevo quantity of an ensemble of quantum states, depending on a reference state as a free parameter. These upper bounds are obtained by combining probabilistic and metric characteristics of the ensemble. We show that an appropriate choice of the reference state gives tight upper bounds for the Holevo quantity which in many cases improve ex...

Journal: :J. Electronic Imaging 2006
A. Ben Hamza

We propose a nonextensive information-theoretic measure called the Jensen-Tsallis divergence, which may be defined between any arbitrary number of probability distributions, and we analyze its main theoretical properties. Using the theory of majorization, we also derive its performance upper bounds. To gain further insight into the robustness and the application of the Jensen-Tsallis divergence mea...
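The Jensen-Tsallis divergence has the same mixture-minus-components shape as its Rényi counterpart, but built on the nonextensive Tsallis entropy $S_q(P) = \frac{1 - \sum_x p(x)^q}{q - 1}$. A hedged sketch with our own function names:

```python
def tsallis_entropy(p, q):
    # S_q(P) = (1 - sum_x p(x)^q) / (q - 1); recovers Shannon entropy as q -> 1
    return (1 - sum(px ** q for px in p)) / (q - 1)

def jensen_tsallis(dists, weights, q):
    # JT_q = S_q(sum_i w_i P_i) - sum_i w_i S_q(P_i); nonnegative since S_q is concave for q > 0
    n = len(dists[0])
    mixture = [sum(w * d[j] for w, d in zip(weights, dists)) for j in range(n)]
    return tsallis_entropy(mixture, q) - sum(
        w * tsallis_entropy(d, q) for w, d in zip(weights, dists))

# Two disjoint point masses, equal weights, q = 2:
# S_2(mixture) = (1 - 0.5) / 1 = 0.5 and each component has S_2 = 0.
jt = jensen_tsallis([[1.0, 0.0], [0.0, 1.0]], [0.5, 0.5], q=2.0)
```

Concavity of the Tsallis entropy for q > 0 is what guarantees the measure is a genuine (nonnegative) divergence.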

[Chart: number of search results per year]
