Search results for: rényi

Number of results: 6772

Journal: Stats, 2021

This paper deals with measuring the Bayesian robustness of classes of contaminated priors. Two different classes of priors in the neighborhood of the elicited prior are considered. The first one is the well-known ϵ-contaminated class, while the second is the geometric mixing class. The proposed measure is based on computing the curvature of the Rényi divergence between posterior distributions. Examples are used to illustrate the results by using simulated and...
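As background for the divergence used in the abstract above, here is a minimal sketch (not the paper's code; the distributions are illustrative) of the Rényi divergence between two discrete distributions, with the Kullback-Leibler divergence recovered in the limit α → 1:

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(p || q) for discrete distributions.

    D_alpha = log( sum_i p_i^alpha * q_i^(1-alpha) ) / (alpha - 1).
    As alpha -> 1 it converges to the Kullback-Leibler divergence.
    """
    if alpha == 1.0:
        # Kullback-Leibler divergence as the limiting case
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

# illustrative distributions (hypothetical values)
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(renyi_divergence(p, q, 0.5))
print(renyi_divergence(p, q, 2.0))
```

The divergence is zero when the two distributions coincide and, for fixed p and q, is non-decreasing in α.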

2001
Yun He, A. Ben Hamza, Hamid Krim

Entropy-based divergence measures have shown promising results in many areas of engineering and image processing. In this paper, a generalized information-theoretic measure called the Jensen-Rényi divergence is proposed. Some of its properties, such as convexity and its upper bound, are derived. Using the Jensen-Rényi divergence, we propose a new approach to the problem of ISAR (Inverse Synthetic Aperture R...
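The Jensen-Rényi divergence described above contrasts the Rényi entropy of a mixture with the mixture of the component entropies. A hedged sketch (my own illustration, not the authors' code), assuming equal-length discrete distributions and α ∈ (0, 1), for which Rényi entropy is concave and the divergence non-negative:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum p_i^alpha) / (1 - alpha), alpha != 1."""
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

def jensen_renyi(dists, weights, alpha):
    """Jensen-Rényi divergence: entropy of the weighted mixture
    minus the weighted mixture of entropies."""
    n = len(dists[0])
    mixture = [sum(w * d[i] for w, d in zip(weights, dists)) for i in range(n)]
    return renyi_entropy(mixture, alpha) - sum(
        w * renyi_entropy(d, alpha) for w, d in zip(weights, dists)
    )

# two maximally dissimilar point masses (illustrative values)
print(jensen_renyi([[1.0, 0.0], [0.0, 1.0]], [0.5, 0.5], 0.5))
```

Identical component distributions give a divergence of zero; disjointly supported ones maximize it.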

2012
Yilun Shang

A common property of many, though not all, massive real-world networks, including the World Wide Web, the Internet, and social networks, is that the connectivity of the various nodes follows a scale-free distribution, P(k) ∝ k^(−α), with typical scaling exponent 2 ≤ α ≤ 3. In this letter, we prove that the Erdős–Rényi random graph with unbounded expected degrees has a scale-free behaviour with scal...

2003
A. Ben Hamza, Hamid Krim

Information-theoretic measures provide quantitative entropic divergences between two probability distributions or data sets. In this paper, we analyze the theoretical properties of the Jensen-Rényi divergence, which is defined for an arbitrary number of probability distributions. Using the theory of majorization, we derive its maximum value, and also some performance upper bounds in terms o...

2010
Balaji Vasan Srinivasan, Ramani Duraiswami, Dmitry N. Zotkin

Speaker recognition systems classify a test signal as a speaker or an imposter by evaluating a matching score between input and reference signals. We propose a new information theoretic approach for computation of the matching score using the Rényi entropy. The proposed entropic distance, the Kernelized Rényi distance (KRD), is formulated in a non-parametric way and the resulting measure is eff...

Journal: Entropy, 2011
Andreia Teixeira, Armando Matos, André Souto, Luis Filipe Coelho Antunes

Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for the Rényi and Tsallis entropies of order α, showing that it only holds for α = 1. Regarding a time-bounded analogue relationship, we s...
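The α = 1 special case mentioned above can be checked numerically: both the Rényi and Tsallis entropies of order α converge to the Shannon entropy as α → 1. A small sketch with illustrative values only:

```python
import math

def shannon(p):
    """Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    """Rényi entropy of order alpha (alpha != 1)."""
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

def tsallis(p, alpha):
    """Tsallis entropy of order alpha (alpha != 1)."""
    return (1 - sum(pi ** alpha for pi in p if pi > 0)) / (alpha - 1)

p = [0.6, 0.25, 0.15]  # hypothetical distribution
for alpha in (0.999, 1.001):
    print(renyi(p, alpha), tsallis(p, alpha), shannon(p))
```

For α away from 1 the three quantities differ; for instance, the Rényi entropy is non-increasing in α, so H_2(p) ≤ H(p).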

2016
Luc Bégin, Pascal Germain, François Laviolette, Jean-Francis Roy

We propose a simplified proof process for PAC-Bayesian generalization bounds that allows one to divide the proof into four successive inequalities, easing the “customization” of PAC-Bayesian theorems. We also propose a family of PAC-Bayesian bounds based on the Rényi divergence between the prior and posterior distributions, whereas most PAC-Bayesian bounds are based on the Kullback-Leibler divergence....

2001
Richard M. Dansereau, Witold Kinsner

This paper introduces a new class of fractal dimension measures which we call relative multifractal measures. The relative multifractal measures developed are formed through a melding of the Rényi dimension spectrum, which is based on the Rényi generalized entropy, and relative entropy as given by the Kullback-Leibler distance. This new class of multifractal measures is then used to find the ...

2011
Ion Nechita

In this paper we obtain new bounds for the minimum output entropies of random quantum channels. These bounds rely on random matrix techniques arising from free probability theory. We then revisit the counterexamples developed by Hayden and Winter to get violations of the additivity equalities for minimum output Rényi entropies. We show that random channels obtained by randomly coupling the inpu...

2010
Yury Polyanskiy, Sergio Verdú

Arimoto [1] proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code on a given discrete memoryless channel. In this paper we present a simple derivation of the Arimoto converse based on the data-processing inequality for Rényi divergence. The method has two benefits. First, it generalizes to codes with feedback and gives the simplest proof of the stro...
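The data-processing inequality invoked above states that passing two distributions through the same channel W cannot increase their Rényi divergence: D_α(PW ‖ QW) ≤ D_α(P ‖ Q). A toy numerical check (hypothetical distributions and a made-up binary channel, not the paper's construction):

```python
import math

def renyi_div(p, q, alpha):
    """Rényi divergence of order alpha between discrete distributions."""
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

def apply_channel(p, W):
    """Push a distribution through a row-stochastic channel matrix W."""
    return [sum(p[i] * W[i][j] for i in range(len(p))) for j in range(len(W[0]))]

P = [0.7, 0.3]
Q = [0.4, 0.6]
W = [[0.9, 0.1], [0.2, 0.8]]  # an illustrative binary channel

print(renyi_div(P, Q, 2.0))                                      # divergence at the input
print(renyi_div(apply_channel(P, W), apply_channel(Q, W), 2.0))  # divergence at the output
```

The output divergence is strictly smaller here because the channel is noisy; a deterministic permutation channel would leave the divergence unchanged.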

[Chart: number of search results per year]