Search results for: leibler distance

Number of results: 244184

2009
Rudolf Kulhavý

The role of Kerridge inaccuracy, Shannon entropy and Kullback-Leibler distance in statistical estimation is shown for both discrete and continuous observations. The cases of data independence and regression-type dependence are considered in parallel. Pythagorean-like relations valid for probability distributions are presented and their importance for estimation under compressed data is indicated.
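As an illustrative sketch (not drawn from the paper above), the Kullback-Leibler distance between two discrete distributions p and q is D(p || q) = sum_i p_i * log(p_i / q_i); the function name and example distributions below are hypothetical:

```python
import math

def kl_distance(p, q):
    """Kullback-Leibler distance between discrete distributions p and q.

    p and q are sequences of probabilities over the same support; terms
    with p_i = 0 contribute nothing by the convention 0 * log(0) = 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(round(kl_distance(p, q), 4))  # → 0.0253
print(kl_distance(p, p))            # → 0.0 (distance to itself vanishes)
```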

1997
Lin Xu

We develop a new framework to calibrate stochastic volatility option pricing models to an arbitrary prescribed set of prices of liquidly traded options. Our approach produces an arbitrage-free stochastic volatility diffusion process that minimizes the distance to a prior diffusion model. We use the notion of relative entropy (also known as the Kullback-Leibler distance) to quantify the...

Journal: J. Global Optimization, 2009
Orizon Pereira Ferreira, P. Roberto Oliveira, R. C. M. Silva

The convergence of the primal and dual central paths associated with entropy and exponential functions, respectively, for the semidefinite programming problem is studied in this paper. As an application, the proximal point method with the Kullback-Leibler distance applied to semidefinite programming problems is considered, and the convergence of the primal and dual sequences is proved.
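A much simpler setting than the semidefinite one above can illustrate the proximal point method with the Kullback-Leibler distance: minimizing a linear cost c^T x over the probability simplex. Each step x_{k+1} = argmin_x { c^T x + (1/t) * KL(x, x_k) } has the closed-form multiplicative update x_i <- x_i * exp(-t * c_i) followed by renormalization. This sketch and its names are hypothetical, not taken from the paper:

```python
import math

def kl_proximal_steps(c, x, t=1.0, iters=50):
    """Entropic (KL) proximal point iterations for a linear cost on the simplex.

    Each iteration applies the closed-form multiplicative update
    x_i <- x_i * exp(-t * c_i), then renormalizes x to sum to 1.
    """
    for _ in range(iters):
        x = [xi * math.exp(-t * ci) for xi, ci in zip(x, c)]
        s = sum(x)
        x = [xi / s for xi in x]
    return x

c = [0.3, 0.1, 0.5]                       # index 1 has the smallest cost
x = kl_proximal_steps(c, [1/3, 1/3, 1/3])
print(max(range(3), key=lambda i: x[i]))  # → 1 (mass concentrates on index 1)
```

The iterates stay strictly inside the simplex automatically, which is exactly the appeal of the KL distance as a proximal term over positivity-constrained domains.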

2002
Nasir Rajpoot

This paper addresses the issue of selecting features from a given wavelet packet subband decomposition that are most useful for texture classification in an image. A functional measure based on the Kullback-Leibler distance is proposed as a way to select the most discriminant subbands. Experimental results show superior performance in terms of classification error rates.

Journal: Journal of Machine Learning Research, 2015
Vladimir Nikulin

In this paper we formulate in general terms an approach to prove strong consistency of the Empirical Risk Minimisation inductive principle applied to the prototype or distance based clustering. This approach was motivated by the Divisive Information-Theoretic Feature Clustering model in probabilistic space with Kullback-Leibler divergence, which may be regarded as a special case within the Clus...

2010
S. Tahmasebi, J. Behboodian

In the present paper, Shannon's entropy for concomitants of generalized order statistics in the FGM family is obtained. Application of this result is given for order statistics, record values, k-record values, and progressive type II censored order statistics. We also show that the Kullback-Leibler distance among the concomitants of generalized order statistics is distribution-free.

2011
Ahmed Drissi El Maliani, Mohammed El Hassouni, Noureddine Lasmar, Yannick Berthoumieu, Driss Aboutajdine

This paper presents a new similarity measure based on Rao distance for color texture classification or retrieval. Textures are characterized by a joint model of complex wavelet coefficients. This model is based on a Gaussian Copula in order to consider the dependency between color components. Then, a closed form of Rao distance is computed to measure the difference between two Gaussian Copula b...

Journal: Simulation, 2007
Zdravko I. Botev, Dirk P. Kroese, Thomas Taimre

The cross-entropy and minimum cross-entropy methods are well-known Monte Carlo simulation techniques for rare-event probability estimation and optimization. In this paper, we investigate how these methods can be extended to provide a general non-parametric cross-entropy framework based on φ-divergence distance measures. We show how the χ² distance, in particular, yields a viable alternative to...
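For reference, the χ² distance between two discrete distributions is χ²(p, q) = Σᵢ (pᵢ − qᵢ)² / qᵢ, one member of the φ-divergence family alongside the Kullback-Leibler distance. A minimal sketch (the function name and inputs are hypothetical, not from the paper):

```python
def chi2_distance(p, q):
    """Chi-squared distance between discrete distributions p and q.

    Assumes q_i > 0 on the shared support.
    """
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

print(round(chi2_distance([0.5, 0.3, 0.2], [0.4, 0.4, 0.2]), 4))  # → 0.05
```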

2001
A. Lazarevic, D. Pokrajac, V. Megalooikonomou, Z. Obradovic

To facilitate the process of discovering brain structure-function associations from image and clinical data, we have developed classification tools for brain image data that are based on measures of dissimilarity between probability distributions. We propose statistical as well as non-statistical methods for classifying three-dimensional probability distributions of regions of interest (ROIs) i...

2003
Hermann Ney

We present two novel bounds for the classification error that, at the same time, can be used as practical training criteria. Unlike the bounds reported in the literature so far, these novel bounds are based on a strict distinction between the true but unknown distribution and the model distribution, which is used in the decision rule. The two bounds we derive are the squared distance and the Ku...

Chart of the number of search results per year

Click on the chart to filter the results by publication year