Search results for: kullback leibler distance

Number of results: 244274

2003
Brigitte Bigi

A system that performs text categorization aims to assign appropriate categories from a predefined classification scheme to incoming documents. These assignments might be used for varied purposes such as filtering or retrieval. This paper introduces a new, effective model for text categorization on a large corpus (roughly 1 million documents). Text categorization is performed using the Kul...
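The abstract is truncated, but it appears to describe categorization based on the Kullback-Leibler distance between word distributions. The following is a minimal, hypothetical sketch of that general idea (not the paper's exact method): each category is represented by a smoothed unigram distribution, and a document is assigned to the category with the smallest KL distance. All function names are illustrative.

```python
# Hypothetical sketch of KL-based text categorization: each category is a
# smoothed unigram word distribution, and an incoming document is assigned
# to the category whose distribution is closest in KL divergence.
import math
from collections import Counter

def unigram_distribution(tokens, vocab, alpha=0.1):
    """Laplace-smoothed unigram probabilities over a fixed vocabulary."""
    counts = Counter(tokens)
    total = len(tokens) + alpha * len(vocab)
    return {w: (counts[w] + alpha) / total for w in vocab}

def kl_divergence(p, q):
    """KL(p || q) for two distributions over the same vocabulary."""
    return sum(p[w] * math.log(p[w] / q[w]) for w in p if p[w] > 0)

def categorize(document_tokens, category_corpora):
    vocab = set(document_tokens)
    for tokens in category_corpora.values():
        vocab.update(tokens)
    doc_dist = unigram_distribution(document_tokens, vocab)
    scores = {
        cat: kl_divergence(doc_dist, unigram_distribution(tokens, vocab))
        for cat, tokens in category_corpora.items()
    }
    return min(scores, key=scores.get)  # category with smallest KL distance

if __name__ == "__main__":
    corpora = {
        "sports": "goal match team player score goal win".split(),
        "finance": "market stock price trade bond market".split(),
    }
    print(categorize("the team scored a late goal".split(), corpora))
```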

Journal: Statistica Sinica 2015
Xinyu Zhang Guohua Zou Raymond J Carroll

This paper proposes a model averaging method based on Kullback-Leibler distance under a homoscedastic normal error term. The resulting model average estimator is proved to be asymptotically optimal. When combining least squares estimators, the model average estimator is shown to have the same large sample properties as the Mallows model average (MMA) estimator developed by Hansen (2007). We sho...
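As a small illustration of why a Kullback-Leibler criterion under a homoscedastic normal error term connects to squared-error criteria such as Mallows', note that for two normal models with a common variance the KL distance reduces to a scaled squared difference of the means. The sketch below verifies this identity numerically; it is illustrative only and is not the authors' estimator.

```python
# Sketch: for homoscedastic normal models N(mu1, sigma^2 I) and N(mu2, sigma^2 I),
# the Kullback-Leibler distance reduces to ||mu1 - mu2||^2 / (2 sigma^2),
# which is why KL-based model averaging relates to squared-error criteria.
import numpy as np

def kl_normal_common_variance(mu1, mu2, sigma2):
    """KL( N(mu1, sigma2*I) || N(mu2, sigma2*I) )."""
    diff = np.asarray(mu1) - np.asarray(mu2)
    return float(diff @ diff) / (2.0 * sigma2)

def kl_normal_general(mu1, cov1, mu2, cov2):
    """General multivariate-normal KL divergence, used to check the shortcut."""
    mu1, mu2 = np.asarray(mu1), np.asarray(mu2)
    k = mu1.size
    cov2_inv = np.linalg.inv(cov2)
    diff = mu2 - mu1
    return 0.5 * (np.trace(cov2_inv @ cov1) - k
                  + diff @ cov2_inv @ diff
                  + np.log(np.linalg.det(cov2) / np.linalg.det(cov1)))

if __name__ == "__main__":
    mu1, mu2, sigma2 = [1.0, 2.0, 3.0], [1.5, 1.0, 2.5], 0.5
    cov = sigma2 * np.eye(3)
    print(kl_normal_common_variance(mu1, mu2, sigma2))  # shortcut
    print(kl_normal_general(mu1, cov, mu2, cov))         # agrees
```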

Journal: Pattern Recognition Letters 2005
Frans Coetzee

A frequent practice in feature selection is to maximize the Kullback-Leibler (K-L) distance between target classes. In this note we show that this common custom is frequently suboptimal, since it fails to take into account the fact that classification occurs using a finite number of samples. In classification, the variance and higher order moments of the likelihood function should be taken into...
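For context, the sketch below shows the common practice the note critiques: ranking features by an estimated KL distance between the two classes, here assuming univariate Gaussian class-conditional densities fitted from finite samples. The setup and names are illustrative assumptions, not the paper's experiment.

```python
# Sketch of the practice this note critiques: rank features by the estimated
# (symmetrized) Kullback-Leibler distance between two classes, assuming
# univariate Gaussian class-conditional densities fitted from finite samples.
import numpy as np

def kl_gauss_1d(mu1, var1, mu2, var2):
    """KL( N(mu1, var1) || N(mu2, var2) ) for univariate Gaussians."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def rank_features_by_kl(X0, X1):
    """Score each feature by symmetrized KL between class 0 and class 1."""
    scores = []
    for j in range(X0.shape[1]):
        m0, v0 = X0[:, j].mean(), X0[:, j].var(ddof=1)
        m1, v1 = X1[:, j].mean(), X1[:, j].var(ddof=1)
        scores.append(kl_gauss_1d(m0, v0, m1, v1) + kl_gauss_1d(m1, v1, m0, v0))
    return np.argsort(scores)[::-1]  # best (largest KL) first

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Feature 0 separates the classes, feature 1 is pure noise.
    X0 = np.column_stack([rng.normal(0.0, 1.0, 50), rng.normal(0.0, 1.0, 50)])
    X1 = np.column_stack([rng.normal(2.0, 1.0, 50), rng.normal(0.0, 1.0, 50)])
    print(rank_features_by_kl(X0, X1))  # expect feature 0 ranked first
```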

Journal: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences 2020

Journal: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics 2007
Michele Tumminello Fabrizio Lillo Rosario N Mantegna

We show that the Kullback-Leibler distance is a good measure of the statistical uncertainty of correlation matrices estimated by using a finite set of data. For correlation matrices of multivariate Gaussian variables we analytically determine the expected values of the Kullback-Leibler distance of a sample correlation matrix from a reference model and we show that the expected values are known ...
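The abstract's analytic expressions are not reproduced here, but the quantity involved, the KL distance between zero-mean multivariate Gaussians specified by correlation matrices, is straightforward to compute, and its expected value for sample correlation matrices can be approximated by simulation. The sketch below is an illustrative assumption-laden stand-in, not the paper's closed-form result.

```python
# Sketch: Kullback-Leibler distance between two zero-mean multivariate Gaussians
# given their correlation matrices, plus a Monte Carlo estimate of its expected
# value when the second matrix is a sample correlation matrix.
import numpy as np

def kl_gaussian(sigma_ref, sigma_est):
    """KL( N(0, sigma_ref) || N(0, sigma_est) )."""
    n = sigma_ref.shape[0]
    inv_est = np.linalg.inv(sigma_est)
    _, logdet_ref = np.linalg.slogdet(sigma_ref)
    _, logdet_est = np.linalg.slogdet(sigma_est)
    return 0.5 * (logdet_est - logdet_ref + np.trace(inv_est @ sigma_ref) - n)

def expected_kl_monte_carlo(corr_ref, n_samples, n_trials=200, seed=0):
    """Average KL of sample correlation matrices (n_samples observations each)
    from the reference model, estimated by simulation."""
    rng = np.random.default_rng(seed)
    n = corr_ref.shape[0]
    total = 0.0
    for _ in range(n_trials):
        data = rng.multivariate_normal(np.zeros(n), corr_ref, size=n_samples)
        total += kl_gaussian(corr_ref, np.corrcoef(data, rowvar=False))
    return total / n_trials

if __name__ == "__main__":
    corr_ref = np.array([[1.0, 0.3, 0.1],
                         [0.3, 1.0, 0.2],
                         [0.1, 0.2, 1.0]])
    print(expected_kl_monte_carlo(corr_ref, n_samples=100))
```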

Journal: Kybernetika 2006
Nihat Ay Andreas Knauf

Stochastic interdependence of a probability distribution on a product space is measured by its Kullback-Leibler distance from the exponential family of product distributions (called multi-information). Here we investigate low-dimensional exponential families that contain the maximizers of stochastic interdependence in their closure. Based on a detailed description of the structure of probabili...
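Multi-information itself is simple to compute for a small discrete joint distribution: it is the KL distance of the joint from the product of its marginals. The sketch below illustrates this definition only; it does not touch the paper's results on maximizers or exponential families.

```python
# Minimal sketch of multi-information: the Kullback-Leibler distance of a joint
# distribution from the product of its marginals, computed for a small discrete
# distribution given as an n-dimensional probability array.
import numpy as np

def multi_information(joint):
    """KL( joint || product of marginals ), in nats."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()
    product = np.ones_like(joint)
    for axis in range(joint.ndim):
        other_axes = tuple(a for a in range(joint.ndim) if a != axis)
        marginal = joint.sum(axis=other_axes)
        shape = [1] * joint.ndim
        shape[axis] = joint.shape[axis]
        product = product * marginal.reshape(shape)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / product[mask])))

if __name__ == "__main__":
    # Two perfectly correlated binary variables: multi-information = ln 2.
    perfectly_correlated = np.array([[0.5, 0.0],
                                     [0.0, 0.5]])
    independent = np.outer([0.5, 0.5], [0.5, 0.5])
    print(multi_information(perfectly_correlated))  # ~0.693
    print(multi_information(independent))           # ~0.0
```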

2009
R. Kulhavý F. J. Kraus

Regularized (stabilized) versions of exponential and linear forgetting in parameter tracking are shown to be dual to each other. Both are derived by solving essentially the same Bayesian decision-making problem where Kullback-Leibler divergence is used to measure (quasi)distance between posterior probability distributions of estimated parameters. The type of forgetting depends solely on the ord...
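One common way regularized exponential forgetting is realized in practice is to mix the posterior information matrix geometrically with an alternative (prior) information matrix before each recursive-least-squares update. The sketch below illustrates that idea only; the details are an assumption for illustration and do not follow the authors' derivation or the linear-forgetting dual.

```python
# Illustrative sketch of regularized (stabilized) exponential forgetting in
# recursive least squares: old information decays by mixing the posterior
# information matrix with an alternative prior, so the covariance cannot blow up.
import numpy as np

def rls_step(theta, P, x, y, lam=0.95, P_alt=None, noise_var=1.0):
    """One RLS update with regularized exponential forgetting (illustrative)."""
    if P_alt is None:
        P_alt = np.eye(len(theta)) * 100.0  # flat alternative prior
    # Forgetting: geometric mixing of information matrices.
    P = np.linalg.inv(lam * np.linalg.inv(P) + (1.0 - lam) * np.linalg.inv(P_alt))
    # Standard measurement update.
    x = np.asarray(x, dtype=float)
    k = P @ x / (noise_var + x @ P @ x)   # gain vector
    theta = theta + k * (y - x @ theta)   # parameter update
    P = P - np.outer(k, x) @ P            # covariance update
    return theta, P

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    theta, P = np.zeros(2), np.eye(2) * 100.0
    true_theta = np.array([2.0, -1.0])
    for _ in range(200):
        x = rng.normal(size=2)
        y = x @ true_theta + 0.1 * rng.normal()
        theta, P = rls_step(theta, P, x, y)
    print(theta)  # should approach [2.0, -1.0]
```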

2018
Neda Lovričević Ðilda Pečarić Josip Pečarić

Motivated by the method of interpolating inequalities that makes use of improved Jensen-type inequalities, in this paper we integrate this approach with the well-known Zipf-Mandelbrot law applied to various types of f-divergences and distances, such as the Kullback-Leibler divergence, Hellinger distance, Bhattacharyya distance (via its coefficient), [Formula: see text]-divergence, total variation ...
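For reference, the distances named in the abstract are easy to evaluate directly between two Zipf-Mandelbrot distributions. The sketch below does exactly that with illustrative parameter values; it is a plain numerical companion, not the paper's inequality-based bounds.

```python
# Sketch: Kullback-Leibler, Hellinger, Bhattacharyya (via its coefficient), and
# total variation distances between two Zipf-Mandelbrot distributions.
import numpy as np

def zipf_mandelbrot(n, q, s):
    """Zipf-Mandelbrot law: p(i) proportional to 1 / (i + q)^s, for i = 1..n."""
    weights = 1.0 / (np.arange(1, n + 1) + q) ** s
    return weights / weights.sum()

def kullback_leibler(p, q):
    return float(np.sum(p * np.log(p / q)))

def hellinger(p, q):
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

def bhattacharyya(p, q):
    coefficient = float(np.sum(np.sqrt(p * q)))
    return -np.log(coefficient)

def total_variation(p, q):
    return 0.5 * float(np.sum(np.abs(p - q)))

if __name__ == "__main__":
    p = zipf_mandelbrot(n=1000, q=2.0, s=1.2)
    q_dist = zipf_mandelbrot(n=1000, q=5.0, s=1.4)
    print("KL             :", kullback_leibler(p, q_dist))
    print("Hellinger      :", hellinger(p, q_dist))
    print("Bhattacharyya  :", bhattacharyya(p, q_dist))
    print("Total variation:", total_variation(p, q_dist))
```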

[Chart: number of search results per year]