Search results for: kullback leibler distance

Number of results: 244274

The extension of classical analysis to time series data is a basic problem faced in many fields, such as engineering, economics, and medicine. The main objective of discriminant time series analysis is to examine how far it is possible to distinguish between various groups. There are two situations to be considered in linear time series models: firstly, when the main discriminatory informati...
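For context, discriminant criteria for stationary Gaussian time series are typically expressed through their spectral densities f_1 and f_2. The truncated abstract does not give the paper's criterion, so the following is the standard asymptotic Kullback-Leibler discrimination rate per observation, stated only for reference:

% Asymptotic KL discrimination information per observation between two
% zero-mean stationary Gaussian processes with spectral densities f_1, f_2.
D(f_1 \,\|\, f_2) \;=\; \frac{1}{4\pi} \int_{-\pi}^{\pi} \left[ \frac{f_1(\lambda)}{f_2(\lambda)} - \log\frac{f_1(\lambda)}{f_2(\lambda)} - 1 \right] d\lambda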

2011
Bo Peng Yao Qian Frank K. Soong Bo Zhang

Misspelled queries caused by homophones or mispronunciation are difficult to correct with conventional spelling correction methods. In phonetic candidate generation, the generator produces candidates that are phonetically similar to a given query. In this paper, we present a new phonetic candidate generator for improving the search efficiency of a query. The proposed generator consists o...

2016
Philip S. Thomas Bruno Castro da Silva Christoph Dann Emma Brunskill

We propose a new class of algorithms for minimizing or maximizing functions of parametric probabilistic models. These new algorithms are natural gradient algorithms that leverage more information than prior methods by using a new metric tensor in place of the commonly used Fisher information matrix. This new metric tensor is derived by computing directions of steepest ascent where the distance ...
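The abstract is cut off before the new metric is defined; as a baseline, here is a minimal sketch of the standard natural-gradient step that these algorithms generalize, using the Fisher information matrix as the metric tensor (the paper's contribution is a different metric, not shown here):

import numpy as np

def natural_gradient_step(theta, grad, metric, lr=0.1):
    # Natural gradient ascent: precondition the ordinary gradient by the
    # inverse of the metric tensor. Classically `metric` is the Fisher
    # information matrix; the paper proposes replacing it with a new metric.
    return theta + lr * np.linalg.solve(metric, grad)

# Hypothetical two-parameter example with a diagonal Fisher matrix:
theta = np.zeros(2)
grad = np.array([1.0, -0.5])
fisher = np.diag([2.0, 0.5])
theta = natural_gradient_step(theta, grad, fisher)
print(theta)  # [ 0.05 -0.1 ]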

Journal: Applied Intelligence 2023

In this article, we address the issues of stability and data-efficiency in reinforcement learning (RL). A novel RL approach, Kullback-Leibler (KL) divergence-regularized distributional RL (KL-C51), is proposed to integrate the advantages of both in one framework. KL-C51 derives a Bellman equation with TD errors regularized by KL divergence from a distributional perspective, and explores approximation strategies for properly mapping the correspondin...
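KL-C51 builds on C51-style categorical value distributions over a fixed set of atoms, and the regularizer is a KL divergence between two such distributions. A minimal sketch of that term (a generic illustration, not the paper's algorithm):

import numpy as np

def kl_categorical(p, q, eps=1e-12):
    # D(p || q) between two categorical distributions defined on the same
    # fixed support, e.g. C51's value atoms.
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

# Hypothetical target and predicted value distributions over 5 atoms:
target = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
pred = np.array([0.25, 0.25, 0.2, 0.2, 0.1])
print(kl_categorical(target, pred))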

2015
Sergey Bobkov Gennadiy Chistyakov

Optimal stability estimates in the class of regularized distributions are derived for the characterization of normal laws in Cramér's theorem with respect to relative entropy and the Fisher information distance.
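For reference, the two distances named here have standard definitions for a random variable X with smooth density p, compared against a standard normal Z with density \varphi (the usual definitions, not quoted from the paper):

D(X \,\|\, Z) = \int_{-\infty}^{\infty} p(x) \log\frac{p(x)}{\varphi(x)}\, dx,
\qquad
I(X \,\|\, Z) = \int_{-\infty}^{\infty} \left( \frac{p'(x)}{p(x)} - \frac{\varphi'(x)}{\varphi(x)} \right)^{2} p(x)\, dx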

2001
Damiano Brigo Fabio Mercurio Francesco Rapisarda Rita Scotti

The aim of this paper is to present two moment-matching procedures for basket-options pricing and to test their distributional approximations via distances on the space of probability densities, the Kullback-Leibler information (KLI) and the Hellinger distance (HD). We are interested in measuring the KLI and the HD between the real simulated basket terminal distribution and the distributions used ...
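Both distances can be estimated on a common grid once the two densities are available; here is a generic numerical sketch using the standard definitions (not the paper's test procedure), comparing two hypothetical lognormal densities:

import numpy as np

def kli(p, q, dx, eps=1e-300):
    # Kullback-Leibler information: integral of p * log(p / q).
    return float(np.sum(p * np.log((p + eps) / (q + eps))) * dx)

def hellinger(p, q, dx):
    # Hellinger distance: sqrt(0.5 * integral of (sqrt(p) - sqrt(q))^2).
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2) * dx))

# Hypothetical example: two lognormal densities (sigma = 0.5) on a grid.
def lognormal(x, mu, sigma):
    return np.exp(-(np.log(x) - mu) ** 2 / (2 * sigma ** 2)) / (x * sigma * np.sqrt(2 * np.pi))

x = np.linspace(0.01, 10.0, 2000)
dx = x[1] - x[0]
p, q = lognormal(x, 0.0, 0.5), lognormal(x, 0.1, 0.5)
print(kli(p, q, dx), hellinger(p, q, dx))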

Journal: CoRR 2016
Kajsa Møllersen Subhra S. Dhar Fred Godtliebsen

Hybrid clustering combines partitional and hierarchical clustering for computational effectiveness and versatility in cluster shape. In such clustering, a dissimilarity measure plays a crucial role in the hierarchical merging. The dissimilarity measure has a great impact on the final clustering, and data-independent properties are needed to choose the right dissimilarity measure for the problem a...
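To make the role of the dissimilarity concrete, here is a toy hierarchical-merging loop that fits a one-dimensional Gaussian to each cluster and merges the pair with the smallest symmetric KL divergence; this particular measure is a hypothetical choice for illustration, not the one analyzed in the paper:

import numpy as np

def kl_gauss(m1, v1, m2, v2):
    # KL divergence between univariate Gaussians N(m1, v1) and N(m2, v2).
    return 0.5 * (np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)

def sym_kl(c1, c2):
    # Symmetrized KL between Gaussians fitted to two clusters.
    m1, v1 = np.mean(c1), np.var(c1) + 1e-9
    m2, v2 = np.mean(c2), np.var(c2) + 1e-9
    return kl_gauss(m1, v1, m2, v2) + kl_gauss(m2, v2, m1, v1)

# Start from a partitional result (three small clusters) and merge the
# least dissimilar pair until two clusters remain.
clusters = [np.array([0.0, 0.1]), np.array([0.2, 0.3]), np.array([5.0, 5.2])]
while len(clusters) > 2:
    pairs = [(i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))]
    i, j = min(pairs, key=lambda ij: sym_kl(clusters[ij[0]], clusters[ij[1]]))
    clusters[i] = np.concatenate([clusters[i], clusters[j]])
    del clusters[j]
print(clusters)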

2008
Pedro Miguel Correia Guerreiro

We propose new algorithms for computing linear discriminants to perform data dimensionality reduction from R^n to R^p, with p < n. We propose alternatives to the classical Fisher's distance criterion; namely, we investigate new criteria based on the Chernoff distance, the J-divergence, and the Kullback-Leibler divergence. The optimization problems that emerge from using these alternative criteria are non-c...
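Under Gaussian class-conditional densities these criteria admit closed forms; for instance, the KL divergence between N(\mu_0, \Sigma_0) and N(\mu_1, \Sigma_1) in R^n is the standard identity below (stated for reference; the paper's exact objective is not shown in the truncated abstract):

D_{\mathrm{KL}}\big(\mathcal{N}(\mu_0,\Sigma_0)\,\|\,\mathcal{N}(\mu_1,\Sigma_1)\big)
= \frac{1}{2}\left[\operatorname{tr}\big(\Sigma_1^{-1}\Sigma_0\big)
+ (\mu_1-\mu_0)^{\top}\Sigma_1^{-1}(\mu_1-\mu_0)
- n + \log\frac{\det\Sigma_1}{\det\Sigma_0}\right]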

Journal: :Foundations and Trends in Communications and Information Theory 2004
Imre Csiszár Paul C. Shields

This tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting. The information measure known as information divergence or Kullback-Leibler distance or relative entropy plays a key role, often with a geometric flavor as an analogue of squared Euclidean distance, as in the concepts of I-projection, I-radius and I-centroid. The topics cov...
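For reference, the central quantities named here have standard finite-alphabet definitions: the information divergence of P from Q on alphabet A, and the I-projection of Q onto a convex set \Pi of distributions (standard definitions, consistent with this tutorial's setting):

D(P \,\|\, Q) = \sum_{a \in A} P(a) \log\frac{P(a)}{Q(a)},
\qquad
P^{*} = \operatorname*{arg\,min}_{P \in \Pi} D(P \,\|\, Q)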

[Chart: number of search results per year]