Search results for: leibler distance
Number of results: 244184 — filter results by year:
In this paper we prove the optimality of an aggregation procedure. We establish lower bounds for model-selection-type aggregation of M density estimators under the Kullback-Leibler divergence (KL), the Hellinger distance and the L1-distance. The lower bound with respect to the KL distance can be achieved by the online-type estimate suggested, among others, by Yang (2000a). Combining these res...
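The three distances named in this abstract can be computed directly for discrete densities. A minimal sketch with toy distributions `p` and `q` (the values are illustrative, not from the paper):

```python
import numpy as np

# Two discrete probability densities on the same support (toy example).
p = np.array([0.1, 0.4, 0.5])
q = np.array([0.2, 0.3, 0.5])

# Kullback-Leibler divergence KL(p || q) = sum_i p_i * log(p_i / q_i)
kl = np.sum(p * np.log(p / q))

# Hellinger distance H(p, q) = (1/sqrt(2)) * || sqrt(p) - sqrt(q) ||_2
hellinger = np.linalg.norm(np.sqrt(p) - np.sqrt(q)) / np.sqrt(2)

# L1 distance (twice the total variation distance)
l1 = np.sum(np.abs(p - q))
```

Note that KL is asymmetric and unbounded, while the Hellinger distance is a true metric bounded by 1 — one reason lower bounds are stated separately for each.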
The inequality containing the Csiszár divergence on time scales is generalized for 2n-convex functions by using the Lidstone interpolating polynomial. As an application, new entropic bounds are also computed. Several inequalities in quantum calculus and h-discrete calculus are established, along with the relationship of Shannon entropy, the Kullback-Leibler divergence and the Jeffreys distance with the Zipf-Mandelbrot entropy.
A query misspelled due to homophones or mispronunciation is difficult to correct with conventional spelling correction methods. In phonetic candidate generation, the generator produces candidates that are phonetically similar to a given query. In this paper, we present a new phonetic candidate generator for improving the search efficiency of a query. The proposed generator consists o...
Optimal stability estimates in the class of regularized distributions are derived for the characterization of normal laws in Cramér's theorem with respect to relative entropy and the Fisher information distance.
The aim of this paper is to present two moment matching procedures for basket options pricing and to test their distributional approximations via distances on the space of probability densities, the Kullback-Leibler information (KLI) and the Hellinger distance (HD). We are interested in measuring the KLI and the HD between the real simulated basket terminal distribution and the distributions used ...
Distance measures between quantum states like the trace distance and the fidelity can naturally be defined by optimizing a classical distance measure over all measurement statistics that can be obtained from the respective quantum states. In contrast, Petz showed that the measured relative entropy, defined as a maximization of the Kullback-Leibler divergence over projective measurement statisti...
The Bayesian variable selection method proposed in the paper is based on evaluating the Kullback-Leibler distance between the full (or encompassing) model and the submodels. The implementation of the method does not require separate prior modeling on the submodels, since the corresponding parameters for the submodels are defined as the Kullback-Leibler projections of the full model param...
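A common building block when comparing an encompassing model with a submodel is the closed-form KL divergence between parametric densities. As an illustrative sketch (the paper's projection setup is more general), the univariate Gaussian case:

```python
import math

# KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) ) in closed form:
#   log(sigma2/sigma1) + (sigma1^2 + (mu1 - mu2)^2) / (2 sigma2^2) - 1/2
def kl_gaussians(mu1, sigma1, mu2, sigma2):
    return (math.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2)
            - 0.5)
```

For identical parameters the divergence is zero, and shifting one mean by one standard deviation gives 1/2 — a useful sanity check when ranking submodels by their distance to the full model.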
In most Information Retrieval (IR) applications, the Euclidean distance is used for similarity measurement. It is adequate in many cases, but this metric is not very accurate when different local data distributions exist in the database. We propose a Gaussian mixture distance for performing accurate nearest-neighbor search in Information Retrieval (IR). Under an established Gaus...
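The snippet does not define the metric precisely, so the following is only a plausible sketch of the idea: weight the Mahalanobis distance under each fitted mixture component by that component's posterior responsibility for the query point, instead of using one global Euclidean metric. The function name and weighting scheme are assumptions, not the paper's definition.

```python
import numpy as np

def gaussian_mixture_distance(x, y, means, covs, weights):
    # Posterior responsibility of each Gaussian component for the query x.
    dens = np.array([
        w * np.exp(-0.5 * (x - m) @ np.linalg.inv(c) @ (x - m))
        / np.sqrt(np.linalg.det(2 * np.pi * c))
        for m, c, w in zip(means, covs, weights)
    ])
    resp = dens / dens.sum()
    # Responsibility-weighted squared Mahalanobis distance between x and y.
    d2 = sum(r * (x - y) @ np.linalg.inv(c) @ (x - y)
             for r, c in zip(resp, covs))
    return np.sqrt(d2)

# With identity covariances every component's Mahalanobis distance is
# Euclidean, so the mixture distance reduces to the Euclidean distance.
x = np.array([0.0, 0.0])
y = np.array([1.0, 1.0])
means = [np.zeros(2), np.ones(2)]
covs = [np.eye(2), np.eye(2)]
weights = [0.5, 0.5]
d = gaussian_mixture_distance(x, y, means, covs, weights)
```

The identity-covariance case is a handy check: any reasonable local-metric scheme should degrade gracefully to the global Euclidean distance when the local distributions are isotropic and identical.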
The extension of classical analysis to time series data is a basic problem faced in many fields, such as engineering, economics and medicine. The main objective of discriminant time series analysis is to examine how far it is possible to distinguish between various groups. There are two situations to be considered in linear time series models. Firstly, when the main discriminatory informati...