Search results for: leibler distance

Number of results: 244184

Journal: Adv. Data Analysis and Classification, 2010
Alfio Marazzi, Victor J. Yohai

Optimal robust M-estimates of a multidimensional parameter are described using Hampel’s infinitesimal approach. The optimal estimates are derived by minimizing a measure of efficiency under the model, subject to a bounded measure of infinitesimal robustness. For this purpose we define measures of efficiency and infinitesimal sensitivity based on the Hellinger distance. We show that these two mea...
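The Hellinger distance mentioned above has a simple closed form for discrete distributions, H(p, q) = (1/√2)·√Σ(√pᵢ − √qᵢ)². A minimal sketch (the probability vectors are made-up illustrative values, not taken from the paper):

```python
import math

def hellinger(p, q):
    """Hellinger distance between two discrete probability vectors.

    H(p, q) = (1 / sqrt(2)) * sqrt(sum_i (sqrt(p_i) - sqrt(q_i))**2)
    """
    return math.sqrt(
        sum((math.sqrt(a) - math.sqrt(b)) ** 2 for a, b in zip(p, q))
    ) / math.sqrt(2)

# Hypothetical example distributions over three outcomes
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
d = hellinger(p, q)
```

Unlike the Kullback-Leibler divergence, the Hellinger distance is symmetric and bounded in [0, 1], which is one reason it is attractive as a basis for robustness measures.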

2002
Leigh J. Fitzgibbon, David L. Dowe, Lloyd Allison

This paper investigates the coding of change-points in the information-theoretic Minimum Message Length (MML) framework. Change-point coding regions affect model selection and parameter estimation in problems such as time series segmentation and decision trees. The Minimum Message Length (MML) and Minimum Description Length (MDL78) approaches to change-point problems have been shown to perform w...

2016
Pavel Myshkov, Simon Julier

This study explores the posterior predictive distributions obtained with various Bayesian inference methods for neural networks. The quality of the distributions is assessed both visually and quantitatively using Kullback–Leibler (KL) divergence, Kolmogorov–Smirnov (KS) distance and precision-recall scores. We perform the analysis using a synthetic dataset that allows for a more detailed examin...
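The KL divergence used above to compare distributions can be sketched for the discrete case as follows (the two example distributions are hypothetical, chosen only to show the computation):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between discrete distributions.

    D(p || q) = sum_i p_i * log(p_i / q_i), with the convention 0*log(0) = 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical distributions over three outcomes
p = [0.5, 0.3, 0.2]
q = [0.25, 0.25, 0.5]
d_pq = kl_divergence(p, q)
d_qp = kl_divergence(q, p)
```

Note that KL divergence is not a metric: it is asymmetric (D(p‖q) generally differs from D(q‖p)), which is why studies such as this one often report it alongside a true distance like Kolmogorov-Smirnov.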

2005
Ad Ridder, Reuven Rubinstein

This paper describes a new idea for finding the importance sampling density in rare-event simulations: the MinxEnt method (shorthand for minimum cross-entropy). Some preliminary results show that the method might be very promising.

1 The minxent program

Assume
• X = (X1, . . . , Xn) is a random vector (with values denoted by x);
• h is the joint density function of X;
• Sj(·) (j = 1, . . . , k) ...

Journal: :J. Optimization Theory and Applications 2012
Regina Sandra Burachik, C. Yalçin Kaya, Shoham Sabach

We devise a new generalized univariate Newton method for solving nonlinear equations, motivated by Bregman distances and proximal regularization of optimization problems. We prove quadratic convergence of the new method, a special instance of which is the classical Newton’s method. We illustrate the possible benefits of the new method over classical Newton’s method by means of test problems inv...

2004
Stephen G. Hall, James Mitchell

This paper brings together two important but hitherto largely unrelated areas of the forecasting literature, density forecasting and forecast combination. It proposes a simple data-driven approach to direct combination of density forecasts using optimal weights. These optimal weights are those weights that minimise the ‘distance’, as measured by the Kullback-Leibler Information Criterion, betwe...
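As a rough illustration of the idea (not the authors' procedure), one can combine two density forecasts linearly and pick the weight that maximizes the average log predictive density of the pool, which is equivalent to minimizing the KLIC against the true density up to a constant. All numbers below are hypothetical:

```python
import math

# Hypothetical forecast densities evaluated at four realized outcomes:
# p1[i] and p2[i] are the two forecasters' densities at outcome i.
p1 = [0.40, 0.35, 0.10, 0.25]
p2 = [0.10, 0.20, 0.45, 0.30]

def avg_log_score(w):
    """Average log predictive density of the linear pool w*p1 + (1-w)*p2."""
    return sum(math.log(w * a + (1 - w) * b) for a, b in zip(p1, p2)) / len(p1)

# Simple grid search over the combination weight; maximizing the average
# log score minimizes the KL 'distance' to the data-generating density.
best_w = max((i / 100 for i in range(101)), key=avg_log_score)
```

A grid search keeps the sketch transparent; in practice the weights for many forecasts would be found by constrained numerical optimization.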

2008
R. Fischer

The design of fusion diagnostics is essential for the physics program of future fusion devices. The goal is to maximize the information gain of a future experiment with respect to various constraints. A measure of information gain is the mutual information between the posterior and the prior distribution. The Kullback-Leibler distance is used as a utility function to calculate the expected info...

2008
Jan-Philip Bergeest, Florian Jäger

A major problem in magnetic resonance imaging (MRI) is the lack of a pulse-sequence-dependent standardized intensity scale like the Hounsfield units in computed tomography. This affects the post-processing of the acquired images since, in general, segmentation and registration methods depend on the observed image intensities. Different approaches dealing with this problem have recently been proposed. I...

2012
Gladys D. Cacsire Barriga, Francisco Louzada

In this paper we develop a Bayesian analysis for zero-inflated regression models based on the COM-Poisson distribution. Our approach is based on Markov chain Monte Carlo methods. We discuss model selection and develop case-deletion influence diagnostics for the joint posterior distribution based on the ψ-divergence, which has several divergence measures as particular cases, such as...

2000
Prasad A. Naik, Chih-Ling Tsai

We derive a new model selection criterion for single-index models, AICC , by minimizing the expected Kullback-Leibler distance between the true and candidate models. The proposed criterion selects not only relevant variables but also the smoothing parameter for an unknown link function. Thus, it is a general selection criterion that provides a unified approach to model selection across both para...

[Chart: number of search results per year]