Search results for: leibler distance

Number of results: 244184  

Journal: :Remote Sensing 2023

Under external load excitation, damage such as breathing cracks and bolt loosening causes the structural time-domain acceleration response to exhibit nonlinear features. To address this identification problem, this paper proposes an identification method based on the Kullback–Leibler (KL) distance of model residuals. First, the autoregressive (AR) order was selected using the autocorrelation function (ACF) and Akaike...
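The underlying idea can be sketched in a few lines (a minimal illustration only, not the authors' procedure; the AR order, the Gaussian fit of the residuals, and the synthetic signals below are all illustrative assumptions):

```python
import numpy as np

def ar_fit(x, p):
    """Least-squares fit of an AR(p) model; returns the p coefficients."""
    # Lagged design matrix: x[t] is approximated by sum_k a_k * x[t-k]
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def ar_residuals(x, coeffs):
    """One-step-ahead prediction residuals of a fitted AR model."""
    p = len(coeffs)
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    return x[p:] - X @ coeffs

def gaussian_kl(mu0, var0, mu1, var1):
    """KL distance D(N(mu0, var0) || N(mu1, var1))."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

# Hypothetical baseline and test acceleration records.
rng = np.random.default_rng(0)
baseline = rng.standard_normal(2000)
test = baseline + 0.3 * np.sign(baseline)   # crude stand-in for nonlinear damage

p = 4                                       # AR order (would be chosen via ACF / AIC)
coeffs = ar_fit(baseline, p)
r0 = ar_residuals(baseline, coeffs)         # residuals of the healthy model
r1 = ar_residuals(test, coeffs)             # residuals when damage is present

kl = gaussian_kl(r0.mean(), r0.var(), r1.mean(), r1.var())
print(f"KL distance between residual distributions: {kl:.4f}")
```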

2010
Pang Du Shuangge Ma

Frailty has been introduced as a group-wise random effect to describe the within-group dependence for correlated survival data. In this article, we propose a penalized joint likelihood method for nonparametric estimation of the hazard function. With the proposed method, the frailty variance component and the smoothing parameters become the tuning parameters that are selected to minimize a loss func...

2011
Jessica Kasza Patty Solomon

In this paper, we compare the performance of two methods for estimating Bayesian networks from data containing exogenous variables and random effects. The first method is fully Bayesian in which a prior distribution is placed on the exogenous variables, whereas the second method, which we call the residual approach, accounts for the effects of exogenous variables by using the notion of restrict...

Journal: :Applicable Analysis 2022

In this work, we apply the Bayesian approach to the acoustic scattering problem of reconstructing the shape of a sound-soft obstacle from limited-aperture far-field measurement data. A novel total variation prior scheme is developed for the parameterization. The prior is imposed on the Fourier coefficients of the parameterization rather than on the shape itself. Using this prior, some less smooth objects can be reconstructed. We also investigate the well-posedness ...

2016
Philip S. Thomas Bruno Castro da Silva Christoph Dann Emma Brunskill

We propose a new class of algorithms for minimizing or maximizing functions of parametric probabilistic models. These new algorithms are natural gradient algorithms that leverage more information than prior methods by using a new metric tensor in place of the commonly used Fisher information matrix. This new metric tensor is derived by computing directions of steepest ascent where the distance ...
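The general shape of such updates is the natural-gradient step θ ← θ − α G(θ)⁻¹ ∇f(θ), where G(θ) is a metric tensor such as the Fisher information matrix. Below is a minimal sketch with a made-up quadratic objective and metric; the specific metric tensor proposed in the paper is not reproduced here:

```python
import numpy as np

def natural_gradient_descent(grad_f, metric, theta0, lr=0.1, steps=100):
    """Generic natural-gradient descent: theta <- theta - lr * G(theta)^{-1} grad f(theta).

    grad_f : callable returning the ordinary gradient at theta
    metric : callable returning the metric tensor G(theta) (e.g. Fisher information)
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(steps):
        g = grad_f(theta)
        G = metric(theta)
        theta = theta - lr * np.linalg.solve(G, g)   # steepest descent in the metric G
    return theta

# Toy example (purely illustrative): minimize f(theta) = 0.5 * theta^T A theta.
A = np.array([[10.0, 0.0], [0.0, 1.0]])
grad_f = lambda theta: A @ theta
# Taking G = A rescales the badly conditioned direction, unlike plain gradient descent.
metric = lambda theta: A

print(natural_gradient_descent(grad_f, metric, theta0=[1.0, 1.0]))
```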

2007
Patrick Marsh Peter Phillips Robert Taylor

This paper details the differential and numeric properties of two measures of entropy, Shannon entropy and the Kullback-Leibler distance, applicable to the unit root hypothesis. It is found that they are differentiable functions of the degree of trending in any included deterministic component and of the correlation of the underlying innovations. Moreover, Shannon entropy is concave in these, and ...

Journal: :Physical review. E, Statistical, nonlinear, and soft matter physics 2010
Saar Rahav Shaul Mukamel

By subjecting a dynamical system to a series of short pulses and varying several time delays, we can obtain multidimensional characteristic measures of the system. Multidimensional Kullback-Leibler response functions (KLRFs), which are based on the Kullback-Leibler distance between the initial and final states, are defined. We compare the KLRFs, which are nonlinear in the probability density, with...

2008
Pedro Miguel Correia Guerreiro

We propose new algorithms for computing linear discriminants to perform data dimensionality reduction from R^n to R^p, with p < n. We propose alternatives to the classical Fisher's distance criterion; namely, we investigate new criteria based on the Chernoff distance, the J-divergence, and the Kullback-Leibler divergence. The optimization problems that emerge from using these alternative criteria are non-c...
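To illustrate the kind of criterion involved, the sketch below scores a single projection direction by the symmetrized KL divergence between the projected class-conditional distributions, assuming Gaussian classes. The class statistics are hypothetical, and this is only a one-dimensional evaluation of a KL-type criterion, not the authors' algorithms for projecting to R^p:

```python
import numpy as np

def kl_1d_gaussian(mu0, var0, mu1, var1):
    """KL divergence between two univariate Gaussians."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def kl_criterion(w, mu1, S1, mu2, S2):
    """Symmetrized KL divergence between the two classes after projection onto w."""
    w = w / np.linalg.norm(w)
    m1, v1 = w @ mu1, w @ S1 @ w
    m2, v2 = w @ mu2, w @ S2 @ w
    return kl_1d_gaussian(m1, v1, m2, v2) + kl_1d_gaussian(m2, v2, m1, v1)

# Hypothetical class means and covariances.
mu1, S1 = np.array([0.0, 0.0]), np.array([[1.0, 0.3], [0.3, 1.0]])
mu2, S2 = np.array([2.0, 1.0]), np.array([[1.5, -0.2], [-0.2, 0.8]])

# Compare the classical Fisher direction against a few random directions.
fisher_w = np.linalg.solve(S1 + S2, mu2 - mu1)
print("Fisher direction score:", kl_criterion(fisher_w, mu1, S1, mu2, S2))
for w in np.random.default_rng(1).standard_normal((3, 2)):
    print("random direction score:", kl_criterion(w, mu1, S1, mu2, S2))
```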

Journal: :Foundations and Trends in Communications and Information Theory 2004
Imre Csiszár Paul C. Shields

This tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting. The information measure known as information divergence or Kullback-Leibler distance or relative entropy plays a key role, often with a geometric flavor as an analogue of squared Euclidean distance, as in the concepts of I-projection, I-radius and I-centroid. The topics cov...
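In the finite-alphabet setting the central quantity D(P‖Q) = Σₓ P(x) log(P(x)/Q(x)) is easy to state concretely; a minimal sketch, with arbitrary example distributions:

```python
import numpy as np

def relative_entropy(p, q):
    """Information divergence (KL distance) D(P || Q) for finite-alphabet distributions.

    Uses the convention 0 * log(0/q) = 0; the divergence is infinite
    if p(x) > 0 for some x with q(x) = 0.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    if np.any(q[mask] == 0):
        return np.inf
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.25, 0.25]
q = [1 / 3, 1 / 3, 1 / 3]
print(relative_entropy(p, q))   # D(P || Q)
print(relative_entropy(q, p))   # not symmetric, hence only an analogue of squared distance
```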

2017
Herbert Edelsbrunner Hubert Wagner

Given a finite set in a metric space, the topological analysis generalizes hierarchical clustering using a 1-parameter family of homology groups to quantify connectivity in all dimensions. Going beyond Euclidean distance and really beyond metrics, we show that the tools of topological data analysis also apply when we measure distance with Bregman divergences. While these divergences violate two...
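A Bregman divergence is generated by a strictly convex function F via D_F(x, y) = F(x) − F(y) − ⟨∇F(y), x − y⟩. The sketch below uses the two standard generators, half the squared norm and negative entropy, to recover squared Euclidean distance and the Kullback-Leibler distance; these generator choices are the textbook ones, not anything specific to this paper:

```python
import numpy as np

def bregman(F, grad_F, x, y):
    """Bregman divergence D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return F(x) - F(y) - np.dot(grad_F(y), x - y)

# Generator 1: F(x) = 0.5 * ||x||^2  ->  squared Euclidean distance.
sq = lambda x: 0.5 * np.dot(x, x)
grad_sq = lambda x: x

# Generator 2: negative entropy F(p) = sum p log p  ->  KL distance for probability vectors.
negent = lambda p: np.sum(p * np.log(p))
grad_negent = lambda p: np.log(p) + 1.0

x = np.array([0.2, 0.5, 0.3])
y = np.array([0.4, 0.4, 0.2])
print(bregman(sq, grad_sq, x, y))          # equals 0.5 * ||x - y||^2
print(bregman(negent, grad_negent, x, y))  # equals sum x_i log(x_i / y_i) since both sum to 1
```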
