Search results for: kullback
Number of results: 7189
In this article, a method based on a non-parametric estimation of the Kullback–Leibler divergence using a local feature space is proposed for synthetic aperture radar (SAR) image change detection. First, local features based on a set of Gabor filters are extracted from both pre- and post-event images. The distribution of these local features from a local neighbourhood is considered as a statistic...
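A minimal sketch of the general idea (not the authors' implementation): compare histograms of Gabor-filter responses over corresponding local windows of the two images with a symmetric Kullback-Leibler divergence. The filter bank, window size, and histogram binning below are all illustrative assumptions.

```python
import numpy as np
from skimage.filters import gabor  # Gabor filtering from scikit-image

def gabor_features(img, freqs=(0.1, 0.2), thetas=(0.0, np.pi / 2)):
    """Stack magnitude responses of a small (assumed) Gabor filter bank."""
    responses = []
    for f in freqs:
        for t in thetas:
            real, imag = gabor(img, frequency=f, theta=t)
            responses.append(np.hypot(real, imag))
    return np.stack(responses, axis=-1)

def local_change_score(feat_pre, feat_post, i, j, win=7, bins=16, eps=1e-10):
    """Symmetric KL divergence between feature histograms of two local windows."""
    h = win // 2
    a = feat_pre[i - h:i + h + 1, j - h:j + h + 1].ravel()
    b = feat_post[i - h:i + h + 1, j - h:j + h + 1].ravel()
    lo, hi = min(a.min(), b.min()), max(a.max(), b.max())
    p, _ = np.histogram(a, bins=bins, range=(lo, hi))
    q, _ = np.histogram(b, bins=bins, range=(lo, hi))
    p = p.astype(float) + eps
    q = q.astype(float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))
```

A high score flags windows whose local feature statistics differ between the pre- and post-event images.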
We show that the Kullback-Leibler distance is a good measure of the statistical uncertainty of correlation matrices estimated by using a finite set of data. For correlation matrices of multivariate Gaussian variables we analytically determine the expected values of the Kullback-Leibler distance of a sample correlation matrix from a reference model and we show that the expected values are known ...
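For reference, the closed-form divergence underlying such an analysis: for two zero-mean $n$-variate Gaussians with covariance (or correlation) matrices $\Sigma_1$ (sample) and $\Sigma_0$ (reference),

\[
D_{\mathrm{KL}}\big(\mathcal{N}(0,\Sigma_1)\,\big\|\,\mathcal{N}(0,\Sigma_0)\big)
= \frac{1}{2}\left[\operatorname{tr}\!\big(\Sigma_0^{-1}\Sigma_1\big) - n + \ln\frac{\det\Sigma_0}{\det\Sigma_1}\right].
\]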
We propose a method to measure real-valued time series irreversibility which combines two different tools: the horizontal visibility algorithm and the Kullback-Leibler divergence. This method maps a time series to a directed network according to a geometric criterion. The degree of irreversibility of the series is then estimated by the Kullback-Leibler divergence (i.e. the distinguishability) b...
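A minimal sketch of this pipeline, under the usual definitions (the function names are ours): build the directed horizontal visibility graph by orienting edges along the arrow of time, then take the KL divergence between the out- and in-degree distributions.

```python
import numpy as np

def hvg_degrees(x):
    """Directed horizontal visibility graph of series x: node i links to a later
    node j iff every value strictly between them lies below min(x[i], x[j])."""
    n = len(x)
    k_out = np.zeros(n, dtype=int)  # edges following the arrow of time
    k_in = np.zeros(n, dtype=int)
    for i in range(n - 1):
        top = -np.inf  # running max of values strictly between i and j
        for j in range(i + 1, n):
            if top < x[i] and top < x[j]:  # interior values below both endpoints
                k_out[i] += 1
                k_in[j] += 1
            top = max(top, x[j])
            if top >= x[i]:  # nothing beyond j is visible from i anymore
                break
    return k_out, k_in

def irreversibility(x, eps=1e-12):
    """KL divergence between out- and in-degree distributions of the directed
    HVG; near zero (up to finite-size effects) for reversible series."""
    k_out, k_in = hvg_degrees(np.asarray(x, dtype=float))
    kmax = int(max(k_out.max(), k_in.max()))
    p = np.bincount(k_out, minlength=kmax + 1) / len(x)
    q = np.bincount(k_in, minlength=kmax + 1) / len(x)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))
```

White noise should score near zero, while strongly nonlinear, dissipative dynamics should score higher.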
Estimation of the Kullback-Leibler amount of information is a crucial part of deriving a statistical model selection procedure based on the likelihood principle, such as AIC. To discriminate nested models, we have to estimate it up to the order of a constant, while the Kullback-Leibler information itself is of the order of the number of observations. A correction term employed in AIC is an example to...
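Concretely, in the familiar form, with $L(\hat\theta)$ the maximized likelihood and $k$ the number of estimated parameters,

\[
\mathrm{AIC} = -2\ln L(\hat\theta) + 2k,
\]

where the $2k$ term is the bias correction that makes the criterion an asymptotically unbiased estimator of the expected relative Kullback-Leibler discrepancy.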
Following the work of Hurvich, Shumway, and Tsai (1990), we propose an “improved” variant of the Akaike information criterion, AICi, for state-space model selection. The variant is based on Akaike’s (1973) objective of estimating the Kullback-Leibler information (Kullback 1968) between the densities corresponding to the fitted model and the generating or true model. The development of AICi proc...
We provide background information to allow a heuristic understanding of two types of criteria used in selecting a model for making inferences from ringing data. The first type of criteria (e.g., AIC, AICc, QAIC and TIC) are estimates of (relative) Kullback-Leibler information or distance and attempt to select a good approximating model for inference, based on the Principle of Parsimony. The seco...
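For reference, the standard small-sample and overdispersion-corrected forms (not quoted from this paper) are

\[
\mathrm{AIC}_c = \mathrm{AIC} + \frac{2k(k+1)}{n-k-1},
\qquad
\mathrm{QAIC} = -\frac{2\ln L(\hat\theta)}{\hat c} + 2k,
\]

with $n$ the sample size and $\hat c$ an estimated variance-inflation (overdispersion) factor.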
The Wasserstein probability metric has received much attention from the machine learning community. Unlike the Kullback-Leibler divergence, which strictly measures change in probability, the Wasserstein metric reflects the underlying geometry between outcomes. The value of being sensitive to this geometry has been demonstrated, among others, in ordinal regression and generative modelling. In th...
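A small illustration of that geometric sensitivity (our toy example, not from the paper): move a point mass along a line. The KL divergence between the two distributions is the same (infinite, since their supports are disjoint) no matter how far the mass moves, while the Wasserstein-1 distance grows with the displacement.

```python
import numpy as np
from scipy.stats import entropy, wasserstein_distance

support = np.arange(11)            # outcomes 0..10 on a line
p = np.zeros(11)
p[0] = 1.0                         # all mass at 0

for d in (1, 5, 10):
    q = np.zeros(11)
    q[d] = 1.0                     # all mass shifted to d
    kl = entropy(p, q)             # KL divergence: inf for every d (disjoint supports)
    w1 = wasserstein_distance(support, support, u_weights=p, v_weights=q)
    print(f"d={d:2d}  KL={kl}  W1={w1}")  # W1 equals d; KL is blind to d
```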