Search results for: kullback leibler distance

Number of results: 244274

Journal: :SIAM Journal on Scientific Computing 2021

We propose to compute a sparse approximate inverse Cholesky factor $L$ of a dense covariance matrix $\Theta$ by minimizing the Kullback--Leibler divergence between the Gaussian distributions $\mathcal{N}(0, \Theta)$ and $\mathcal{N}(0, L^{-\top} L^{-1})$, subject to a sparsity constraint. Surprisingly, this problem has a closed-form solution that can be computed efficiently, recovering the popular Vecchia approximation in spat...
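
The objective above relies on the closed-form KL divergence between two zero-mean Gaussians, $\mathrm{KL}(\mathcal{N}(0,\Sigma_0)\,\|\,\mathcal{N}(0,\Sigma_1)) = \tfrac{1}{2}(\operatorname{tr}(\Sigma_1^{-1}\Sigma_0) - d + \log\det\Sigma_1 - \log\det\Sigma_0)$. A minimal sketch of that formula (not the paper's sparse algorithm; the matrix `theta` is placeholder data):

```python
import numpy as np

def kl_gaussian_zero_mean(sigma0, sigma1):
    """KL( N(0, sigma0) || N(0, sigma1) ) via the closed-form Gaussian formula."""
    d = sigma0.shape[0]
    trace_term = np.trace(np.linalg.inv(sigma1) @ sigma0)
    _, logdet0 = np.linalg.slogdet(sigma0)
    _, logdet1 = np.linalg.slogdet(sigma1)
    return 0.5 * (trace_term - d + logdet1 - logdet0)

# Placeholder dense SPD covariance standing in for Theta.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
theta = A @ A.T + 4 * np.eye(4)

# With the exact inverse Cholesky factor (theta^{-1} = L L^T), the
# approximating covariance L^{-T} L^{-1} equals theta and the KL term vanishes.
L = np.linalg.cholesky(np.linalg.inv(theta))
approx_cov = np.linalg.inv(L @ L.T)
divergence = kl_gaussian_zero_mean(theta, approx_cov)  # ~0 up to rounding
```

The paper's contribution is minimizing this quantity over *sparse* lower-triangular $L$; the sketch only verifies the divergence being minimized.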

2010
Pang Du Shuangge Ma

Frailty has been introduced as a group-wise random effect to describe the within-group dependence for correlated survival data. In this article, we propose a penalized joint likelihood method for nonparametric estimation of the hazard function. With the proposed method, the frailty variance component and the smoothing parameters become tuning parameters that are selected to minimize a loss func...

2011
Jessica Kasza Patty Solomon

In this paper, we compare the performance of two methods for estimating Bayesian networks from data containing exogenous variables and random effects. The first method is fully Bayesian in which a prior distribution is placed on the exogenous variables, whereas the second method, which we call the residual approach, accounts for the effects of exogenous variables by using the notion of restrict...

Journal: :J. Global Optimization 2009
Orizon Pereira Ferreira P. Roberto Oliveira R. C. M. Silva

The convergence of the primal and dual central paths associated with entropy and exponential functions, respectively, for semidefinite programming problems is studied in this paper. As an application, the proximal point method with the Kullback-Leibler distance applied to semidefinite programming problems is considered, and the convergence of the primal and dual sequences is proved.

2002
Nasir Rajpoot

This paper addresses the issue of selecting features from a given wavelet packet subband decomposition that are most useful for texture classification in an image. A functional measure based on the Kullback-Leibler distance is proposed as a way to select the most discriminant subbands. Experimental results show a superior performance in terms of classification error rates.
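
The idea of scoring subbands by class separation can be sketched as follows. This is a minimal illustration, not the paper's exact functional measure: synthetic gamma samples stand in for wavelet packet coefficients, and a symmetrized discrete KL between class-conditional histograms serves as the discriminative score per subband.

```python
import numpy as np

def kl_discrete(p, q, eps=1e-12):
    """Discrete KL( p || q ) over histogram bins, with eps to avoid log(0)."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

rng = np.random.default_rng(1)
n_subbands = 5
scores = []
for s in range(n_subbands):
    # Class-conditional histograms of coefficient magnitudes for two textures
    # (placeholder gamma draws; class separation grows with subband index s).
    a = np.histogram(rng.gamma(2.0 + s, 1.0, 500), bins=20, range=(0, 30))[0].astype(float)
    b = np.histogram(rng.gamma(2.0, 1.0, 500), bins=20, range=(0, 30))[0].astype(float)
    # Symmetrized KL as the discriminative score for subband s.
    scores.append(kl_discrete(a, b) + kl_discrete(b, a))

best = int(np.argmax(scores))  # keep the subband with the largest score
```

In practice the histograms would come from actual wavelet packet coefficients of labeled texture patches, and the top-scoring subbands would feed the classifier.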

Journal: :Journal of Machine Learning Research 2015
Vladimir Nikulin

In this paper we formulate in general terms an approach to proving strong consistency of the Empirical Risk Minimisation inductive principle applied to prototype- or distance-based clustering. This approach was motivated by the Divisive Information-Theoretic Feature Clustering model in a probabilistic space with the Kullback-Leibler divergence, which may be regarded as a special case within the Clus...

Journal: :Hacettepe journal of mathematics and statistics 2022

The inequality containing the Csiszár divergence on time scales is generalized for 2n-convex functions by using the Lidstone interpolating polynomial. As an application, new entropic bounds are also computed. Several inequalities in quantum calculus and h-discrete calculus are established. The relationship of the Shannon entropy, the Kullback-Leibler divergence, and the Jeffreys distance with the Zipf-Mandelbrot entropy is also considered.

2010
S. Tahmasebi J. Behboodian

In the present paper, Shannon's entropy for concomitants of generalized order statistics in the FGM family is obtained. Applications of this result are given for order statistics, record values, k-record values, and progressive type II censored order statistics. Also, we show that the Kullback-Leibler distance among the concomitants of generalized order statistics is distribution-free.

Journal: :Signal Processing 2007
Sinan Sinanovic Don H. Johnson

Information processing theory endeavors to quantify how well signals encode information and how well systems, by acting on signals, process information. We use information-theoretic distance measures, the Kullback-Leibler distance in particular, to quantify how well signals represent information. The ratio of distances between a system’s output and input quantifies the system’s information proc...
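
The ratio of output to input KL distances described above can be illustrated with a toy Gaussian channel. This is a hedged sketch under assumed names and parameters (a gain stage `y = g*x + n` with Gaussian noise), not the authors' framework; for Gaussians the ratio has a simple closed form and, by the data processing inequality, cannot exceed 1.

```python
import math

def kl_gauss_1d(mu0, var0, mu1, var1):
    """KL( N(mu0, var0) || N(mu1, var1) ) in closed form."""
    return 0.5 * (math.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

# Two input signals modeled as Gaussians differing only in mean.
mu_a, mu_b, var_in = 0.0, 1.0, 1.0
d_in = kl_gauss_1d(mu_a, var_in, mu_b, var_in)

# A noisy gain stage y = g*x + n, with n ~ N(0, var_n): outputs stay Gaussian.
g, var_n = 2.0, 1.0
var_out = g ** 2 * var_in + var_n
d_out = kl_gauss_1d(g * mu_a, var_out, g * mu_b, var_out)

# Information transfer ratio: here g^2*var_in / (g^2*var_in + var_n) = 4/5.
gamma = d_out / d_in
```

A ratio near 1 means the stage preserves nearly all of the distinction between the two inputs; added noise pushes it toward 0.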
