Kernel Methods for Measuring Independence
Authors
Abstract
We introduce two new functionals, the constrained covariance and the kernel mutual information, to measure the degree of independence of random variables. These quantities are both based on the covariance between functions of the random variables in reproducing kernel Hilbert spaces (RKHSs). We prove that when the RKHSs are universal, both functionals are zero if and only if the random variables are pairwise independent. We also show that the kernel mutual information is an upper bound near independence on the Parzen window estimate of the mutual information. Analogous results apply for two correlation-based dependence functionals introduced earlier: we show the kernel canonical correlation and the kernel generalised variance to be independence measures for universal kernels, and prove the latter to be an upper bound on the mutual information near independence. The performance of the kernel dependence functionals in measuring independence is verified in the context of independent component analysis.
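The constrained covariance (COCO) described above maximises the covariance between smooth functions of the two variables over unit balls in the RKHSs. As a hedged illustration only (not code from the paper), the following sketch computes a common empirical estimate of COCO, (1/m)·sqrt(λ_max(K̃L̃)), where K̃ and L̃ are centred Gram matrices; the Gaussian kernel and its bandwidth here are assumptions for the example.

```python
import numpy as np

def rbf_gram(x, sigma=1.0):
    # Gram matrix of a Gaussian RBF kernel for 1-D samples x (assumed kernel)
    d = x[:, None] - x[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2))

def coco(x, y, sigma=1.0):
    """Empirical constrained covariance: (1/m) * sqrt(lambda_max(Kt @ Lt)),
    with Kt, Lt the centred Gram matrices of x and y (illustrative sketch)."""
    m = len(x)
    H = np.eye(m) - np.ones((m, m)) / m      # centring matrix
    Kt = H @ rbf_gram(x, sigma) @ H
    Lt = H @ rbf_gram(y, sigma) @ H
    # Kt @ Lt has real, non-negative eigenvalues (similar to a PSD matrix)
    lam_max = np.max(np.real(np.linalg.eigvals(Kt @ Lt)))
    return np.sqrt(max(lam_max, 0.0)) / m

rng = np.random.default_rng(0)
x = rng.standard_normal(200)
y_dep = x + 0.1 * rng.standard_normal(200)   # strongly dependent pair
y_ind = rng.standard_normal(200)             # independent pair
print(coco(x, y_dep), coco(x, y_ind))
```

On dependent data the statistic is markedly larger than on independent data, which is the behaviour an ICA contrast function built on COCO exploits.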
Similar Resources
Geometric Analysis of Hilbert-Schmidt Independence Criterion Based ICA Contrast Function
Since the success of Independent Component Analysis (ICA) for solving the Blind Source Separation (BSS) problem [1, 2], ICA has received considerable attention in numerous areas, such as signal processing, statistical modeling, and unsupervised learning. The performance of ICA algorithms depends significantly on the choice of the contrast function measuring statistical independence of signals a...
Measuring Statistical Dependence with Hilbert-Schmidt Norms
We propose an independence criterion based on the eigenspectrum of covariance operators in reproducing kernel Hilbert spaces (RKHSs), consisting of an empirical estimate of the Hilbert-Schmidt norm of the cross-covariance operator (we term this a Hilbert-Schmidt Independence Criterion, or HSIC). This approach has several advantages, compared with previous kernel-based independence criteria. Fir...
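The HSIC estimate sketched in this abstract has a well-known closed form: the biased empirical estimator is (1/m²)·tr(KHLH), where K and L are Gram matrices and H is the centring matrix. The snippet below is a minimal sketch of that estimator; the Gaussian kernel and bandwidth choice are assumptions, not prescribed by the abstract.

```python
import numpy as np

def rbf_gram(x, sigma=1.0):
    # Gram matrix of a Gaussian RBF kernel for 1-D samples x (assumed kernel)
    d = x[:, None] - x[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC: (1/m^2) * trace(K @ H @ L @ H)."""
    m = len(x)
    H = np.eye(m) - np.ones((m, m)) / m      # centring matrix
    K = rbf_gram(x, sigma)
    L = rbf_gram(y, sigma)
    return np.trace(K @ H @ L @ H) / m**2

rng = np.random.default_rng(1)
x = rng.standard_normal(200)
y_dep = x + 0.1 * rng.standard_normal(200)   # strongly dependent pair
y_ind = rng.standard_normal(200)             # independent pair
print(hsic(x, y_dep), hsic(x, y_ind))
```

Because tr(KHLH) equals the trace of a product of two centred PSD matrices, the estimate is non-negative, and it is near zero for independent samples.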
متن کاملA Data-Derived Quadratic Independence Measure for Adaptive Blind Source Recovery in Practical Applications
We present a novel performance index to measure the statistical independence of data sequences and apply it to the framework of blind source recovery (BSR), which includes blind source separation, deconvolution, and equalization. This performance index is capable of measuring the mutual independence of data sequences directly from the data. This information-theoretic Quadratic Independence Measur...
Fast Kernel ICA using an Approximate Newton Method
Recent approaches to independent component analysis (ICA) have used kernel independence measures to obtain very good performance, particularly where classical methods experience difficulty (for instance, sources with near-zero kurtosis). We present fast kernel ICA (FastKICA), a novel optimisation technique for one such kernel independence measure, the Hilbert-Schmidt independence criterion (HSI...
Is the Optimism in Optimistic Concurrency Warranted?
Optimistic synchronization allows concurrent execution of critical sections while performing dynamic conflict detection and recovery. Optimistic synchronization will increase performance only if critical regions are data independent—concurrent critical sections access disjoint data most of the time. Optimistic synchronization primitives, such as transactional memory, will improve the performanc...
Journal: Journal of Machine Learning Research
Volume 6, Issue: -
Pages: -
Publication date: 2005