"Weighting for more": Enhancing characteristic-function based ICA with asymptotically optimal weighting

Authors

  • Alon Slapak
  • Arie Yeredor
Abstract

The CHaracteristic-function-Enabled Source Separation (CHESS) method for independent component analysis (ICA) is based on approximate joint diagonalization (AJD) of Hessians of the observations’ empirical log-characteristic function, taken at selected off-origin “processing points”. As previously observed in other contexts, the AJD performance can be significantly improved by optimal weighting, using the inverse of the covariance matrix of all of the off-diagonal terms of the target matrices. Fortunately, this apparently cumbersome weighting scheme takes a convenient form under the assumption that the mixture is already “nearly separated”, e.g., following some initial separation. We derive covariance expressions for the sample-Hessian matrices, and show that under the near-separation assumption the weight matrix takes a nearly block-diagonal form, conveniently exploited by the recently proposed WEighted Diagonalization using Gauss itErations (WEDGE) algorithm for weighted AJD. Using our expressions, the weight matrix can be estimated directly from the data, leading to our WeIghTed CHESS (WITCHESS) algorithm. Simulation results demonstrate how WITCHESS can lead to significant performance improvement, not only over unweighted CHESS, but also over other ICA methods.
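
Since the abstract only outlines how the AJD target matrices are formed, the following NumPy sketch illustrates the evaluation of one Hessian of the empirical log-characteristic function at an off-origin processing point. It is a minimal sketch under the standard definition of the empirical characteristic function; the function name ecf_log_hessian, the samples-in-rows data layout, and the toy mixture and processing points are assumptions made here for illustration, not the authors' implementation.

```python
import numpy as np

def ecf_log_hessian(X, tau):
    # Hessian (w.r.t. tau) of the empirical log-characteristic function
    #   psi(tau) = log( (1/T) * sum_t exp(j * tau^T x_t) )
    # for data X of shape (T samples, K sensors), evaluated at processing point tau (K,).
    w = np.exp(1j * (X @ tau))                      # exp(j * tau^T x_t), one value per sample
    phi = w.mean()                                  # empirical characteristic function at tau
    grad = 1j * (X * w[:, None]).mean(axis=0)       # gradient of phi at tau
    hess = -(X[:, :, None] * X[:, None, :] * w[:, None, None]).mean(axis=0)  # Hessian of phi
    return hess / phi - np.outer(grad, grad) / phi ** 2  # Hessian of log(phi)

# Illustrative use: build target matrices for (weighted) AJD from a few processing points.
rng = np.random.default_rng(0)
X = rng.laplace(size=(10_000, 3)) @ rng.standard_normal((3, 3))  # toy non-Gaussian mixture
taus = 0.3 * rng.standard_normal((5, 3))                         # off-origin processing points
targets = [ecf_log_hessian(X, tau) for tau in taus]
```

In CHESS, a set of such matrices (often alongside the observation covariance, which equals the Hessian at the origin up to sign) serves as the target set for the AJD step; in WITCHESS, the off-diagonal residuals of these targets are additionally weighted using the inverse-covariance expressions derived in the paper, in the nearly block-diagonal form exploited by WEDGE.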

Related articles

On weighted local fitting and its relation to the Horvitz-Thompson estimator

Weighting is a widely used concept in many fields of statistics and has frequently caused controversy over its justification and benefit. In this paper, we analyze a weighted version of the well-known local polynomial regression estimators, derive their asymptotic bias and variance, and find that the conflict between the asymptotically optimal weighting scheme and the practical requirements has...


Threading without optimizing weighting factors for scoring function.

Optimizing weighting factors for a linear combination of terms in a scoring function is a crucial step for success in developing a threading algorithm. Usually weighting factors are optimized to yield the highest success rate on a training dataset, and the determined constant values for the weighting factors are used for any target sequence. Here we explore completely different approaches to ha...


A Novel Weighting Method Using Fuzzy Set Theory for Spatial Adaptive Patch-based Image Denoising

Based on fuzzy set theory, this paper proposes a novel weighting method for a spatially adaptive patch-based image denoising algorithm, which can be considered an extension of nonlocal means filtering. First, a fuzzy clustering algorithm for weighting data points is applied to reduce the estimation bias that arises from unrelated points. The weighting function is determined by optimal fuzzy pa...


Sequential Weighting Algorithms for Multi-Alphabet Sources

The sequential Context Tree Weighting procedure [6] achieves the asymptotically optimal redundancy behavior of k/2 · (log n)/n, where k is the number of free parameters of the source and n is the sequence length. For FSMX sources with an alphabet A, this number k is (|A| − 1) · |Ka|. However, FSMX sources in particular often use only a few of the possible letters in each state. It would be nice if...


Journal:
  • Signal Processing

Volume: 91, Issue: -

Pages: -

Year of publication: 2011