Differential learning algorithms for decorrelation and independent component analysis
Author
Abstract
Decorrelation and its higher-order generalization, independent component analysis (ICA), are fundamental tasks in unsupervised learning that have been studied mainly in the domain of Hebbian learning. In this paper we present a variation of natural gradient ICA, differential ICA, in which learning relies on the concurrent change of output variables. We interpret this differential learning as maximum likelihood estimation of parameters whose latent variables are described by a random walk model. In this framework we derive the differential ICA algorithm and, in addition, a differential decorrelation algorithm that is treated as a special instance of differential ICA. Algorithm derivations and a local stability analysis are given, together with numerical experimental results.
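To make the idea concrete, here is a minimal NumPy sketch of a natural-gradient ICA update driven by temporal differences of the outputs, which is one straightforward reading of "learning relies on the concurrent change of output variables": under a random walk model s(t) = s(t-1) + e(t) with i.i.d. innovations e(t), the latent increments are i.i.d., so maximum likelihood estimation on differenced data reduces to ordinary natural-gradient ICA applied to dy(t) = y(t) - y(t-1). The function and parameter names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def differential_ica_sketch(X, n_iter=200, lr=0.01, phi=np.tanh):
    """Illustrative sketch (not necessarily the paper's exact algorithm):
    a natural-gradient ICA update driven by temporal differences of the
    outputs, in the spirit of differential learning.

    X : array of shape (n_sources, n_samples), temporally ordered mixtures.
    """
    n = X.shape[0]
    W = np.eye(n)                        # demixing matrix, initialised at identity
    for _ in range(n_iter):
        Y = W @ X                        # current outputs y(t) = W x(t)
        dY = np.diff(Y, axis=1)          # concurrent change dy(t) = y(t) - y(t-1)
        T = dY.shape[1]
        # natural-gradient-style update with differenced outputs in place of Y:
        #   W <- W + lr * (I - E[phi(dy) dy^T]) W
        G = np.eye(n) - (phi(dY) @ dY.T) / T
        W = W + lr * G @ W
    return W
```

Replacing phi with the identity function turns the same update into one that merely decorrelates the differenced outputs, which is consistent with the abstract's remark that differential decorrelation arises as a special instance of differential ICA.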
Similar papers
Differential ICA
As an alternative to conventional Hebb-type unsupervised learning, differential learning has been studied in the domain of Hebb's rule [1] and decorrelation [2]. In this paper we present an ICA algorithm that employs differential learning, hence named differential ICA. We derive the differential ICA algorithm in the framework of maximum likelihood estimation and a random walk model. Algorithm der...
Undercomplete Blind Subspace Deconvolution
Here we introduce the blind subspace deconvolution (BSSD) problem, which extends both the blind source deconvolution (BSD) and the independent subspace analysis (ISA) tasks. We treat the undercomplete BSSD (uBSSD) case. Applying temporal concatenation, we reduce this problem to ISA. The associated ‘high-dimensional’ ISA problem can be handled by a recent technique called joint f-dec...
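As a concrete illustration of the temporal concatenation step, the sketch below stacks time-shifted copies of the observation vector so that a convolutive (subspace) mixture can be handed to an instantaneous ISA solver. The function name and the way the concatenation depth is chosen are assumptions for illustration, not details taken from the cited paper.

```python
import numpy as np

def temporally_concatenate(X, depth):
    """Stack `depth` time-shifted copies of the observations x(t) into one tall
    vector per time step, so that a convolutive mixture looks instantaneous to
    a downstream ISA solver. `depth` would typically be tied to the (assumed
    known) length of the mixing filter.

    X : array of shape (n_channels, n_samples)
    returns : array of shape (n_channels * depth, n_samples - depth + 1)
    """
    n_channels, n_samples = X.shape
    width = n_samples - depth + 1
    return np.vstack([X[:, d:d + width] for d in range(depth)])
```

The concatenated array can then be passed to any ISA routine; the price is the ‘high-dimensional’ ISA problem that the excerpt goes on to mention.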
One-unit Learning Rules for Independent Component Analysis
Neural one-unit learning rules for the problem of Independent Component Analysis (ICA) and blind source separation are introduced. In these new algorithms, every ICA neuron develops into a separator that finds one of the independent components. The learning rules use very simple constrained Hebbian/anti-Hebbian learning in which decorrelating feedback may be added. To speed up the convergence o...
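For illustration, a minimal sketch of what a one-unit rule of this general flavour can look like on whitened data: a simple constrained Hebbian update of a single weight vector, with an optional decorrelating (deflation) step against previously extracted units. This is an assumed stand-in written for this summary, not the exact rule proposed in the cited paper.

```python
import numpy as np

def one_unit_rule_sketch(Z, lr=0.1, n_iter=500, g=np.tanh, found=()):
    """One-unit ICA sketch on whitened data Z (shape: dim x n_samples).

    Update: w <- w + lr * E[ z * g(w^T z) ]   (Hebbian-style, nonlinearity g)
    followed by projecting out previously found directions (`found`) and
    renormalising w to unit length (the constraint).
    """
    dim = Z.shape[0]
    w = np.random.default_rng(0).standard_normal(dim)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        y = w @ Z                                # output of the single unit
        w = w + lr * (Z @ g(y)) / Z.shape[1]     # constrained Hebbian update
        for v in found:                          # decorrelating feedback:
            w = w - (w @ v) * v                  # stay orthogonal to earlier units
        w /= np.linalg.norm(w)                   # unit-norm constraint
    return w
```

Calling the routine repeatedly, each time passing the already-recovered vectors through `found`, extracts one estimated independent direction at a time.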
Blind Source Separation and Independent Component Analysis: A Review
Blind source separation (BSS) and independent component analysis (ICA) are generally based on a wide class of unsupervised learning algorithms and have found potential applications in many areas, from engineering to neuroscience. A recent trend in BSS is to consider problems in the framework of matrix factorization or, more generally, signal decomposition with probabilistic generative and tree stru...
Dependence, Correlation and Gaussianity in Independent Component Analysis
Independent component analysis (ICA) is the decomposition of a random vector into linear components which are “as independent as possible.” Here, “independence” should be understood in its strong statistical sense: it goes beyond (second-order) decorrelation and thus involves the non-Gaussianity of the data. The ideal measure of independence is the “mutual information” and is known to be related t...
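Because mutual information is hard to estimate directly, simple higher-order statistics are often used as practical proxies for the non-Gaussianity mentioned here. The toy snippet below (assumed data, not from the paper) shows why decorrelation alone is not enough: after whitening, the covariance is the identity, yet a fourth-order contrast such as excess kurtosis still carries the information needed to identify the independent directions.

```python
import numpy as np

def excess_kurtosis(y):
    """Excess kurtosis of a 1-D signal: a simple (if non-robust) proxy for the
    non-Gaussianity that ICA exploits beyond second-order decorrelation."""
    y = y - y.mean()
    return np.mean(y ** 4) / np.mean(y ** 2) ** 2 - 3.0

rng = np.random.default_rng(0)
S = rng.uniform(-1.0, 1.0, size=(2, 10_000))      # two independent uniform sources
X = np.array([[1.0, 0.5], [0.3, 1.0]]) @ S        # instantaneous linear mixture

# Whitening: decorrelates the mixtures, but only determines the sources up to
# an unknown rotation, so second-order statistics alone cannot finish the job.
d, E = np.linalg.eigh(np.cov(X))
Z = np.diag(d ** -0.5) @ E.T @ (X - X.mean(axis=1, keepdims=True))

print(np.round(np.cov(Z), 3))                     # ~identity: fully decorrelated
print([round(excess_kurtosis(z), 3) for z in Z])  # non-zero: higher-order structure remains
```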
Journal: Neural Networks: The Official Journal of the International Neural Network Society
Volume 19, Issue 10
Pages: -
Publication date: 2006