Abstract
Algorithms for the blind separation of sources can be derived from several different principles. This article shows that the infomax (information-maximization) principle is equivalent to the maximum likelihood approach. Applied to source separation, the infomax principle consists of maximizing the entropy of the outputs obtained by passing the separated signals through componentwise nonlinearities.
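To make the connection concrete, the sketch below implements one common form of this idea: a natural-gradient ascent on the output entropy, which coincides with the maximum likelihood update when the assumed source density has score function tanh. This is a minimal illustration, not the paper's own algorithm; the function name `infomax_separate`, the tanh nonlinearity, the learning rate, and the iteration count are assumptions made for the example.

```python
import numpy as np


def infomax_separate(X, n_iter=500, lr=0.01, seed=0):
    """Minimal infomax / maximum-likelihood sketch for blind source separation.

    X is an (n_sources, n_samples) array of zero-mean mixtures.
    Returns an unmixing matrix W such that W @ X approximates the sources
    (up to permutation and scaling).
    """
    rng = np.random.default_rng(seed)
    n, T = X.shape
    W = np.eye(n) + 0.01 * rng.standard_normal((n, n))

    for _ in range(n_iter):
        U = W @ X              # current source estimates
        Y = np.tanh(U)         # componentwise nonlinearity / score function
        # Natural-gradient ascent on the output entropy, equivalently on the
        # log-likelihood under a super-Gaussian (sech-like) source model:
        #   dW ∝ (I - E[tanh(u) u^T]) W
        W += lr * (np.eye(n) - (Y @ U.T) / T) @ W
    return W


# Toy usage: two super-Gaussian (Laplacian) sources mixed by a random matrix.
rng = np.random.default_rng(1)
S = rng.laplace(size=(2, 5000))
A = rng.standard_normal((2, 2))
X = A @ S
W = infomax_separate(X)
S_hat = W @ X  # recovered sources, up to permutation and scaling
```

The right-multiplication by W gives the relative (natural) gradient form of the update, which makes the algorithm's behavior essentially independent of the particular mixing matrix.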
Publication Info
- Year: 1997
- Type: article
- Volume: 4
- Issue: 4
- Pages: 112-114
- Citations: 675
- Access: Closed
Identifiers
- DOI: 10.1109/97.566704