Independent Components Analysis by Direct Entropy Minimization
Erik G. Miller and John W. Fisher III
EECS Department, University of California, Berkeley
Technical Report No. UCB/CSD-03-1221
January 2003
http://www2.eecs.berkeley.edu/Pubs/TechRpts/2003/CSD-03-1221.pdf
This paper presents a new algorithm for the independent components analysis (ICA) problem based on efficient entropy estimates. Like many previous methods, this algorithm directly minimizes the measure of departure from independence according to the estimated Kullback-Leibler divergence between the joint distribution and the product of the marginal distributions. We pair this approach with efficient entropy estimators from the statistics literature. In particular, the entropy estimator we use is consistent and exhibits rapid convergence. The algorithm based on this estimator is simple, computationally efficient, intuitively appealing, and outperforms other well known algorithms. In addition, the estimator's relative insensitivity to outliers translates into superior performance by our ICA algorithm on outlier tests. We present favorable comparisons to the Kernel ICA, FAST-ICA, JADE, and extended Infomax algorithms in extensive simulations.
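The core idea described in the abstract is to whiten the mixed signals and then search for the rotation whose outputs have the smallest sum of marginal entropies, with each marginal entropy estimated by a spacing-based estimator. The following is a minimal sketch of that idea for the two-dimensional case, not the authors' implementation: the Vasicek-style m-spacing estimator, the exhaustive rotation search, and names such as m_spacing_entropy and n_angles are illustrative assumptions, not details taken from the report.

import numpy as np

def m_spacing_entropy(x, m=None):
    """Vasicek-style m-spacing estimate of differential entropy (illustrative)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = int(np.sqrt(n))  # common heuristic choice of spacing parameter
    spacings = np.maximum(x[m:] - x[:-m], 1e-12)  # guard against ties
    return np.mean(np.log((n + 1) / m * spacings))

def whiten(X):
    """Center and whiten the mixture matrix X (rows are observed signals)."""
    Xc = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(Xc))
    return (E @ np.diag(1.0 / np.sqrt(d)) @ E.T) @ Xc

def ica_2d_entropy_min(X, n_angles=180):
    """2-D ICA by direct entropy minimization over a grid of rotations."""
    Z = whiten(X)
    best_angle, best_cost = 0.0, np.inf
    # For whitened data, minimizing the sum of marginal entropies over rotations
    # is equivalent to minimizing the KL divergence between the joint
    # distribution and the product of the marginals.
    for theta in np.linspace(0, np.pi / 2, n_angles, endpoint=False):
        c, s = np.cos(theta), np.sin(theta)
        Y = np.array([[c, -s], [s, c]]) @ Z
        cost = m_spacing_entropy(Y[0]) + m_spacing_entropy(Y[1])
        if cost < best_cost:
            best_cost, best_angle = cost, theta
    c, s = np.cos(best_angle), np.sin(best_angle)
    return np.array([[c, -s], [s, c]]) @ Z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    S = np.vstack([rng.uniform(-1, 1, 5000), rng.laplace(size=5000)])  # independent sources
    A = np.array([[1.0, 0.6], [0.4, 1.0]])                             # mixing matrix
    recovered = ica_2d_entropy_min(A @ S)

This sketch covers only the two-dimensional case; the report's handling of higher-dimensional mixtures and its robustness to outliers are not reproduced here.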
BibTeX citation:
@techreport{Miller:CSD-03-1221,
    Author = {Miller, Erik G. and Fisher, John W. III},
    Title = {Independent Components Analysis by Direct Entropy Minimization},
    Year = {2003},
    Month = {Jan},
    Url = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2003/5437.html},
    Number = {UCB/CSD-03-1221},
    Abstract = {This paper presents a new algorithm for the independent components analysis (ICA) problem based on efficient entropy estimates. Like many previous methods, this algorithm directly minimizes the measure of departure from independence according to the estimated Kullback-Leibler divergence between the joint distribution and the product of the marginal distributions. We pair this approach with efficient entropy estimators from the statistics literature. In particular, the entropy estimator we use is consistent and exhibits rapid convergence. The algorithm based on this estimator is simple, computationally efficient, intuitively appealing, and outperforms other well known algorithms. In addition, the estimator's relative insensitivity to outliers translates into superior performance by our ICA algorithm on outlier tests. We present favorable comparisons to the Kernel ICA, FAST-ICA, JADE, and extended Infomax algorithms in extensive simulations.},
}
EndNote citation:
%0 Report
%A Miller, Erik G.
%A Fisher, John W. III
%T Independent Components Analysis by Direct Entropy Minimization
%I EECS Department, University of California, Berkeley
%D 2003
%@ UCB/CSD-03-1221
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2003/5437.html
%F Miller:CSD-03-1221