Independent Component Analysis (ICA) is an essential building block for data analysis in many applications. Selecting the truly meaningful components from the result of an ICA algorithm, or comparing the results of different algorithms, is however a non-trivial problem. We introduce a very general technique for evaluating ICA results, rooted in information-theoretic model selection. The basic idea is to exploit the natural link between non-Gaussianity and data compression: the more the data transformation represented by one or several ICs improves the effectiveness of data compression, the more relevant those ICs are. In an extensive experimental evaluation we demonstrate that our novel information-theoretic measure robustly selects the most interesting components from data without requiring any assumptions or thresholds.
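To make the compression link concrete, the following is a minimal, purely illustrative sketch: each independent component is scored by how many bits per sample a code tailored to its empirical distribution saves over a Gaussian code of the same variance (i.e., an estimate of its negentropy), and components are ranked by this gain. The use of FastICA, the histogram entropy estimator, the bin count, and the toy data are all assumptions for illustration and do not reproduce the paper's actual measure.

    # Illustrative sketch: rank ICs by compression gain over a Gaussian code.
    # FastICA, the histogram estimator, and all parameters are assumptions.
    import numpy as np
    from sklearn.decomposition import FastICA

    def differential_entropy_bits(x, bins=64):
        """Histogram estimate of differential entropy in bits per sample."""
        hist, edges = np.histogram(x, bins=bins, density=True)
        widths = np.diff(edges)
        p = hist * widths                       # probability mass per bin
        nz = p > 0
        return -np.sum(p[nz] * np.log2(hist[nz]))

    def compression_gain_bits(x):
        """Bits per sample saved over a Gaussian code with equal variance."""
        gaussian_entropy = 0.5 * np.log2(2 * np.pi * np.e * np.var(x))
        return gaussian_entropy - differential_entropy_bits(x)

    # Toy data: one strongly non-Gaussian source mixed with a Gaussian one.
    rng = np.random.default_rng(0)
    sources = np.column_stack([
        np.sign(rng.normal(size=5000)),         # highly non-Gaussian (binary)
        rng.normal(size=5000),                  # Gaussian, should rank low
    ])
    X = sources @ rng.normal(size=(2, 2))       # random linear mixing

    ics = FastICA(n_components=2, random_state=0).fit_transform(X)
    scores = [compression_gain_bits(ics[:, k]) for k in range(ics.shape[1])]
    print("compression gain per IC (bits/sample):", np.round(scores, 3))

Ranking the ICs by this gain prefers the strongly non-Gaussian component, mirroring the intuition that components which compress the data better are the more relevant ones.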