Title: Understanding slow feature analysis: A mathematical framework
Type of publication: Article
Citation: Sprekeler2008
Journal: Optimization
Year: 2008
URL: http://cogprints.org/6223/
Abstract: Slow feature analysis is an algorithm for unsupervised learning of invariant representations from data with temporal correlations. Here, we present a mathematical analysis of slow feature analysis for the case where the input-output functions are not restricted in complexity. We show that the optimal functions obey a partial differential eigenvalue problem of a type that is common in theoretical physics. This analogy allows the transfer of mathematical techniques and intuitions from physics to concrete applications of slow feature analysis, thereby providing the means for analytical predictions and a better understanding of simulation results. We put particular emphasis on the situation where the input data are generated from a set of statistically independent sources. The dependence of the optimal functions on the sources is calculated analytically for the cases where the sources have Gaussian or uniform distribution.
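The abstract describes SFA as unsupervised extraction of slowly varying features from temporally correlated data. As a rough illustration of the core idea (not the unrestricted-complexity setting the paper analyzes), here is a minimal sketch of *linear* SFA in NumPy: whiten the inputs, then take the directions in which the finite-difference time derivative has the smallest variance. Function and variable names are illustrative, not from the paper.

```python
import numpy as np

def linear_sfa(x, n_components=1):
    """Minimal linear Slow Feature Analysis (illustrative sketch).

    x: array of shape (T, D), a time series of input signals.
    Returns the n_components slowest linear features, each with
    (approximately) zero mean and unit variance, ordered by slowness.
    """
    # Center the data.
    x = x - x.mean(axis=0)
    # Whiten via an eigendecomposition of the covariance matrix.
    cov = x.T @ x / (len(x) - 1)
    eigval, eigvec = np.linalg.eigh(cov)
    white = eigvec / np.sqrt(eigval)   # whitening matrix (columns scaled)
    z = x @ white                      # whitened signals, unit covariance
    # Covariance of the time derivative (finite differences).
    dz = np.diff(z, axis=0)
    dcov = dz.T @ dz / (len(dz) - 1)
    # Slowest directions = eigenvectors with the smallest eigenvalues
    # (np.linalg.eigh returns eigenvalues in ascending order).
    _, dvec = np.linalg.eigh(dcov)
    w = white @ dvec[:, :n_components]
    return x @ w                       # slow features
```

For example, mixing a slow and a fast sinusoid and applying `linear_sfa` recovers the slow source (up to sign and scale) as the first output component.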
Keywords: invariant representations, Slow Feature Analysis, statistically independent sources, theoretical analysis, unsupervised learning
Authors: Sprekeler, Henning; Wiskott, Laurenz