TY - JOUR
ID - Sprekeler2008
T1 - Understanding slow feature analysis: A mathematical framework
A1 - Sprekeler, Henning
A1 - Wiskott, Laurenz
JA - Optimization
Y1 - 2008
UR - http://cogprints.org/6223/
KW - invariant representations
KW - Slow Feature Analysis
KW - statistically independent sources
KW - theoretical analysis
KW - unsupervised learning
N2 - Slow feature analysis is an algorithm for unsupervised learning of invariant representations from data with temporal correlations. Here, we present a mathematical analysis of slow feature analysis for the case where the input-output functions are not restricted in complexity. We show that the optimal functions obey a partial differential eigenvalue problem of a type that is common in theoretical physics. This analogy allows the transfer of mathematical techniques and intuitions from physics to concrete applications of slow feature analysis, thereby providing the means for analytical predictions and a better understanding of simulation results. We put particular emphasis on the situation where the input data are generated from a set of statistically independent sources. The dependence of the optimal functions on the sources is calculated analytically for the cases where the sources have Gaussian or uniform distributions.
M1 - bdsk-url-1={http://cogprints.org/6223/}
M1 - date-added={2012-09-23 10:50:23 +0200}
M1 - date-modified={2012-09-23 10:50:23 +0200}
M1 - file={:home/jim/Desktop/sortedLiterature/sfa+invariances/SprekelerWiskott08-Understanding Slow Feature Analysis\: A Mathematical Framework.pdf:pdf}
M1 - project={fremdliteratur}
ER -