Time-warped Principal Components Analysis of Neural Data

Alexander Williams, Stanford University


Analysis of multi-trial neural data often relies on rigid alignment of neural activity to stimulus triggers or behavioral events. However, activity on a single trial may be shifted and skewed in time due to differences in attentional state, biophysical kinetics, and other unobserved latent variables. This temporal variability can inflate the apparent dimensionality of the data and obscure inherently simple, low-dimensional structure. For example, small temporal shifts on each trial introduce illusory dimensions as revealed by principal component analysis (PCA). We demonstrate the prevalence of these issues in spike-triggered analysis of retinal ganglion cells and in primate motor cortical neurons during a reaching task. To address these challenges, we develop a novel method, time-warped PCA (twPCA), that simultaneously identifies time warps of individual trials and low-dimensional structure across neurons and time. Our method contains a single hyperparameter that trades off complexity of the temporal warps against the dimensionality of the aligned data. Furthermore, we identify the temporal warping in a data-driven, unsupervised manner, removing the need for explicit alignment with external variables. We apply twPCA to motor cortical data recorded from a monkey performing a center-out delayed reaching task. The learned warpings can explain 70 percent of the variability in reaction time. Time-warped PCA is broadly applicable to a variety of neural systems as a method for disentangling temporal variability across trials as well as discovering underlying neural dynamics and structure of interest.

Abstract Author(s): Alex H. Williams, Ben Poole, Niru Maheswaranathan, Surya Ganguli