Sparse, Smooth and Demixed Dimensionality Reduction for High-dimensional Neural Time Series Datasets

Alexander Williams, Stanford University


A variety of recent technological advances have enabled experimental neuroscientists to simultaneously record the activity of hundreds or even thousands of neurons. Furthermore, these recordings can often be made over long time periods (days or weeks) as an animal learns and performs complex behavioral tasks. The resulting data sets contain structure across multiple time scales and experimental dimensions. For example, animals often perform a behavior for a reward over sequential trials. Within each trial, rapid fluctuations in neural firing rate may be a signature of the animal's short-term decisions and behaviors, while more gradual processes, such as those involved in learning and memory, would likely only be detectable by carefully analyzing activity across trials.

We present a general strategy for representing these multifactorial data sets as multidimensional arrays, and we develop techniques to untangle simple latent dynamics across dimensions of experimental interest. Our approach is closely related to a recently developed technique called demixed PCA [1]. We characterize novel variants of this technique with roughness and sparsity penalties, and we compare demixed PCA with canonical polyadic (CP) and other tensor decompositions [2], drawing connections between them.
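To make the tensor viewpoint concrete, the sketch below fits a plain rank-R CP decomposition to a neurons × time × trials array via alternating least squares. This is a minimal NumPy illustration, not the authors' implementation: function names and shapes are assumptions, and the roughness/sparsity penalties described above are omitted.

```python
# Illustrative sketch of a CP (canonical polyadic) decomposition fit by
# alternating least squares (ALS). NOT the authors' code; penalties omitted.
import numpy as np

def unfold(T, mode):
    """Matricize tensor T along `mode` (rows index that mode)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker product of A (I x R) and B (J x R) -> (IJ x R)."""
    R = A.shape[1]
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, R)

def cp_als(T, rank, n_iter=100, seed=0):
    """Fit T[i,j,k] ~ sum_r A[i,r] B[j,r] C[k,r] by cycling least-squares
    updates over the three factor matrices."""
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((s, rank)) for s in T.shape]
    for _ in range(n_iter):
        for mode in range(3):
            # Khatri-Rao product of the two factors being held fixed,
            # in ascending mode order to match `unfold`'s C-order reshape.
            others = [factors[m] for m in range(3) if m != mode]
            kr = khatri_rao(others[0], others[1])
            # Solve unfold(T, mode) ~ factors[mode] @ kr.T in least squares.
            factors[mode] = np.linalg.lstsq(kr, unfold(T, mode).T,
                                            rcond=None)[0].T
    return factors
```

In this setting, the three factor matrices would correspond to per-neuron loadings, within-trial temporal profiles, and across-trial amplitudes of each latent component.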

We applied this analysis framework to detect the neural correlates of cognitive strategy shifts in the medial prefrontal cortex of mice. Over multiple days, mice repeatedly learned to switch between an allocentric (spatial) strategy and an egocentric (procedural) strategy. This switch in behavioral strategy was accompanied by rapid transitions in calcium dynamics at both the single neuron and network level.

[1] Dmitry Kobak et al. Demixed principal component analysis of neural population data. eLife, 5:e10989, April 2016.

[2] Tamara G. Kolda and Brett W. Bader. Tensor decompositions and applications. SIAM Review, 51(3):455–500, September 2009.

Abstract Author(s): A.H. Williams, F.L. Wang, T.H. Kim, M.J. Schnitzer, S. Ganguli