Using information theory to analyze statistically causal influences in networks of simultaneously recorded neurons

Christopher Quinn, University of Illinois at Urbana-Champaign


Improvements in electrode recording technologies have enabled neuroscience researchers to record individual spike trains from large numbers of neurons simultaneously. Many researchers have attempted to identify causal relationships between simultaneously recorded neurons, but the methods used are often not robust. Recently, an information-theoretic quantity known as directed information was introduced as a robust measure of statistically causal relationships between stochastic processes, such as neural activity. We developed a consistent estimation procedure that identifies, from the recorded spike trains, all of the statistically causal relationships between neurons in a network. For large numbers of recorded neurons, however, there can be many statistically causal relationships, and it can be difficult to determine which connections are the most important. We propose to identify the key underlying structure of the network by using provably good methods to summarize it. One method reduces the graph to a tree structure, keeping the influences that are the most informative over all possible trees. Another method clusters neurons based on their functional connectivity, identifying "equivalence classes" of neurons in the network that have similar input connections and similar output connections. We investigate these approaches for reducing the complexity of the network while maintaining its most important structural components.
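To make the directed-information idea concrete, the sketch below estimates a directed information rate between two binned binary spike trains. It is a toy plug-in estimator under a first-order Markov assumption (so it reduces to the conditional mutual information I(X_{t-1}; Y_t | Y_{t-1})), not the consistent estimation procedure developed in the abstract; the function name and the order-1 history are assumptions made for illustration.

```python
from collections import Counter
from math import log2

def directed_info_rate(x, y):
    """Toy plug-in estimate of the directed information rate I(X -> Y)
    under a first-order Markov assumption, i.e. the conditional mutual
    information I(X_{t-1}; Y_t | Y_{t-1}).

    x, y: equal-length sequences of 0/1 spike-train bins.
    """
    n = len(x) - 1  # number of (past, present) samples
    joint = Counter()  # counts of (x_prev, y_prev, y_cur)
    for t in range(1, len(x)):
        joint[(x[t - 1], y[t - 1], y[t])] += 1

    # Marginal counts derived from the same empirical joint, so the
    # resulting plug-in CMI is guaranteed nonnegative.
    cxy, cyy, cy = Counter(), Counter(), Counter()
    for (xp, yp, yc), c in joint.items():
        cxy[(xp, yp)] += c  # counts of (x_prev, y_prev)
        cyy[(yp, yc)] += c  # counts of (y_prev, y_cur)
        cy[yp] += c         # counts of y_prev

    # I(A;B|C) = sum p(a,b,c) * log[ p(a,b,c) p(c) / (p(a,c) p(b,c)) ]
    di = 0.0
    for (xp, yp, yc), c in joint.items():
        di += (c / n) * log2((c * cy[yp]) / (cxy[(xp, yp)] * cyy[(yp, yc)]))
    return di
```

In practice, longer histories and bias-corrected or model-based estimators are needed for reliable results on real spike trains; this sketch only illustrates the quantity being estimated.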
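The tree-reduction step can be sketched as follows: given pairwise directed-information estimates as edge weights, grow a spanning directed tree (arborescence) from a chosen root. This is a simple Prim-style greedy heuristic for illustration only; the provably good maximum-weight directed tree referenced in the abstract would require an exact algorithm such as Chu-Liu/Edmonds, which this sketch does not implement.

```python
def greedy_arborescence(weights, nodes, root):
    """Grow a spanning directed tree from `root` by repeatedly adding
    the highest-weight edge that leads from a node already in the tree
    to a node not yet in it (Prim-style greedy heuristic).

    weights: dict mapping (src, dst) -> directed-information estimate.
    nodes:   iterable of all node labels.
    Returns a dict mapping each non-root node to its parent.
    """
    nodes = set(nodes)
    in_tree = {root}
    parent = {}
    while in_tree < nodes:
        # Best edge crossing from the tree to the rest of the graph.
        w, u, v = max(
            (w, u, v)
            for (u, v), w in weights.items()
            if u in in_tree and v not in in_tree
        )
        parent[v] = u
        in_tree.add(v)
    return parent
```

The returned parent map has exactly one incoming edge per non-root neuron, so the summarized network keeps only one (highly informative) influence per neuron.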
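The clustering idea can also be sketched. Below, two neurons fall into the same class when they receive input from exactly the same set of neurons and send output to exactly the same set; this exact-match grouping is a deliberately strict stand-in for the "similar inputs / similar outputs" criterion in the abstract, where real data would instead compare directed-information values under a similarity threshold.

```python
def connectivity_classes(edges, nodes):
    """Group neurons into equivalence classes by connectivity signature:
    the (input set, output set) pair, with self-loops ignored.

    edges: set of (src, dst) pairs from the inferred network.
    nodes: iterable of all neuron labels.
    Returns a list of classes, each a list of neuron labels.
    """
    inputs = {v: frozenset(u for u, w in edges if w == v) for v in nodes}
    outputs = {u: frozenset(w for s, w in edges if s == u) for u in nodes}
    classes = {}
    for v in nodes:
        sig = (inputs[v] - {v}, outputs[v] - {v})
        classes.setdefault(sig, []).append(v)
    return list(classes.values())
```

For example, two neurons that are both driven by the same source and both project to the same target are collapsed into one class, reducing the network to its distinct functional roles.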

Abstract Author(s): Christopher Quinn, Todd Coleman, Negar Kiyavash, Nicholas G. Hatsopoulos, Carolyn Beck, Srinivasa M. Salapaka, Yunwen Xu