Conditional finite mixtures of Poisson distributions for context-dependent neural correlations

Sacha Sokoloski, Ruben Coen-Cagli

Research output: Contribution to journal › Article › peer-review

Abstract

Parallel recordings of neural spike counts have revealed the existence of context-dependent noise correlations in neural populations. Theories of population coding have also shown that such correlations can impact the information encoded by neural populations about external stimuli. Although studies have shown that these correlations often have a low-dimensional structure, it has proven difficult to capture this structure in a model that is compatible with theories of rate coding in correlated populations. To address this difficulty we develop a novel model based on conditional finite mixtures of independent Poisson distributions. The model can be conditioned on context variables (e.g. stimuli or task variables), and the number of mixture components in the model can be cross-validated to estimate the dimensionality of the target correlations. We derive an expectation-maximization algorithm to efficiently fit the model to realistic amounts of data from large neural populations. We then demonstrate that the model successfully captures stimulus-dependent correlations in the responses of macaque V1 neurons to oriented gratings. Our model incorporates arbitrary nonlinear context-dependence, and can thus be applied to improve predictions of neural activity based on deep neural networks.
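As a concrete illustration of the approach the abstract describes, the sketch below fits a finite mixture of independent Poisson distributions to spike counts with expectation-maximization. This is a hypothetical simplification, not the authors' code: it omits the conditional (context-dependent) part of their model, in which the mixture weights and rates would be functions of a stimulus or task variable, and all function and variable names are illustrative.

```python
# Hypothetical sketch: EM for an *unconditional* finite mixture of
# independent Poisson distributions, a simplified version of the model
# class named in the abstract (the full model conditions weights and
# rates on context variables such as stimulus orientation).
import numpy as np
from scipy.special import gammaln, logsumexp

def fit_poisson_mixture(X, K, n_iter=100, seed=0):
    """Fit a K-component mixture of independent Poissons.

    X: (n_samples, n_neurons) array of spike counts.
    Returns mixture weights (K,) and rates (K, n_neurons).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    log_pi = np.full(K, -np.log(K))                    # uniform weight init
    lam = X.mean(0) * rng.uniform(0.5, 1.5, (K, d))    # perturbed rate init
    for _ in range(n_iter):
        # E-step: log responsibilities from independent-Poisson likelihoods
        log_p = (X @ np.log(lam).T) - lam.sum(1) \
                - gammaln(X + 1).sum(1, keepdims=True)
        log_r = log_pi + log_p
        log_r -= logsumexp(log_r, axis=1, keepdims=True)
        r = np.exp(log_r)                              # (n, K)
        # M-step: closed-form updates for weights and rates
        nk = r.sum(0) + 1e-12
        log_pi = np.log(nk / n)
        lam = (r.T @ X) / nk[:, None] + 1e-12
    return np.exp(log_pi), lam
```

In this setting both EM updates are closed-form, which is one reason independent-Poisson mixtures scale to large populations; the number of components K can be chosen by cross-validated likelihood, as the abstract describes for estimating correlation dimensionality.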

Original language: English (US)
Journal: Unknown Journal
State: Published - Aug 1 2019

ASJC Scopus subject areas

  • General

