Article

Low-dimensional dynamics for working memory and time encoding

Publisher

National Academy of Sciences
DOI: 10.1073/pnas.1915984117

Keywords

neural dynamics; working memory; time decoding; recurrent networks; reservoir computing

Funding

  1. NSF Next Generation Network for Neuroscience Award [DBI-1707398]
  2. Gatsby Charitable Foundation
  3. Simons Foundation
  4. Swartz Foundation
  5. NIH Training Grant [5T32NS064929]
  6. Kavli Foundation
  7. National Institute of Neurological Disorders and Stroke Brain Initiative Grant [R01NS113113]
  8. Dirección General de Asuntos del Personal Académico de la Universidad Nacional Autónoma de México [PAPIIT-IN210819]
  9. Consejo Nacional de Ciencia y Tecnología [CONACYT-240892]
  10. National Institute of Mental Health Division of Intramural Research Grant [Z01MH-01092]
  11. Italian Fondo per gli investimenti della ricerca di base 2010 Grant [RBFR10G5W9 001]

Abstract

Our decisions often depend on multiple sensory experiences separated by time delays. The brain can remember these experiences and, simultaneously, estimate the timing between events. To understand the mechanisms underlying working memory and time encoding, we analyze neural activity recorded during delays in four experiments on nonhuman primates. To disambiguate potential mechanisms, we propose two analyses, namely, decoding the passage of time from neural data and computing the cumulative dimensionality of the neural trajectory over time. Time can be decoded with high precision in tasks where timing information is relevant and with lower precision when irrelevant for performing the task. Neural trajectories are always observed to be low-dimensional. In addition, our results further constrain the mechanisms underlying time encoding as we find that the linear ramping component of each neuron's firing rate strongly contributes to the slow timescale variations that make decoding time possible. These constraints rule out working memory models that rely on constant, sustained activity and neural networks with high-dimensional trajectories, like reservoir networks. Instead, recurrent networks trained with backpropagation capture the time-encoding properties and the dimensionality observed in the data.
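The two analyses described above can be illustrated on synthetic data. The sketch below is not the authors' code: the simulated ramping trajectory, the nearest-centroid time decoder, and the PCA-based cumulative-dimensionality measure are all illustrative assumptions, chosen only to show the logic of decoding elapsed time from population activity and of counting the principal components a growing trajectory occupies.

```python
# Hedged sketch of the abstract's two analyses on synthetic data.
# All function names and parameters here are illustrative, not the paper's.
import numpy as np

rng = np.random.default_rng(0)

def cumulative_dimensionality(X, var_threshold=0.9):
    """Number of principal components needed to explain `var_threshold`
    of the variance of the trajectory X[:t], for each time step t.
    X has shape (T, N): T time bins by N neurons."""
    dims = []
    for t in range(2, X.shape[0] + 1):
        Xt = X[:t] - X[:t].mean(axis=0)
        s = np.linalg.svd(Xt, compute_uv=False)   # singular values
        frac = np.cumsum(s**2) / np.sum(s**2)     # cumulative variance fraction
        dims.append(int(np.searchsorted(frac, var_threshold) + 1))
    return dims

def decode_time(train_trials, test_trials):
    """Nearest-centroid time decoder: each time bin's mean population
    vector (over training trials) is a template; each test-time vector
    is assigned the bin of the closest template. Returns the mean
    absolute decoding error in bins."""
    templates = train_trials.mean(axis=0)          # (T, N)
    errs = []
    for trial in test_trials:
        for t in range(templates.shape[0]):
            d = np.linalg.norm(templates - trial[t], axis=1)
            errs.append(abs(int(d.argmin()) - t))
    return float(np.mean(errs))

# Synthetic delay activity: each neuron's rate ramps linearly in time
# (the dominant component highlighted in the abstract), plus noise.
T, N, n_trials = 50, 40, 20
time = np.linspace(0.0, 1.0, T)
slopes = rng.normal(0.0, 1.0, N)
ramp = np.outer(time, slopes)                      # (T, N), rank-1 trajectory
trials = ramp + 0.05 * rng.normal(size=(n_trials, T, N))

dims = cumulative_dimensionality(trials.mean(axis=0))
err = decode_time(trials[:10], trials[10:])
print(dims[-1], err)   # low final dimensionality, small time-decoding error
```

With a ramp-dominated trajectory, the cumulative dimensionality saturates at a small value and time is decodable to within a few bins, mirroring the qualitative signature the abstract reports; a high-dimensional reservoir-style trajectory would instead show dimensionality growing steadily with elapsed time.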

