Article

Sleeping Our Way to Weight Normalization and Stable Learning

Journal

NEURAL COMPUTATION
Volume 20, Issue 12, Pages 3111-3130

Publisher

MIT PRESS
DOI: 10.1162/neco.2008.04-07-502

Keywords

-

Funding

  1. National Science Foundation [0133996]
  2. NSF IGERT [DGE-0333451]
  3. NSF Division of Information & Intelligent Systems
  4. NSF Directorate for Computer & Information Science & Engineering [0133996]


The functions of sleep have been an enduring mystery. Tononi and Cirelli (2003) hypothesized that one of the functions of slow-wave sleep is to scale down synapses in the cortex that have strengthened during awake learning. We create a computational model to test the functionality of this idea and examine some of its implications. We show that synaptic scaling during slow-wave sleep is capable of keeping Hebbian learning in check and that it enables stable development. We also show theoretically how it implements classical weight normalization, which has been in common use in neural models for decades. Finally, a significant computational limitation of this form of synaptic scaling is revealed through computer simulations.
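The mechanism described in the abstract can be illustrated with a small, self-contained sketch. The Python snippet below is a minimal illustration, not the paper's actual simulation: a single linear unit undergoes Hebbian learning during a "wake" phase, and during a "sleep" phase every synapse is scaled down by the same factor so that the total synaptic weight returns to a fixed target. All parameter values and names here (eta, target_total, the number of wake/sleep cycles) are illustrative assumptions; the scaling step is simply a multiplicative weight normalization applied once per cycle, which is the correspondence the abstract refers to.

```python
import numpy as np

# Minimal sketch (not the paper's model): alternate Hebbian "wake" updates
# with a multiplicative "sleep" down-scaling step. All parameter values are
# illustrative assumptions.

rng = np.random.default_rng(0)

n_inputs = 50
w = rng.uniform(0.0, 0.1, size=n_inputs)  # initial synaptic weights
eta = 0.01                                # Hebbian learning rate (assumed)
target_total = w.sum()                    # total synaptic weight restored each "night"

for cycle in range(100):
    # Wake phase: plain Hebbian updates, which on their own grow without bound.
    for _ in range(20):
        x = rng.random(n_inputs)   # random input pattern
        y = float(w @ x)           # linear unit activation
        w += eta * y * x           # Hebbian strengthening

    # Sleep phase: scale every synapse down by the same factor so the total
    # weight returns to its target -- equivalent to classical multiplicative
    # weight normalization applied once per wake/sleep cycle.
    w *= target_total / w.sum()

print("total weight after learning:", w.sum())          # held at target_total
print("largest relative weight:", (w / w.sum()).max())  # structure emerges while bounded
```

In this toy setting the sleep-phase rescaling is what keeps the Hebbian growth bounded; removing the final line of the loop lets the weights diverge, which is the instability the paper's synaptic-scaling hypothesis is meant to prevent.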

