Article

Synaptic Scaling--An Artificial Neural Network Regularization Inspired by Nature

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNNLS.2021.3050422

Keywords

Mutual information; Entropy; Biological neural networks; Training; Neurons; Task analysis; Topology; Computational neuroscience; homeostatic plasticity; information bottleneck; mutual information; neural network; regularization; synaptic scaling

Funding

  1. German Federal Ministry for the Environment, Nature Conservation, Building and Nuclear Safety (BMUB) [3514685C19, 3519685A08, 67KI2086A]
  2. German Ministry of Education and Research (BMBF) [01LC1319, 01IS20062, 16PGF0304]
  3. Stiftung Naturschutz Thüringen (SNT) [SNT-082-248-03/2014]
  4. Friedrich Naumann Stiftung für die Freiheit PhD scholarship [ST7847/P622]
  5. Nvidia GPU Grant
  6. Thuringian Ministry for Environment, Energy and Nature Conservation [68678]

Abstract

Nature has inspired scientists to develop new methods based on observation, and recent advances in imaging and sensing technology allow detailed insights into biological neural processes. Homeostatic plasticity, in particular synaptic scaling, has been identified as a mature and applicable theory for enhancing the learning capabilities of neural networks. By analyzing how synaptic scaling affects mutual information, the authors derive a regularizer that outperforms previous regularization techniques in regression and classification tasks across various network topologies and data sets.
Nature has always inspired the human spirit, and scientists have frequently developed new methods based on observations from nature. Recent advances in imaging and sensing technology allow fascinating insights into biological neural processes. With the objective of finding new strategies to enhance the learning capabilities of neural networks, we focus on homeostatic plasticity, a phenomenon closely related to learning tasks and neural stability in biological neural networks. Among the theories developed to describe homeostatic plasticity, synaptic scaling has been found to be the most mature and applicable. We systematically discuss previous studies on the synaptic scaling theory and how they could be applied to artificial neural networks. To this end, we use information theory to analytically evaluate how mutual information is affected by synaptic scaling. Based on these analytic findings, we propose two flavors in which synaptic scaling can be applied during the training of simple and complex, feedforward, and recurrent neural networks. We compare our approach with state-of-the-art regularization techniques on standard benchmarks. In our experiments, the proposed method yields the lowest error in both regression and classification tasks compared with previous regularization approaches, across a wide range of feedforward and recurrent network topologies and data sets.
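The abstract outlines the core mechanism: neurons homeostatically rescale their incoming synaptic weights to keep their average activity near a set point. The paper's exact update rule is not reproduced on this page, so the following is only a minimal NumPy sketch of the general synaptic-scaling idea; the layer sizes, target rate, scaling rate, and the function `synaptic_scaling_step` are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

# Minimal sketch of homeostatic synaptic scaling as a training-time
# regularizer. This illustrates the general idea from the homeostatic
# plasticity literature, not the paper's exact update rule; all
# hyperparameters below are hypothetical.

rng = np.random.default_rng(0)

n_in, n_hidden = 8, 4
W = rng.normal(scale=0.5, size=(n_hidden, n_in))  # incoming weights, one row per neuron

target_rate = 0.1  # desired mean activation per neuron (assumed)
eta_scale = 0.01   # scaling rate (assumed)

def relu(x):
    return np.maximum(0.0, x)

def synaptic_scaling_step(W, batch):
    """Multiplicatively rescale each neuron's incoming weights so its
    mean activation drifts toward the homeostatic target."""
    h = relu(batch @ W.T)          # (batch_size, n_hidden) activations
    mean_act = h.mean(axis=0)      # per-neuron average activity
    # Correction factor: > 1 when a neuron is below target, < 1 when above.
    factors = 1.0 + eta_scale * (target_rate - mean_act) / target_rate
    return W * factors[:, None]    # scale all incoming weights of each neuron

# In training, such a step would be interleaved with ordinary gradient
# updates; shown here in isolation on a random batch.
batch = rng.normal(size=(32, n_in))
for _ in range(200):
    W = synaptic_scaling_step(W, batch)

print("per-neuron mean activation:", relu(batch @ W.T).mean(axis=0))
```

Because the rescaling is multiplicative per neuron rather than per weight, relative weight ratios within a neuron are preserved while overall activity is regulated, which matches the role the abstract describes for the proposed regularizer: keeping activations, and hence the mutual information between layers, within a useful range.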

