Article

Large Memory Capacity in Chaotic Artificial Neural Networks: A View of the Anti-Integrable Limit

Journal

IEEE TRANSACTIONS ON NEURAL NETWORKS
Volume 20, Issue 8, Pages 1340-1351

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNN.2009.2024148

Keywords

Anti-integrable limit; artificial neural network; chaos; periodic activation function

Funding

  1. National Natural Science Foundation of China [10501008, 60874121]
  2. Rising-Star Program Foundation of Shanghai, China [07QA14002]
  3. National Basic Research Program of China [2006CB303102, 2007CB814904]
  4. Hong Kong Research Grants Council [CityU1117/08E]

Abstract

In the literature, it has been reported that the chaotic artificial neural network model with sinusoidal activation functions possesses a large memory capacity and a remarkable ability to retrieve stored patterns, outperforming the conventional chaotic model that uses only monotonic activation functions such as sigmoidal functions. This paper, from the viewpoint of the anti-integrable limit, elucidates the mechanism behind the superiority of models with periodic activation functions, a class that includes sinusoidal functions. In particular, by means of the anti-integrable limit technique, this paper shows that any finite-dimensional neural network model with periodic activation functions and properly selected parameters possesses much richer chaotic dynamics, which truly determine the model's memory capacity and pattern-retrieval ability. To some extent, this paper demonstrates, both mathematically and numerically, that an appropriate choice of activation functions and control scheme can lead to a large memory capacity and better pattern-retrieval ability in artificial neural network models.
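The contrast drawn in the abstract can be illustrated with a minimal sketch (this is an assumption-laden toy model, not the paper's exact system): a discrete-time Hopfield-style map `x(t+1) = f(g * W x(t))`, iterated once with a periodic activation (`sin`) and once with a monotone sigmoidal one (`tanh`), where a large gain `g` plays the role of the parameter pushed toward the anti-integrable limit. The network size `n`, the gain value, and the random weight matrix are all illustrative choices.

```python
import numpy as np

# Toy sketch (not the authors' model): a discrete-time network
#   x(t+1) = f(g * W @ x(t)),
# comparing a periodic activation f = sin against a sigmoidal f = tanh.
# The gain g is the illustrative stand-in for the anti-integrable-limit parameter.
rng = np.random.default_rng(0)
n = 8                                      # small illustrative network size
W = rng.standard_normal((n, n)) / np.sqrt(n)  # random weights, O(1) spectral scale

def iterate(f, x0, g, steps=200):
    """Iterate the map x -> f(g * W x) from x0 for a fixed number of steps."""
    x = x0.copy()
    for _ in range(steps):
        x = f(g * (W @ x))
    return x

def divergence(f, g, eps=1e-8, steps=200):
    """Rough sensitivity to initial conditions: final distance between
    two orbits started eps apart (a crude chaos indicator)."""
    x0 = rng.standard_normal(n)
    a = iterate(f, x0, g, steps)
    b = iterate(f, x0 + eps, g, steps)
    return float(np.linalg.norm(a - b))

# At high gain, sin keeps folding the state (periodicity), which tends to
# separate nearby orbits, while tanh saturates toward fixed sign patterns.
d_sin = divergence(np.sin, g=10.0)
d_tanh = divergence(np.tanh, g=10.0)
print("sin orbit separation: ", d_sin)
print("tanh orbit separation:", d_tanh)
```

The design point is that `sin` is non-monotone and periodic, so at high gain the map keeps mixing rather than saturating; this is the qualitative behavior the abstract attributes to the richer chaotic dynamics of periodic activations.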
