Article

Gradient descent learning rule for complex-valued associative memories with large constant terms

Publisher

WILEY
DOI: 10.1002/tee.22225

Keywords

complex-valued neural networks; associative memory; noise tolerance; learning algorithm

Abstract

Complex-valued associative memories (CAMs) are among the most promising neural-network-based associative memory models. However, their low noise tolerance is often a serious problem. A projection learning rule with large constant terms improves the noise tolerance of CAMs, but it can be applied only to fully connected CAMs. In this paper, we propose a gradient descent learning rule with large constant terms that is not restricted by network topology. The large constant terms are realized through regularization of the connection weights. Computer simulations show that the proposed learning algorithm improves noise tolerance. (c) 2016 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc.
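For a concrete picture of the idea summarized above, the sketch below shows one plausible way to train a CAM by gradient descent while keeping the "constant terms" large through a regularizer on the connection weights. It is a minimal illustration, not the paper's exact formulation: interpreting the constant terms as the diagonal (self-connection) weights, the target value c, the loss function, and all names are assumptions made for this example.

import numpy as np

def train_cam_gradient_descent(patterns, lr=0.05, reg=0.1, c=5.0, epochs=500, seed=0):
    # patterns: array of shape (P, N) with unit-modulus complex entries
    # (each entry exp(2j*pi*k/K) for one of K phase states).
    # The loss drives the weighted sum W @ x toward the stored pattern itself,
    # and a regularizer pulls the diagonal ("constant") terms toward a large value c.
    # Illustrative formulation only; details are assumptions, not the paper's rule.
    rng = np.random.default_rng(seed)
    P, N = patterns.shape
    W = 0.01 * (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N)))
    for _ in range(epochs):
        grad = np.zeros_like(W)
        for x in patterns:
            err = W @ x - x                      # each stored pattern should be a fixed point
            grad += np.outer(err, np.conjugate(x))
        # regularization term pulling the diagonal entries toward the constant c
        grad += reg * np.diag(np.diag(W) - c)
        W -= lr * grad / P
    return W

def recall(W, x0, steps=20, K=4):
    # Synchronous recall: quantize each neuron's weighted sum to the nearest of K phase states.
    phases = np.exp(2j * np.pi * np.arange(K) / K)
    x = x0.copy()
    for _ in range(steps):
        s = W @ x
        x = phases[np.argmax(np.real(np.conjugate(phases)[None, :] * s[:, None]), axis=1)]
    return x

With a few random unit-modulus patterns, training and then recalling from a noisy copy of a stored pattern illustrates the intended effect: larger constant terms make the recall dynamics more tolerant of corrupted inputs, which is the property the paper evaluates by simulation.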
