Article

Hypercomplex-valued recurrent correlation neural networks

Journal

NEUROCOMPUTING
Volume 432, Issue -, Pages 111-123

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2020.12.034

Keywords

Recurrent neural network; Hypercomplex number system; Hopfield neural network; Associative memory

Funding

  1. National Council for Scientific and Technological Development (CNPq) [310118/2017-4]
  2. São Paulo Research Foundation (FAPESP) [2019/02278-2]
  3. Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) [001]


The researchers extended bipolar RCNNs to deal with hypercomplex-valued data and investigated the stability of these new networks. Examples were provided to illustrate the theoretical results and computational experiments confirmed the potential application of hypercomplex-valued RCNNs as associative memories for gray-scale images.
Recurrent correlation neural networks (RCNNs), introduced by Chiueh and Goodman as an improved version of the bipolar correlation-based Hopfield neural network, can be used to implement high-capacity associative memories. In this paper, we extend bipolar RCNNs to process hypercomplex-valued data. Precisely, we present the mathematical background for a broad class of hypercomplex-valued RCNNs. Then, we address the stability of the new hypercomplex-valued RCNNs using synchronous and asynchronous update modes. Examples with bipolar, complex, hyperbolic, quaternion, and octonion-valued RCNNs are given to illustrate the theoretical results. Finally, computational experiments confirm the potential application of hypercomplex-valued RCNNs as associative memories designed for the storage and recall of gray-scale images. (c) 2020 Elsevier B.V. All rights reserved.
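To make the abstract's notion of an RCNN-based associative memory concrete, the following is a minimal NumPy sketch of the bipolar case, assuming the exponential excitation function of Chiueh and Goodman's exponential correlation associative memory; the function name `rcnn_recall` and the parameter `alpha` are illustrative choices, not the paper's notation.

```python
import numpy as np

def rcnn_recall(patterns, probe, alpha=4.0, max_iter=100):
    """Bipolar RCNN recall sketch: each iteration weights every stored
    pattern by an excitation function (here exponential) of its
    normalized correlation with the current state, then takes the sign
    of the weighted sum of patterns."""
    U = np.asarray(patterns, dtype=float)   # shape (p, n), entries +/-1
    x = np.asarray(probe, dtype=float).copy()
    n = U.shape[1]
    for _ in range(max_iter):
        w = np.exp(alpha * (U @ x) / n)     # excitation of each stored memory
        x_new = np.sign(U.T @ w)            # weighted recombination of memories
        x_new[x_new == 0] = 1.0             # break ties consistently
        if np.array_equal(x_new, x):        # fixed point reached
            break
        x = x_new
    return x
```

Starting from a probe that is a corrupted copy of a stored pattern, the exponential weighting strongly favors the closest memory, so the iteration typically converges to that pattern in a few steps; the hypercomplex-valued networks of the paper replace the bipolar sign function and real correlations with their hypercomplex counterparts.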
