Article

Twisted Quaternary Neural Networks

Publisher

WILEY
DOI: 10.1002/tee.21746

Keywords

neural networks; quaternion; learning; backpropagation algorithm

The quaternary neural network (QNN) proposed by Nitta is a high-dimensional neural network. By computer simulations, Nitta showed that the QNN learns faster than ordinary neural networks and requires roughly one-third as many parameters as real-valued neural networks. In this paper, we propose the twisted quaternary neural network (TQNN), which reverses the directions of the multiplications in the QNN. Because quaternion multiplication is noncommutative, this yields a genuinely different neural network. When the activation function is linear, a multilayered neural network can ordinarily be expressed as a single-layered one; the TQNN, however, cannot be expressed as a single-layered QNN even when the activation function is linear. The TQNN is therefore expected to produce a greater variety of signal-processing systems. We performed computer simulations comparing the QNN and the TQNN and found that the TQNN learns slightly faster. Moreover, the simulations showed that the QNN tended to be trapped in local minima or plateaus, whereas the TQNN did not. Since reducibility is known to cause local minima and plateaus, we also discuss the differences in reducibility between the QNN and the TQNN. (c) 2012 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc.
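The twist rests on quaternion multiplication being noncommutative: reversing the order of the weight-input products gives a network that is not equivalent to the original. A minimal Python sketch of the Hamilton product and of two neuron variants (illustrative only; the exact layer definitions and multiplication orderings used in the paper are assumptions here):

```python
# Hamilton product of quaternions represented as tuples (a, b, c, d),
# standing for a + b*i + c*j + d*k.
def qmul(p, q):
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return (a1*a2 - b1*b2 - c1*c2 - d1*d2,
            a1*b2 + b1*a2 + c1*d2 - d1*c2,
            a1*c2 - b1*d2 + c1*a2 + d1*b2,
            a1*d2 + b1*c2 - c1*b2 + d1*a2)

# Non-commutativity: i*j = k but j*i = -k.
i, j = (0, 1, 0, 0), (0, 0, 1, 0)
print(qmul(i, j))  # (0, 0, 0, 1), i.e.  k
print(qmul(j, i))  # (0, 0, 0, -1), i.e. -k

# Hypothetical neuron sketches (names and ordering are assumptions, not
# taken from the paper): the "QNN-style" neuron multiplies weight times
# input (w*x); the "twisted" neuron reverses the order (x*w).
def qnn_neuron(ws, xs, b):
    s = b
    for w, x in zip(ws, xs):
        p = qmul(w, x)
        s = tuple(si + pi for si, pi in zip(s, p))
    return s  # linear activation (identity) for simplicity

def tqnn_neuron(ws, xs, b):
    s = b
    for w, x in zip(ws, xs):
        p = qmul(x, w)  # reversed multiplication direction
        s = tuple(si + pi for si, pi in zip(s, p))
    return s
```

Because `qmul(w, x) != qmul(x, w)` in general, the two neurons compute different functions even with a linear activation, which is consistent with a linear TQNN not collapsing into a single-layered QNN.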

