Article

Exceptional Reducibility of Complex-Valued Neural Networks

Journal

IEEE Transactions on Neural Networks
Volume 21, Issue 7, Pages 1060-1072

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/TNN.2010.2048040

Keywords

Complex-valued neural networks; minimality; reducibility; rotation-equivalence

A neural network is referred to as minimal if the number of its hidden neurons cannot be reduced without changing the input-output map. A condition under which the number of hidden neurons can be reduced is referred to as a reducibility. Real-valued neural networks have only three simple types of reducibility, and these extend naturally to complex-valued neural networks whose hidden neurons have no bias terms. However, general complex-valued neural networks have an additional type of reducibility, referred to herein as exceptional reducibility. In this paper, the exceptional reducibility is presented, and a method by which to minimize complex-valued neural networks is proposed.
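
To make the notion of reducibility concrete, the following is a minimal sketch (not the authors' method) of a single-hidden-layer complex-valued network y(x) = sum_j c_j tanh(w_j . x + b_j), together with a check for the three simple reducibility conditions known from the real-valued case: a zero output weight, a zero input weight vector (a constant hidden neuron), and two hidden neurons that coincide up to sign. The function name and tolerance are illustrative, and the exceptional reducibility analyzed in the paper is not covered here.

import numpy as np

def classically_reducible(W, b, c, tol=1e-12):
    """Check the three simple reducibility conditions for y(x) = sum_j c[j]*tanh(W[j] @ x + b[j]).

    W: (H, d) complex input weights, b: (H,) complex biases, c: (H,) complex output weights.
    Returns True if the hidden layer can be shrunk without changing the input-output map.
    """
    H = W.shape[0]
    for j in range(H):
        if abs(c[j]) < tol:               # zero output weight: neuron j contributes nothing
            return True
        if np.linalg.norm(W[j]) < tol:    # zero input weights: neuron j is constant
            return True
        for k in range(j + 1, H):
            same = np.linalg.norm(W[j] - W[k]) < tol and abs(b[j] - b[k]) < tol
            # tanh is odd, so a sign-flipped twin is redundant as well
            flipped = np.linalg.norm(W[j] + W[k]) < tol and abs(b[j] + b[k]) < tol
            if same or flipped:           # duplicate neurons: merge their output weights
                return True
    return False

# Example: two identical hidden neurons, hence reducible to a single one.
W = np.array([[1 + 1j, 2j], [1 + 1j, 2j]])
b = np.array([0.5j, 0.5j])
c = np.array([1.0 + 0j, -0.3 + 0.2j])
print(classically_reducible(W, b, c))  # True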

Authors

Tohru Nitta
