Article

Heterogeneous Multilayer Generalized Operational Perceptron

Journal

IEEE Transactions on Neural Networks and Learning Systems

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TNNLS.2019.2914082

Keywords

Neurons; Biological neural networks; Network topology; Learning systems; Topology; Computational modeling; Nonhomogeneous media; Architecture learning; feedforward network; generalized operational perceptron (GOP); progressive learning

Funding

  Academy of Finland (AKA) [289364]

Abstract

The traditional multilayer perceptron (MLP) using the McCulloch-Pitts neuron model is inherently limited to a fixed set of neuronal activities, i.e., a linear weighted sum followed by a nonlinear thresholding step. Previously, the generalized operational perceptron (GOP) was proposed to extend the conventional perceptron model by defining a diverse set of neuronal activities that imitate a generalized model of biological neurons. Together with the GOP, a progressive operational perceptron (POP) algorithm was proposed to optimize a predefined template of multiple homogeneous layers in a layerwise manner. In this paper, we propose an efficient algorithm to learn a compact, fully heterogeneous multilayer network that allows each individual neuron, regardless of the layer, to have distinct characteristics. Depending on the complexity of the problem, the proposed algorithm operates progressively at the neuron level, searching for a compact topology not only in terms of depth but also of width, i.e., the number of neurons in each layer. The proposed algorithm is shown to outperform other related learning methods in extensive experiments on several classification problems.
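
The contrast between the McCulloch-Pitts neuron and a GOP neuron can be made concrete with a short sketch. The following Python/NumPy snippet is illustrative only and is not the authors' implementation: the operator libraries (NODAL_OPS, POOL_OPS, ACT_OPS) and the GOPNeuron class are assumed names, and the listed operators are just a small sample of the kind of library a GOP draws from. It shows a neuron defined by a (nodal, pooling, activation) triple and how two neurons in the same layer can use different triples, which is the per-neuron heterogeneity the proposed algorithm searches over.

# A minimal sketch of a generalized operational perceptron (GOP) neuron.
# Operator names and this NumPy implementation are illustrative assumptions,
# not the authors' code.
import numpy as np

# Candidate nodal operators psi(x, w): applied elementwise to each input/weight pair.
NODAL_OPS = {
    "multiplication": lambda x, w: w * x,                  # McCulloch-Pitts choice
    "exponential":    lambda x, w: np.exp(w * x) - 1.0,
    "sinusoid":       lambda x, w: np.sin(w * x),
}

# Candidate pooling operators rho(z): reduce the elementwise results to a scalar.
POOL_OPS = {
    "summation": lambda z: np.sum(z, axis=-1),              # McCulloch-Pitts choice
    "median":    lambda z: np.median(z, axis=-1),
    "maximum":   lambda z: np.max(z, axis=-1),
}

# Candidate activation operators f(y).
ACT_OPS = {
    "sigmoid": lambda y: 1.0 / (1.0 + np.exp(-y)),
    "tanh":    np.tanh,
    "relu":    lambda y: np.maximum(y, 0.0),
}


class GOPNeuron:
    """One neuron = one (nodal, pooling, activation) triple plus its own weights.

    In a heterogeneous network every neuron may pick a different triple, whereas
    a classical MLP neuron is fixed to (multiplication, summation, thresholding).
    """

    def __init__(self, n_inputs, nodal="multiplication",
                 pool="summation", act="sigmoid", rng=None):
        rng = rng or np.random.default_rng()
        self.w = rng.normal(scale=0.1, size=n_inputs)
        self.b = 0.0
        self.nodal = NODAL_OPS[nodal]
        self.pool = POOL_OPS[pool]
        self.act = ACT_OPS[act]

    def forward(self, x):
        z = self.nodal(x, self.w)        # elementwise nodal operation
        y = self.pool(z) + self.b        # pooling over inputs, plus bias
        return self.act(y)               # activation


# Usage: two neurons in the same layer with different operator triples,
# i.e., a tiny heterogeneous layer.
if __name__ == "__main__":
    x = np.array([0.2, -0.5, 0.9])
    n1 = GOPNeuron(3)                                              # MLP-like neuron
    n2 = GOPNeuron(3, nodal="sinusoid", pool="maximum", act="tanh")
    print(n1.forward(x), n2.forward(x))

A standard MLP neuron corresponds to the fixed triple (multiplication, summation, sigmoid-like thresholding); the progressive scheme described above can then be read, roughly, as growing the network a few neurons at a time and keeping, for each added neuron, the operator triple that best fits the data.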
