Article

Efficient training for dendrite morphological neural networks

Journal

NEUROCOMPUTING
Volume 131, Pages 132-142

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2013.10.031

Keywords

Dendrite morphological neural network; Efficient training; Pattern recognition; Classification

Funding

  1. SIP-IPN
  2. CONACYT
  3. ICYTDF [20121311, 20131182, 15014, 325/2011]

This paper introduces an efficient training algorithm for dendrite morphological neural networks (DMNNs). Given p classes of patterns C_k, k = 1, 2, ..., p, the algorithm takes the patterns of all classes and opens an n-dimensional hyper-cube HC^n large enough that all class elements lie inside it. The size of HC^n can be chosen so that the boundary patterns lie on some of its faces, or it can be made larger; this larger choice makes the trained DMNN a very efficient classifier in the presence of noise at testing time, as shown later. In a second step, the algorithm divides HC^n into 2^n smaller hyper-cubes and verifies whether each of them encloses patterns of only one class. If this is the case, the learning process stops and the DMNN is designed. If at least one hyper-cube encloses patterns of more than one class, it is divided into 2^n smaller hyper-cubes, and the verification is repeated iteratively on each smaller hyper-cube until the stopping criterion is satisfied; at that point the DMNN is designed. The algorithm was tested on benchmark problems and its performance compared against several reported algorithms, showing its superiority. (C) 2013 Elsevier B.V. All rights reserved.
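A minimal Python sketch of the subdivision scheme described above, under the assumption that each single-class hyper-cube becomes one box (dendrite) of the trained network. The names (train_dmnn, predict, margin, max_depth) are illustrative and not the authors' implementation; margin mirrors the option of opening a larger initial hyper-cube for noise tolerance, and max_depth is an assumed safeguard against unbounded subdivision.

```python
# Sketch of hyper-cube subdivision training for a DMNN-style classifier.
# Assumptions are noted in comments; this is not the paper's reference code.
import itertools
import numpy as np


def enclosing_cube(X, margin=0.0):
    """Axis-aligned hyper-cube HC^n containing all patterns; `margin`
    enlarges it, mimicking the 'bigger size' option for noise tolerance."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    center = (lo + hi) / 2.0
    half = (hi - lo).max() / 2.0 + margin  # equal half-length on every axis
    return center - half, center + half


def subdivide(lo, hi):
    """Split a hyper-cube into 2^n equal sub-cubes by halving every axis."""
    mid = (lo + hi) / 2.0
    n = lo.shape[0]
    for choice in itertools.product((0, 1), repeat=n):
        mask = np.array(choice) == 0
        yield np.where(mask, lo, mid), np.where(mask, mid, hi)


def train_dmnn(X, y, margin=0.0, max_depth=12):
    """Return a list of (lo, hi, class_label) boxes, each enclosing patterns
    of a single class; these boxes play the role of dendrites."""
    boxes = []
    queue = [(*enclosing_cube(X, margin), 0)]
    while queue:
        lo, hi, depth = queue.pop()
        inside = np.all((X >= lo) & (X <= hi), axis=1)
        labels = np.unique(y[inside])
        if labels.size == 0:
            continue  # empty cube: discard
        if labels.size == 1:
            boxes.append((lo, hi, labels[0]))  # single class: keep as dendrite
        elif depth >= max_depth:
            # assumed fallback: label an unresolved cube by its majority class
            vals, counts = np.unique(y[inside], return_counts=True)
            boxes.append((lo, hi, vals[np.argmax(counts)]))
        else:
            for new_lo, new_hi in subdivide(lo, hi):
                queue.append((new_lo, new_hi, depth + 1))
    return boxes


def predict(boxes, x):
    """Classify x by the first learned box that contains it."""
    for lo, hi, label in boxes:
        if np.all(x >= lo) and np.all(x <= hi):
            return label
    return None  # outside every learned box


if __name__ == "__main__":
    # Toy two-class example with an enlarged initial cube (margin > 0).
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    boxes = train_dmnn(X, y, margin=0.1)
    print(len(boxes), predict(boxes, np.array([1.9, 2.1])))
```

In this sketch a test pattern is simply assigned to the class of the first box that contains it; patterns falling outside every box would need an extra rule (for instance, the nearest box), which the paper's min/max dendrite computation handles more formally.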
