Article

Deep neural-kernel blocks

Journal

NEURAL NETWORKS
Volume 116, Pages 46-55

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2019.03.011

Keywords

Deep learning; Neural networks; Kernel methods; Pooling layer; Competitive learning; Dimensionality reduction

Funding

  1. Postdoctoral Fellowship of the Research Foundation-Flanders [FWO: 12Z1318N]


This paper introduces novel deep architectures that use the hybrid neural-kernel core model as the first building block. The proposed models combine a neural-network-based architecture with a kernel-based model enriched with pooling layers. In particular, three kernel blocks with average, maxout, and pointwise convolutional pooling layers are introduced and examined. We start with a simple merging layer that averages the outputs of the previous representation layers. The maxout layer, on the other hand, triggers competition among different representations of the input; thanks to this pooling layer, not only is the dimensionality of the multi-scale representations reduced, but multiple sub-networks are also formed within the same model. In the same context, the pointwise convolutional layer is employed with the aim of projecting the multi-scale representations onto a new space. Experimental results show an improvement over the core deep hybrid model as well as over kernel-based models on several real-life datasets. (C) 2019 Elsevier Ltd. All rights reserved.
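The three pooling layers described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names and the shape conventions (a list of S multi-scale representation outputs, each of shape (n, d)) are assumptions made for illustration, and the pointwise convolution is modeled as a learned (k, S) mixing matrix applied across the scale axis, i.e., a 1x1 convolution over scales.

```python
import numpy as np

def average_merge(reps):
    """Average-pooling merge: mean over the S multi-scale outputs."""
    # reps: list of S arrays, each (n, d) -> result (n, d)
    return np.mean(np.stack(reps, axis=0), axis=0)

def maxout_merge(reps):
    """Maxout merge: elementwise max across scales, so representations
    compete and only the strongest response per unit survives."""
    # reps: list of S arrays, each (n, d) -> result (n, d)
    return np.max(np.stack(reps, axis=0), axis=0)

def pointwise_conv_merge(reps, weights):
    """Pointwise (1x1) convolutional merge: a learned linear mix of the
    S scales, projecting them onto k new channels.  `weights` is a
    hypothetical (k, S) kernel; in the paper this would be trained."""
    stacked = np.stack(reps, axis=0)                    # (S, n, d)
    return np.einsum('ks,snd->knd', weights, stacked)   # (k, n, d)
```

As a usage sketch, with two scales the average merge halves the dimensionality of the concatenated representation (from 2d to d), the maxout merge does the same while selecting per-unit winners, and the pointwise convolution reduces or expands the scale axis depending on the chosen k.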
