Journal
FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE
Volume 101, Pages 152-168
Publisher
ELSEVIER
DOI: 10.1016/j.future.2019.06.010
Keywords
Deep learning; Convolution; CNN; Residual network; Lightweight
Funding
- Deanship of Scientific Research, King Saud University, Riyadh, Saudi Arabia
Abstract
In this work, we present a new family of residual networks based on a tree structure. We introduce three types of tree modules that can replace one or more convolutional layers in various kinds of convolutional networks. The new architecture exposes two new hyper-parameters, tree height and branching factor, which grant finer control over the model size and the codependency between feature maps. Tree modules offer the flexibility to merge two important techniques: branching, which yields richer feature representations, and group convolution, which reduces the number of parameters. Most previous studies focused on accuracy regardless of model complexity; in contrast, we focus on the information density metric to design a model that effectively utilizes its parametric space. We conducted numerous experiments on the CIFAR-10 dataset (Canadian Institute for Advanced Research) and demonstrated that the proposed networks surpass many well-known networks in terms of both information density and accuracy. (C) 2019 Elsevier B.V. All rights reserved.
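The parameter saving that group convolution provides can be sketched with a simple weight-count calculation. This is an illustration only, not code from the paper: the helper `conv_params` and the channel counts (64 channels, 3x3 kernel, 4 groups) are assumptions chosen for the example.

```python
# Illustrative weight-count comparison for standard vs. group convolution.
# A standard k x k convolution with C_in input and C_out output channels
# holds C_in * C_out * k * k weights; splitting the channels into g groups,
# each convolving only within its group, divides that count by g.

def conv_params(c_in: int, c_out: int, k: int, groups: int = 1) -> int:
    """Weight count of a 2D convolution layer, ignoring biases."""
    assert c_in % groups == 0 and c_out % groups == 0
    return (c_in // groups) * (c_out // groups) * k * k * groups

standard = conv_params(64, 64, 3)            # 64*64*9   = 36864 weights
grouped  = conv_params(64, 64, 3, groups=4)  # 16*16*9*4 = 9216 weights

print(standard, grouped, standard // grouped)  # -> 36864 9216 4
```

Under this accounting, a branching factor of g cuts the convolutional weights by a factor of g, which is what lets the tree modules trade accuracy against model size and improve the accuracy-per-parameter (information density) figure.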