Article

MaskLayer: Enabling scalable deep learning solutions by training embedded feature sets

Journal

NEURAL NETWORKS
Volume 137, Issue -, Pages 43-53

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2021.01.015

Keywords

Quality scalable; Scalability; Deep learning; Point clouds; Compression; Semantic hashing

Funding

  1. Research Foundation - Flanders (FWO), Belgium [1S89420N]

Abstract

This study introduces MaskLayer, a novel neural network layer, together with a masked optimizer and a balancing gradient rescaling approach, to achieve quality scalability within the deep learning framework. Experimental results show that the cost of introducing scalability with MaskLayer remains limited.
Deep learning-based methods have been shown to achieve excellent results in a variety of domains; however, some important capabilities are still missing, and quality scalability is one of them. In this work, we introduce a novel and generic neural network layer, named MaskLayer. It can be integrated into any feedforward network, allowing quality scalability by design through the creation of embedded feature sets. These are obtained by imposing a specific structure on the feature vector during training. To further improve the performance, a masked optimizer and a balancing gradient rescaling approach are proposed. Our experiments show that the cost of introducing scalability using MaskLayer remains limited. To prove its generality and applicability, we integrated the proposed techniques into existing, non-scalable networks for point cloud compression and semantic hashing, with excellent results. To the best of our knowledge, this is the first work presenting a generic solution able to achieve quality scalable results within the deep learning framework. (C) 2021 Elsevier Ltd. All rights reserved.
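The record does not include code, so the following is only a minimal PyTorch-style sketch of the general idea described in the abstract: masking the tail of a feature vector during training so that its prefixes form embedded (nested) feature sets that can be truncated at inference time. The class name, the uniform choice of cut points, and the batch-level masking are assumptions for illustration, not the authors' implementation, and the masked optimizer and balancing gradient rescaling are not reproduced here.

# Hypothetical sketch of a MaskLayer-style layer (assumed PyTorch interpretation,
# not the paper's actual implementation).
import torch
import torch.nn as nn


class MaskLayer(nn.Module):
    """Zero out features beyond a randomly drawn cut point during training.

    Because any prefix of the feature vector may be the only part that
    survives the mask, the network is pushed to order information by
    importance, yielding embedded feature sets that can be truncated
    at inference time for quality-scalable outputs.
    """

    def __init__(self, num_features: int, cut_points=None):
        super().__init__()
        self.num_features = num_features
        # Allowed truncation lengths (assumption: uniform choice over these).
        self.cut_points = cut_points or list(range(1, num_features + 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training:
            return x  # at inference, the caller decides how many features to keep
        # Draw one cut point per batch (a per-sample draw would also be possible).
        k = self.cut_points[torch.randint(len(self.cut_points), (1,)).item()]
        mask = torch.zeros(self.num_features, device=x.device, dtype=x.dtype)
        mask[:k] = 1.0
        return x * mask


# Example: a small encoder whose 64-dimensional code is trained to be truncatable.
encoder = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), MaskLayer(64))
codes = encoder(torch.randn(8, 128))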
