Journal
NEUROCOMPUTING
Volume 422, Issue -, Pages 95-108
Publisher
ELSEVIER
DOI: 10.1016/j.neucom.2020.09.005
Keywords
Deep neural networks; Convolutional neural networks; Convolution kernel
Funding
- National Natural Science Foundation of China [61673322, 61673326, 91746103]
- Fundamental Research Funds for the Central Universities [20720190142]
- European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant [663830]
Abstract
This paper introduces dynamic kernel convolutional neural networks (DK-CNNs), an enhanced type of CNN whose kernel weights are extended with a latent dimension. The proposed DK-CNN applies regular convolution with dynamic kernel (DK) weights that depend on a latent variable, and discretises the space of the latent variable to extend the output with a new dimension; this process is named DK convolution. DK convolution increases the expressive capacity of the convolution operation without increasing the number of parameters, by searching for useful patterns within the newly extended dimension. In contrast to conventional convolution, which applies a fixed kernel to analyse changing features, DK convolution employs a dynamic kernel to analyse fixed features. In addition, DK convolution can replace a standard convolution layer in any CNN architecture. The proposed DK-CNNs were compared with different network structures, with and without a latent dimension, on the CIFAR and FashionMNIST data sets. The experimental results show that DK-CNNs achieve better performance than regular CNNs. (c) 2020 Elsevier B.V. All rights reserved.
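The record contains no code, but the core idea in the abstract — kernel weights generated as a function of a latent variable, with the latent space discretised to form a new output dimension — can be illustrated with a minimal NumPy sketch. All names and the cosine-based kernel-generating function below are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def conv2d(x, k):
    """Valid 2-D cross-correlation of a single-channel image x with kernel k."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def dk_conv2d(x, base_kernel, phase_kernel, num_latent=4):
    """Sketch of DK convolution: the kernel is a function of a latent
    variable z; discretising z over [0, 1) extends the output with a new
    latent dimension of size num_latent. The cosine modulation is a
    hypothetical choice of kernel-generating function."""
    zs = np.linspace(0.0, 1.0, num_latent, endpoint=False)
    outputs = [
        conv2d(x, base_kernel * np.cos(2 * np.pi * z * phase_kernel))
        for z in zs
    ]
    # Stack along the new latent axis: shape (num_latent, H', W').
    # Note the parameter count (base_kernel + phase_kernel) is fixed
    # regardless of num_latent, matching the abstract's claim.
    return np.stack(outputs, axis=0)
```

At z = 0 the modulation is identically 1, so the first latent slice reduces to an ordinary convolution with the base kernel; the remaining slices probe the same fixed features with dynamically generated kernels.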