Journal
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Volume 30, Issue 8, Pages 2295-2309
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNNLS.2018.2881143
Keywords
Convolutional autoencoder (CAE); deep learning; image classification; neural networks; particle swarm optimization (PSO)
Funding
- Marsden Fund of New Zealand Government [VUW1209, VUW1509, VUW1615]
- Huawei Industry Fund [E2880/3663]
- University Research Fund of Victoria University of Wellington [209862/3580, 213150/3662]
- National Natural Science Fund of China for Distinguished Young Scholar [61625204]
- National Natural Science Foundation of China [61803277]
Abstract
Convolutional autoencoders (CAEs) have shown remarkable performance in recent years when stacked into deep convolutional neural networks (CNNs) for image classification. However, their intrinsic architectures prevent them from constructing state-of-the-art CNNs. To address this, we propose a flexible CAE (FCAE) that removes the traditional CAE's constraints on the numbers of convolutional and pooling layers. We also design an architecture discovery method based on particle swarm optimization (PSO) that automatically searches for optimal FCAE architectures with far less computational resource and without any manual intervention. We evaluate the proposed approach on four widely used image classification data sets. Experimental results show that it significantly outperforms peer competitors, including state-of-the-art algorithms.
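The abstract describes using PSO to search over FCAE architectures. As a rough illustration only (the paper's actual encoding, fitness function, and search ranges are not given here), the sketch below runs a standard PSO over a fixed-length vector encoding hypothetical architecture choices such as the number of convolutional layers, number of pooling layers, and filters per layer; a toy surrogate stands in for the real fitness, which in the paper would come from training the decoded FCAE:

```python
import random

# Assumed search ranges for (conv layers, pooling layers, filters per layer);
# these bounds are illustrative, not taken from the paper.
BOUNDS = [(1, 8), (0, 4), (8, 128)]

def fitness(pos):
    """Toy surrogate fitness: prefers ~4 conv layers, ~2 pool layers, ~64
    filters. The real method would train the decoded FCAE and score it on
    reconstruction/classification performance instead."""
    target = (4, 2, 64)
    return -sum((p - t) ** 2 for p, t in zip(pos, target))

def pso(num_particles=10, iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Canonical PSO: each particle tracks a position, velocity, and personal
    best; the swarm shares a global best. Returns the rounded best encoding."""
    rng = random.Random(seed)
    dim = len(BOUNDS)
    pos = [[rng.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(num_particles)]
    vel = [[0.0] * dim for _ in range(num_particles)]
    pbest = [p[:] for p in pos]
    pbest_fit = [fitness(p) for p in pos]
    g = max(range(num_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]
    for _ in range(iters):
        for i in range(num_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = BOUNDS[d]
                # Position update, clipped to the search bounds.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            f = fitness(pos[i])
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], f
                if f > gbest_fit:
                    gbest, gbest_fit = pos[i][:], f
    return [round(x) for x in gbest], gbest_fit

best, best_fit = pso()
```

Each rounded position would then be decoded into a concrete FCAE architecture for evaluation; the discrete decoding step and the paper's exact variable-length encoding are omitted here.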