Article

Data Augmentation Using Random Image Cropping and Patching for Deep CNNs

Journal

IEEE Transactions on Circuits and Systems for Video Technology

Publisher

Institute of Electrical and Electronics Engineers (IEEE)
DOI: 10.1109/TCSVT.2019.2935128

Keywords

Training; Task analysis; Image color analysis; Principal component analysis; Convolutional neural networks; Jitter; Standards; Data augmentation; Image classification; Image-caption retrieval

Funding

  1. Strategic Information and Communications R&D Promotion Programme of Ministry of Internal Affairs and Communications, Japan (MIC/SCOPE) [172107101]

Abstract

Deep convolutional neural networks (CNNs) have achieved remarkable results in image processing tasks. However, their high expressive power puts them at risk of overfitting, so data augmentation techniques have been proposed to enrich training datasets and prevent overfitting. As recent CNN architectures grow in parameter count, conventional data augmentation techniques are becoming insufficient. In this study, we propose a new data augmentation technique called random image cropping and patching (RICAP), which randomly crops four images and patches them together to create a new training image. RICAP also mixes the class labels of the four images in proportion to their patch areas, giving it the benefit of soft labels. We evaluated RICAP with current state-of-the-art CNNs (e.g., the shake-shake regularization model) and compared it with competitive data augmentation techniques such as cutout and mixup. RICAP achieves a new state-of-the-art test error of 2.19% on CIFAR-10. We also confirmed that deep CNNs with RICAP achieve better results on classification tasks using CIFAR-100 and ImageNet, an image-caption retrieval task using Microsoft COCO, and other computer vision tasks.
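
To make the procedure concrete, here is a minimal sketch of the RICAP idea in PyTorch. The function name `ricap`, the batch layout, and the default `beta=0.3` (the Beta-distribution hyperparameter the paper reports working well on CIFAR) are illustrative assumptions, not the authors' reference implementation: the sketch draws a boundary point, fills the four quadrants with crops from randomly shuffled copies of the batch, and returns area-proportional label weights for a soft-label loss.

```python
import numpy as np
import torch

def ricap(images, targets, beta=0.3):
    """Patch four randomly cropped images into one training image.

    images:  float tensor of shape (B, C, H, W)
    targets: long tensor of shape (B,) with class indices
    Returns the patched batch plus per-patch labels and area weights
    for a soft-label (weighted cross-entropy) loss.
    """
    B, C, H, W = images.size()

    # Draw the boundary point of the four patches from Beta(beta, beta).
    w = int(np.round(W * np.random.beta(beta, beta)))
    h = int(np.round(H * np.random.beta(beta, beta)))
    widths = [w, W - w, w, W - w]
    heights = [h, h, H - h, H - h]

    patches, labels, weights = [], [], []
    for k in range(4):
        # Shuffle the batch so each quadrant mixes different images,
        # then crop a random region of the required size.
        idx = torch.randperm(B)
        x = np.random.randint(0, W - widths[k] + 1)
        y = np.random.randint(0, H - heights[k] + 1)
        patches.append(images[idx, :, y:y + heights[k], x:x + widths[k]])
        labels.append(targets[idx])
        # Each label's weight is the fraction of the image its patch covers.
        weights.append(widths[k] * heights[k] / (W * H))

    top = torch.cat([patches[0], patches[1]], dim=3)     # upper-left | upper-right
    bottom = torch.cat([patches[2], patches[3]], dim=3)  # lower-left | lower-right
    return torch.cat([top, bottom], dim=2), labels, weights
```

At training time the loss would then be the area-weighted sum over the four label sets, e.g. `loss = sum(w * F.cross_entropy(logits, y) for y, w in zip(labels, weights))`, with `F` being `torch.nn.functional`.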

Authors

Ryo Takahashi, Takashi Matsubara, and Kuniaki Uehara
