Article

Feature Consistency Training With JPEG Compressed Images

Journal

IEEE Transactions on Circuits and Systems for Video Technology
Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCSVT.2019.2959815

Keywords

Image coding; Distortion; Training; Transform coding; Robustness; Quantization (signal); Feature extraction; Compression artifacts; deep neural network; JPEG compression; classification robustness

Funding

  1. MOST of Taiwan [105-2218-E-009-001/107-2221-E-009-125-MY3]
  2. National Science Foundation of USA [DMS-1721550/DMS-1811920]

Abstract

Deep neural networks (DNNs) have recently been found to be vulnerable to JPEG compression artifacts, which distort their feature representations and lead to serious accuracy degradation. Most existing training methods that aim to address this problem add compressed images to the training data to enhance the robustness of DNNs. However, their improvements are limited, since these methods usually regard the compressed images as new training samples rather than as distorted versions of the raw samples; the feature distortions between the raw images and the compressed images are not investigated. In this work, we propose a new training method, called Feature Consistency Training, that is designed to minimize the feature distortions caused by JPEG artifacts. At each training iteration, we simultaneously feed a raw image and its compressed version, with a randomly sampled quality factor, into a DNN model and extract features from its internal layers. By adding a feature consistency constraint to the objective function, the feature distortions in the representation space are minimized so that the network learns robust filters. In addition, we present a residual mapping block that takes the quality factor of the compressed image as additional information to further reduce the feature distortion. Extensive experiments demonstrate that our method outperforms several existing training methods on JPEG compressed images. Furthermore, DNN models trained with our method are found to be more robust to unseen distortions.
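
The training procedure described in the abstract can be illustrated, at a high level, by the following minimal PyTorch sketch. This is not the authors' implementation: the model is assumed (hypothetically) to return both class logits and an internal feature map, and the loss weight lambda_fc, the sampled quality range, and the choice of an L2 (MSE) consistency penalty are illustrative assumptions.

import io
import random

import torch.nn.functional as F
from PIL import Image
from torchvision import transforms

to_tensor = transforms.ToTensor()


def jpeg_compress(pil_img, quality):
    """Re-encode a PIL image as JPEG at the given quality factor."""
    buf = io.BytesIO()
    pil_img.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf).convert("RGB")


def feature_consistency_step(model, pil_img, label, optimizer, lambda_fc=1.0):
    """One training iteration on a raw image and its JPEG-compressed copy.

    Assumption: model(x) returns (logits, internal_features); label is a
    LongTensor of shape [1] holding the class index.
    """
    quality = random.randint(10, 95)  # randomly sampled JPEG quality factor
    raw = to_tensor(pil_img.convert("RGB")).unsqueeze(0)
    compressed = to_tensor(jpeg_compress(pil_img, quality)).unsqueeze(0)

    logits_raw, feat_raw = model(raw)
    logits_cmp, feat_cmp = model(compressed)

    # Classification loss on both views, plus a feature-consistency term that
    # penalizes the distance between the raw and compressed feature maps.
    cls_loss = F.cross_entropy(logits_raw, label) + F.cross_entropy(logits_cmp, label)
    fc_loss = F.mse_loss(feat_cmp, feat_raw)
    loss = cls_loss + lambda_fc * fc_loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

In practice the consistency term would be computed on one or more internal layers and combined with the paper's quality-conditioned residual mapping block; the single-layer MSE shown here is only meant to convey the structure of the objective.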
