Article

Feature Consistency Training With JPEG Compressed Images

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCSVT.2019.2959815

Keywords

Image coding; Distortion; Training; Transform coding; Robustness; Quantization (signal); Feature extraction; Compression artifacts; deep neural network; JPEG compression; classification robustness

Funding

  1. Ministry of Science and Technology (MOST) of Taiwan [105-2218-E-009-001/107-2221-E-009-125-MY3]
  2. National Science Foundation of the USA [DMS-1721550/DMS-1811920]

Abstract

Deep neural networks (DNNs) have recently been found to be vulnerable to JPEG compression artifacts, which distort the feature representations of DNNs and lead to severe accuracy degradation. Most existing training methods that aim to address this problem add compressed images to the training data to enhance the robustness of DNNs. However, their improvements are limited, since these methods usually regard the compressed images as new training samples rather than as distorted versions of the originals; the feature distortions between the raw images and the compressed images are not investigated. In this work, we propose a new training method, called Feature Consistency Training, that is designed to minimize the feature distortions caused by JPEG artifacts. At each training iteration, we simultaneously feed a raw image and its compressed version, with a randomly sampled quality factor, into a DNN model and extract features from its internal layers. By adding a feature-consistency constraint to the objective function, the feature distortions in the representation space are minimized in order to learn robust filters. In addition, we present a residual mapping block that takes the quality factor of the compressed image as additional information to further reduce the feature distortion. Extensive experiments demonstrate that our method outperforms several existing training methods on JPEG compressed images. Furthermore, DNN models trained with our method are found to be more robust to unseen distortions.
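The combined objective described in the abstract can be sketched as follows. This is a minimal, framework-free illustration under my own assumptions, not the paper's implementation: `feature_consistency_loss` and its arguments are hypothetical names, the consistency term is taken to be a squared-L2 penalty summed over the chosen internal layers, and `lam` stands in for the weighting hyperparameter balancing the task loss against the consistency constraint.

```python
# Illustrative sketch (not the authors' code) of a feature-consistency
# objective: task loss on the raw image plus a penalty on the distance
# between internal features of the raw image and its JPEG-compressed copy.
def feature_consistency_loss(task_loss, feats_raw, feats_jpeg, lam=0.1):
    """Return task_loss + lam * sum of squared-L2 feature distortions.

    feats_raw / feats_jpeg: lists of flat feature vectors extracted from
    the same internal layers for the raw image and its compressed version.
    lam: hypothetical weight on the consistency term.
    """
    penalty = 0.0
    for f_raw, f_jpeg in zip(feats_raw, feats_jpeg):
        penalty += sum((a - b) ** 2 for a, b in zip(f_raw, f_jpeg))
    return task_loss + lam * penalty
```

In an actual training loop, both images would pass through shared network weights, and this combined loss would be backpropagated so the filters learn representations that change little under compression.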
