Article

Towards a physical-world adversarial patch for blinding object detection models

Journal

INFORMATION SCIENCES
Volume 556, Pages 459-471

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2020.08.087

Keywords

Adversarial patch; Adversarial attack; Object detection model; Deep neural network

Funding

  1. National Natural Science Foundation of China [61876019, 61772070]

The paper introduces a novel adversarial patch attack that makes objects of a specific class invisible to object detection models, demonstrating high transferability across different architectures and datasets. The attack fools several state-of-the-art object detection models and exposes their vulnerability in both the digital and physical worlds.
As one of the core components of computer vision, object detection models play a vital role in various security-sensitive systems. However, object detection models have been shown to be vulnerable to adversarial attacks. In this paper, we propose a novel adversarial patch attack against object detection models. Our attack can make objects of a specific class invisible to object detection models. We design a detection score to measure the detection model's output and generate the adversarial patch by minimizing this score. We successfully suppress the model's inference and fool several state-of-the-art object detection models, achieving a minimum recall of 11.02% and a maximum fooling rate of 81.00%, and we demonstrate the high transferability of the adversarial patch across different architectures and datasets. Finally, we successfully fool a real-time object detection system in the physical world, demonstrating the feasibility of transferring the digital adversarial patch to the physical world. Our work illustrates the vulnerability of object detection models to adversarial patch attacks in both the digital and physical worlds. (C) 2020 Elsevier Inc. All rights reserved.
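The core optimization described in the abstract — score the detector's output on the patched input, then adjust the patch pixels to minimize that score — can be sketched with a toy, fully hypothetical "detector" (a single sigmoid unit standing in for the real network; the weights `w`, the learning rate, and the step count are illustrative assumptions, not the paper's settings):

```python
import math
import random

# Hypothetical stand-in for an object detector: its detection score for
# the patched region is sigmoid(w . patch). The paper's actual attack
# backpropagates the score through a full detection network instead.
random.seed(0)
w = [random.gauss(0.0, 1.0) for _ in range(100)]   # frozen "detector" weights
patch = [random.random() for _ in range(100)]      # patch pixels in [0, 1]

def detection_score(p):
    logit = sum(wi * pi for wi, pi in zip(w, p))
    return 1.0 / (1.0 + math.exp(-logit))

initial_score = detection_score(patch)

# Gradient descent on the pre-sigmoid logit; sigmoid is monotone, so
# driving the logit down also drives the detection score down.
lr = 0.1
for _ in range(100):
    # d(logit)/d(patch_i) = w_i; clip to keep pixels in a valid range.
    patch = [min(1.0, max(0.0, pi - lr * wi)) for wi, pi in zip(w, patch)]

final_score = detection_score(patch)
print(f"detection score: {initial_score:.4f} -> {final_score:.2e}")
```

In a real implementation the gradient would come from backpropagating the detection score through the detector, and the optimized patch would then be printed and placed on the target object for the physical-world attack.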

