Article

Physical Adversarial Attacks for Surveillance: A Survey

Publisher

IEEE (Institute of Electrical and Electronics Engineers Inc.)
DOI: 10.1109/TNNLS.2023.3321432

Keywords

Adversarial defense; adversarial vulnerability; counter biometric surveillance; physical adversarial attacks; safety and security; surveillance systems


Modern automated surveillance techniques are heavily reliant on deep learning methods. Despite their superior performance, these learning systems are inherently vulnerable to adversarial attacks: maliciously crafted inputs designed to mislead, or trick, models into making incorrect predictions. An adversary can physically change their appearance by wearing adversarial t-shirts, glasses, or hats, or by adopting specific behaviors, to potentially evade various forms of detection, tracking, and recognition by surveillance systems, and thereby obtain unauthorized access to secure properties and assets. This poses a severe threat to the security and safety of modern surveillance systems. This article reviews recent attempts and findings in learning and designing physical adversarial attacks for surveillance applications. In particular, we propose a framework for analyzing physical adversarial attacks and, under this framework, provide a comprehensive survey of physical adversarial attacks on four key surveillance tasks: detection, identification, tracking, and action recognition. Furthermore, we review and analyze strategies for defending against physical adversarial attacks, along with methods for evaluating the strength of these defenses. The insights in this article represent an important step toward building resilience within surveillance systems against physical adversarial attacks.
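
To make the notion of a physical adversarial attack concrete, below is a minimal, hypothetical sketch (not taken from the paper) of an adversarial-patch optimization loop in PyTorch: a small image patch is optimized by gradient descent so that, when pasted into camera frames, it pushes a classifier toward an attacker-chosen label. The model, patch size, placement, target class, and the omission of physical-world transformations are all illustrative assumptions.

# Hypothetical sketch of an adversarial-patch attack; not the surveyed
# authors' method. Model, patch size, placement, target class, and step
# counts are illustrative assumptions.
import torch
import torch.nn.functional as F
import torchvision

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pretrained classifier standing in for a surveillance recognition model.
model = torchvision.models.resnet50(weights="IMAGENET1K_V1").to(device).eval()
for p in model.parameters():
    p.requires_grad_(False)

patch = torch.rand(3, 50, 50, device=device, requires_grad=True)  # learnable patch
optimizer = torch.optim.Adam([patch], lr=0.01)
target = 859  # hypothetical target label (e.g., "toaster" in ImageNet-1k)

def apply_patch(images: torch.Tensor, patch: torch.Tensor) -> torch.Tensor:
    # Paste the patch into a fixed corner of each image. Real physical
    # attacks also optimize over random scaling, rotation, lighting, and
    # printability constraints so the patch survives the camera pipeline;
    # that is omitted here for brevity.
    patched = images.clone()
    patched[:, :, :50, :50] = patch.clamp(0.0, 1.0)
    return patched

for step in range(100):
    # Stand-in random batch; a real attack would use imagery of the scene.
    images = torch.rand(8, 3, 224, 224, device=device)
    logits = model(apply_patch(images, patch))
    # Targeted attack: push predictions toward the chosen class.
    loss = F.cross_entropy(logits, torch.full((8,), target, device=device))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        patch.clamp_(0.0, 1.0)  # keep the patch a valid (printable) image

In practice, the attacks surveyed in the article additionally optimize over random physical transformations and printability constraints so that a patch printed on clothing or accessories remains effective when viewed by real cameras.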
