Journal
INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS
Volume 13, Issue 4, Pages 1065-1077
Publisher
SPRINGER HEIDELBERG
DOI: 10.1007/s13042-021-01435-0
Keywords
Natural adversarial; Adversarial examples; Trustworthy machine learning; Computer vision
Funding
- Spanish Ministry of Economy and Business [SBPLY/17/180501/000543]
- Autonomous Government of Castilla-La Mancha
- Spanish Ministry of Science, Innovation, and Universities [FPU17/04758]
The phenomenon of Adversarial Examples, in which deep neural networks are fooled by imperceptible perturbations, also arises in the real world without maliciously selected noise. Comparisons using distance and image quality metrics show that natural adversarial examples lie at a greater distance from the originals than artificially generated ones.
The phenomenon of Adversarial Examples has become one of the most intriguing topics associated with deep learning. The so-called adversarial attacks have the ability to fool deep neural networks with imperceptible perturbations. While the effect is striking, it has been suggested that such carefully selected injected noise does not necessarily appear in real-world scenarios. In contrast, some authors have looked for ways to generate adversarial noise in physical scenarios (traffic signs, shirts, etc.), thus showing that attackers can indeed fool the networks. In this paper we go beyond that and show that adversarial examples also appear in the real world without any attacker or maliciously selected noise involved. We show this using images from tasks related to microscopy, as well as general object recognition with the well-known ImageNet dataset. A comparison between these natural and the artificially generated adversarial examples is performed using distance metrics and image quality metrics. We also show that the natural adversarial examples are in fact at a greater distance from the originals than the artificially generated adversarial examples.
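The kind of comparison the abstract describes can be illustrated with a minimal sketch. The paper does not specify which metrics it uses beyond "distance metrics and image quality metrics"; the choices below (L2 distance and PSNR) are common examples assumed for illustration, and the random "natural" vs. "artificial" perturbations are synthetic stand-ins, not the paper's data.

```python
import numpy as np

def l2_distance(original: np.ndarray, perturbed: np.ndarray) -> float:
    """Euclidean (L2) distance between two images, flattened to vectors."""
    diff = original.astype(np.float64) - perturbed.astype(np.float64)
    return float(np.linalg.norm(diff))

def psnr(original: np.ndarray, perturbed: np.ndarray, max_value: float = 255.0) -> float:
    """Peak signal-to-noise ratio, a simple image quality metric (higher = closer)."""
    mse = np.mean((original.astype(np.float64) - perturbed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return float(10.0 * np.log10(max_value ** 2 / mse))

# Toy comparison: a small artificial-style perturbation vs. a larger natural-style one.
rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(32, 32, 3)).astype(np.float64)
artificial = np.clip(original + rng.normal(0.0, 1.0, original.shape), 0, 255)
natural = np.clip(original + rng.normal(0.0, 8.0, original.shape), 0, 255)

# The larger perturbation sits farther from the original by both measures,
# mirroring the paper's finding for natural adversarial examples.
print(l2_distance(original, artificial) < l2_distance(original, natural))
print(psnr(original, artificial) > psnr(original, natural))
```

In practice one would run such metrics over pairs of (original, adversarial) images from the dataset and compare the resulting distributions, rather than single synthetic samples as here.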