Article

CGN: Class gradient network for the construction of adversarial samples

Journal

INFORMATION SCIENCES
Volume 654

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2023.119855

Keywords

Adversarial samples; Class gradient matrix; Generator; Transferability


This paper proposes a method based on class gradient networks for generating high-quality adversarial samples. By introducing a high-level class gradient matrix and combining a classification loss with a perturbation loss, the method improves the transferability of adversarial samples in targeted attacks.
Deep neural networks (DNNs) have achieved tremendous success in several computer vision-related fields. Nevertheless, prior research shows that DNNs are vulnerable to adversarial sample attacks: attackers add carefully designed perturbation noise to clean samples to form adversarial samples that can cause the DNN to mispredict. Consequently, the safety of deep learning has attracted much attention, and researchers have begun exploring adversarial samples from different perspectives. In this paper, a method based on class gradient networks (CGN) is proposed that generates high-quality adversarial samples by optimizing multiple objective functions. Specifically, a high-level class gradient matrix is introduced to guide changes in the adversarial sample's high-level features, and a classification loss and a perturbation loss are combined to jointly train a generator that fits the distribution of adversarial noise. We conducted experiments on two standard datasets, Fashion-MNIST and CIFAR-10. The results demonstrate the superiority of our method in the transferability of adversarial samples under targeted attacks and show that it outperforms the baseline method.
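To make the combined objective described in the abstract concrete, below is a minimal PyTorch-style sketch, not the authors' code: the generator architecture, the names NoiseGenerator and cgn_style_step, the perturbation budget eps, and the weight lambda_pert are all illustrative assumptions. It shows a generator trained with a targeted classification loss plus a perturbation-magnitude loss; the paper's high-level class gradient matrix term depends on model internals and is only indicated by a comment.

```python
# Hypothetical sketch of a CGN-style training step (assumed details, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class NoiseGenerator(nn.Module):
    """Maps a clean image to a bounded perturbation (assumed architecture)."""

    def __init__(self, channels=3, eps=8 / 255):
        super().__init__()
        self.eps = eps
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1),
        )

    def forward(self, x):
        # tanh keeps the generated noise within the L-infinity budget eps.
        return self.eps * torch.tanh(self.net(x))


def cgn_style_step(generator, target_model, optimizer, x, target_class,
                   lambda_pert=10.0):
    """One training step combining classification loss and perturbation loss.

    target_model is the (frozen) classifier under attack; target_class is the
    label the adversarial sample should be pushed toward (targeted attack).
    """
    optimizer.zero_grad()
    delta = generator(x)
    x_adv = torch.clamp(x + delta, 0.0, 1.0)

    logits = target_model(x_adv)
    # Classification loss: push predictions toward the chosen target class.
    cls_loss = F.cross_entropy(logits, target_class)
    # Perturbation loss: keep the added noise small so the sample stays close to the clean one.
    pert_loss = delta.pow(2).mean()
    # The paper additionally guides the sample's high-level features with a
    # class gradient matrix; that term is omitted in this sketch.
    loss = cls_loss + lambda_pert * pert_loss
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this reading of the abstract, the classification loss drives the targeted misclassification while the perturbation loss keeps the noise imperceptible; the actual loss weights, norm choice, and the class-gradient-matrix guidance would follow the paper's definitions.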
