Article

Copycat CNN: Are random non-labeled data enough to steal knowledge from black-box models?

Journal

PATTERN RECOGNITION
Volume 113

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2021.107830

Keywords

Deep learning; Convolutional neural network; Neural network attack; Stealing network knowledge; Knowledge distillation

Funding

  1. Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) [001]
  2. Fundação de Amparo à Pesquisa e Inovação do Espírito Santo (FAPES) [594/2018]
  3. Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Abstract

Convolutional neural networks have been successful in enabling companies to develop neural-based products, but concerns arise about model security and copying. The authors proposed a method to copy black-box models and further consolidated and expanded the approach, demonstrating the effectiveness of natural random images in generating copycats for various problems.
Convolutional neural networks have been successful lately, enabling companies to develop neural-based products. This demands an expensive process involving data acquisition and annotation, as well as model generation, which usually requires experts. Given all these costs, companies are concerned about the security of their models against copies, and deliver them as black boxes accessed through APIs. Nonetheless, we argue that even black-box models still have some vulnerabilities. In a preliminary work, we presented a simple, yet powerful, method to copy black-box models by querying them with natural random images. In this work, we consolidate and extend the copycat method: (i) some constraints are waived; (ii) an extensive evaluation with several problems is performed; (iii) models are copied between different architectures; and (iv) a deeper analysis is performed by looking at the copycat behavior. Results show that natural random images are effective for generating copycats for several problems. (c) 2021 Elsevier Ltd. All rights reserved.
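The copycat procedure described above can be sketched in miniature. This is a hypothetical illustration, not the authors' code: the "black box" here is a hidden linear classifier standing in for a remote prediction API, the probe images are random vectors standing in for natural random images, and the copycat is a simple softmax regression trained only on the labels returned by the black box.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the victim's black-box API: a hidden classifier the
# attacker can query but never inspect (weights are unknown to them).
HIDDEN_W = rng.normal(size=(64, 3))

def black_box_predict(images):
    """Return only hard labels, as a real prediction API would."""
    return np.argmax(images @ HIDDEN_W, axis=1)

# Steps 1-2: probe the black box with random "images" and keep its answers
# as a stolen training set (no access to the original annotated data).
probes = rng.normal(size=(2000, 64))
stolen_labels = black_box_predict(probes)

# Step 3: train the copycat (softmax regression via full-batch gradient
# descent) to imitate the black box on the stolen pairs.
W = np.zeros((64, 3))
for _ in range(300):
    logits = probes @ W
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    grad = probes.T @ (probs - np.eye(3)[stolen_labels]) / len(probes)
    W -= 0.5 * grad

# Measure how often the copycat agrees with the black box on fresh queries.
test = rng.normal(size=(500, 64))
agreement = np.mean(np.argmax(test @ W, axis=1) == black_box_predict(test))
print(f"copycat/black-box agreement: {agreement:.2f}")
```

The key design point mirrored here is that the attacker never sees the victim's weights or original training data: only query/label pairs are used, yet the copycat can still reproduce the black box's decision boundary with high fidelity.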
