Article

Protect, show, attend and tell: Empowering image captioning models with ownership protection

Journal

PATTERN RECOGNITION
Volume 122, Issue -, Pages -

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2021.108285

Keywords

Image captioning; Ownership protection; Deep neural network; Recurrent neural network; Long short-term memory

Funding

  1. Fundamental Research Grant Scheme (FRGS) MoHE Grant [FP021-2018A]
  2. Ministry of Higher Education Malaysia

Abstract

This paper proposes two different embedding schemes in a recurrent neural network to protect image captioning models; the schemes do not compromise the original performance and can withstand both removal and ambiguity attacks.
By and large, existing Intellectual Property (IP) protection for deep neural networks typically i) focuses on the image classification task only, and ii) follows a standard digital watermarking framework conventionally used to protect the ownership of multimedia and video content. This paper demonstrates that the current digital watermarking framework is insufficient to protect the image captioning task, which is often regarded as one of the frontier AI problems. As a remedy, this paper studies and proposes two different embedding schemes in the hidden memory state of a recurrent neural network to protect the image captioning model. Empirically, we show that a forged key yields an unusable image captioning model, defeating the purpose of infringement. To the best of our knowledge, this work is the first to propose ownership protection for the image captioning task. Extensive experiments also show that the proposed method does not compromise the original image captioning performance on all common captioning metrics on the Flickr30k and MS-COCO datasets, while withstanding both removal and ambiguity attacks. Code is available at https://github.com/jianhanlim/ipr-imagecaptioning (c) 2021 Elsevier Ltd. All rights reserved.
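The abstract describes embedding an ownership key into the hidden memory state of the caption decoder so that only the correct key reproduces usable captions. The sketch below is a minimal illustration of that general idea, not the authors' released implementation: it assumes a hypothetical PyTorch LSTM decoder (`KeyedCaptionDecoder`, `key_to_signs`) whose hidden state is modulated element-wise by a sign vector derived from a secret key, so decoding with a forged key perturbs every time step and degrades the generated caption.

```python
import hashlib
import torch
import torch.nn as nn

def key_to_signs(key: str, hidden_size: int) -> torch.Tensor:
    """Derive a deterministic {-1, +1} vector from an owner's key string.
    Illustrative only; the paper's actual key scheme may differ."""
    seed = int.from_bytes(hashlib.sha256(key.encode()).digest()[:4], "big")
    gen = torch.Generator().manual_seed(seed)
    return torch.randint(0, 2, (hidden_size,), generator=gen).float() * 2 - 1

class KeyedCaptionDecoder(nn.Module):
    """Toy LSTM caption decoder whose hidden state is gated by a key-derived sign vector."""
    def __init__(self, vocab_size: int, embed_size: int = 256, hidden_size: int = 512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_size)
        self.lstm = nn.LSTMCell(embed_size, hidden_size)
        self.fc = nn.Linear(hidden_size, vocab_size)
        self.hidden_size = hidden_size

    def forward(self, image_feat: torch.Tensor, tokens: torch.Tensor, key: str) -> torch.Tensor:
        # image_feat: (batch, hidden_size) global image feature; tokens: (batch, T) word ids.
        signs = key_to_signs(key, self.hidden_size).to(image_feat.device)
        h, c = image_feat, torch.zeros_like(image_feat)
        logits = []
        for t in range(tokens.size(1)):
            h, c = self.lstm(self.embed(tokens[:, t]), (h, c))
            h = h * signs  # embed the key into the hidden memory state at every step
            logits.append(self.fc(h))
        return torch.stack(logits, dim=1)

# Usage: decoding with the owner's key matches the sign pattern the model was trained
# with; a forged key flips that pattern, so the produced captions become unusable.
decoder = KeyedCaptionDecoder(vocab_size=1000)
feats = torch.randn(2, 512)
toks = torch.randint(0, 1000, (2, 5))
out_good = decoder(feats, toks, key="owner-secret")
out_bad = decoder(feats, toks, key="forged-key")
```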

