3.8 Proceedings Paper

Conditional BERT Contextual Augmentation

Journal

COMPUTATIONAL SCIENCE - ICCS 2019, PT IV
Volume 11539, Pages 84-95

Publisher

SPRINGER INTERNATIONAL PUBLISHING AG
DOI: 10.1007/978-3-030-22747-0_7

Funding

  1. National Key Research and Development Program of China [2017YFB1010000]
  2. National Natural Science Foundation of China [61702500]

Data augmentation methods are often applied to prevent overfitting and improve the generalization of deep neural network models. Recently proposed contextual augmentation augments labeled sentences by randomly replacing words with more varied substitutions predicted by a language model. Bidirectional Encoder Representations from Transformers (BERT) demonstrates that a deep bidirectional language model is more powerful than either a unidirectional language model or the shallow concatenation of a forward and a backward model. We propose a novel data augmentation method for labeled sentences called conditional BERT contextual augmentation. We retrofit BERT to conditional BERT by introducing a new conditional masked language model task. (The term conditional masked language model appeared once in the original BERT paper, where it means context-conditional and is equivalent to masked language model; in our paper, it indicates that we apply an extra label-conditional constraint to the masked language model.) The well-trained conditional BERT can then be applied to enhance contextual augmentation. Experiments on six different text classification tasks show that our method can be easily applied to both convolutional and recurrent neural network classifiers to obtain improvements.
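To make the augmentation step concrete: one way to impose the label condition, consistent with the abstract's description, is to feed the class label through BERT's segment (token-type) embedding slot so that masked words are predicted compatibly with the label. The sketch below is not the authors' code; the model name, function, and hyperparameters are illustrative assumptions, and a stock bert-base-uncased only accommodates two label ids unless its token-type embedding table is resized and the conditional masked language model is fine-tuned first.

```python
# Minimal sketch of label-conditional masked-word substitution, assuming a
# BertForMaskedLM whose token-type (segment) embeddings have been fine-tuned
# to serve as label embeddings, as the conditional MLM idea suggests.
import random
import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def conditional_augment(sentence, label_id, mask_prob=0.15):
    """Replace randomly masked words with substitutes sampled under the label condition."""
    enc = tokenizer(sentence, return_tensors="pt")
    input_ids = enc["input_ids"].clone()

    # Feed the class label through the segment (token-type) ids,
    # the slot repurposed here to carry the label condition.
    token_type_ids = torch.full_like(input_ids, label_id)

    # Randomly mask a fraction of the non-special tokens.
    special = set(tokenizer.all_special_ids)
    candidates = [i for i, t in enumerate(input_ids[0].tolist()) if t not in special]
    masked = [i for i in candidates if random.random() < mask_prob]
    for i in masked:
        input_ids[0, i] = tokenizer.mask_token_id

    with torch.no_grad():
        logits = model(input_ids=input_ids,
                       attention_mask=enc["attention_mask"],
                       token_type_ids=token_type_ids).logits

    # Sample (rather than argmax) to obtain more varied substitutions.
    for i in masked:
        probs = torch.softmax(logits[0, i], dim=-1)
        input_ids[0, i] = int(torch.multinomial(probs, 1))

    return tokenizer.decode(input_ids[0], skip_special_tokens=True)

# Example: augment a sentence labeled as class 1 (e.g. a positive review).
print(conditional_augment("the actors are fantastic", label_id=1))
```

The augmented sentences produced this way can then be mixed with the original training data for a downstream convolutional or recurrent classifier.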
