Article

Distinguishing between fake news and satire with transformers

Journal

EXPERT SYSTEMS WITH APPLICATIONS
Volume 187, Article 115824

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2021.115824

Keywords

Fake news; Satire; Sarcasm; Deep learning; Transformers; BERT; DistilBERT; Classification

Funding

  1. Natural Sciences and Engineering Research Council of Canada (NSERC) [RGPIN2018-03872]
  2. Canada Research Chairs Program [950230623]
  3. Cluster project from Zayed University, United Arab Emirates [R16083]
  4. Zayed University, United Arab Emirates [R20093]

Abstract

The study demonstrates that applying the transformer neural network architecture to the task of distinguishing satirical news from fake news yields better results than traditional machine-learning methods. Further gains in model performance can be achieved through non-standard tokenization schemes and different pre-training strategies.
Indiscriminate elimination of harmful fake news risks destroying satirical news, which can be benign or even beneficial, because the two types of news share highly similar textual cues. In this work we applied a recent development in neural network architecture, the transformer, to the task of separating satirical news from fake news; transformers had not previously been applied to this specific problem. Our evaluation on a publicly available and carefully curated dataset shows that a classifier framework built around a DistilBERT architecture outperformed existing machine-learning approaches. Additional improvement over the baseline DistilBERT model was achieved through non-standard tokenization schemes as well as varied pre-training and text pre-processing strategies. The improvement over existing approaches stands at 0.0429 (5.2%) in F1 score and 0.0522 (6.4%) in accuracy. Further evaluation on two additional datasets shows that our framework generalizes across datasets without diminished performance.
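To make the classification setup concrete, the sketch below fine-tunes DistilBERT as a binary satire-vs-fake-news classifier using the Hugging Face Transformers library. This is a minimal illustration, not the authors' exact framework: the example texts, the label convention (0 = satire, 1 = fake news), the `max_length`, learning rate, and epoch count are all illustrative assumptions, and the paper's non-standard tokenization and pre-training variants are not reproduced here.

```python
# Minimal sketch: fine-tuning DistilBERT for satire vs. fake news.
# Assumes the Hugging Face Transformers library; dataset, labels, and
# hyperparameters are illustrative stand-ins, not the paper's setup.
import torch
from torch.optim import AdamW
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=2  # assumed convention: 0 = satire, 1 = fake news
)

# Toy stand-in corpus; replace with a curated satire/fake-news dataset.
texts = [
    "Area man wins argument with toaster.",
    "Secret cure suppressed by officials, insiders claim.",
]
labels = torch.tensor([0, 1])

# Tokenize the batch with padding/truncation to a fixed length.
enc = tokenizer(texts, truncation=True, padding=True,
                max_length=256, return_tensors="pt")

optimizer = AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):  # a few epochs is typical for fine-tuning
    optimizer.zero_grad()
    out = model(**enc, labels=labels)  # returns cross-entropy loss + logits
    out.loss.backward()
    optimizer.step()

# Inference: argmax over the two class logits.
model.eval()
with torch.no_grad():
    preds = model(**enc).logits.argmax(dim=-1)
print(preds)  # tensor of predicted labels, e.g. tensor([0, 1])
```

In this framing, the tokenization step is the natural place to experiment with the non-standard schemes the abstract mentions (e.g., swapping in a differently trained subword vocabulary), since everything downstream of `tokenizer(...)` stays unchanged.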
