Article

The survey: Text generation models in deep learning

Publisher

Elsevier
DOI: 10.1016/j.jksuci.2020.04.001

Keywords

Natural Language Processing (NLP); Deep learning; Word embeddings; Recurrent Neural Networks (RNNs); Convolutional Neural Networks (CNNs); Variational Auto-Encoders (VAEs); Generative Adversarial Networks (GANs); Text generation techniques; Activation functions; Optimization techniques


This article presents recent advances in deep generative modeling, focusing on the application of deep learning to natural language processing and on the development and future directions of text generation models.
Deep learning methods use many processing layers to learn hierarchical representations of data and have achieved state-of-the-art results in several domains. Recently, new deep learning model designs and architectures have emerged in the context of Natural Language Processing (NLP). This survey gives a brief account of the advances that have occurred in the area of deep generative modeling, considering most of the relevant papers from 2015 onwards. In this paper, we review many deep learning models that have been used for text generation, summarize the various models, and present a detailed view of the past, present, and future of text generation models in deep learning. Furthermore, the survey covers deep learning approaches that have been explored and evaluated in different NLP application domains. (C) 2020 The Authors. Published by Elsevier B.V. on behalf of King Saud University.

