Article

Long Text QA Matching Model Based on BiGRU-DAttention-DSSM

Journal

MATHEMATICS
Volume 9, Issue 10

Publisher

MDPI
DOI: 10.3390/math9101129

Keywords

QA matching; long text; DSSM; BiGRU-DAttention-DSSM

Funding

  1. GDUFS Laboratory of Language Engineering and Computing Bidding Item [LEC2018ZBKT002, LEC2020ZBKT002]
  2. National Natural Science Foundation of China [61772146]
  3. Science and Technology Planning Project of Guangzhou [202002030239]

QA matching is an important task in natural language processing, but current research on text matching focuses more on short texts than on long ones. Compared with short text matching, long text matching is richer in information, but distracting information is also more frequent. This paper extracts question-and-answer pairs about psychological counseling to study long-text QA-matching technology based on deep learning. We adjust DSSM (Deep Structured Semantic Model) to make it suitable for the QA-matching task. Moreover, to better extract long-text features, we improve DSSM by enriching the text representation layer with a bidirectional GRU and an attention mechanism. Experimental results show that the resulting BiGRU-DAttention-DSSM performs better at matching questions and answers.
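The DSSM-style matching described above maps a question and each candidate answer into a shared semantic space and scores them by cosine similarity. Below is a minimal numpy sketch of that scoring step only; the embedding table, vocabulary size, and mean-pooling `encode` function are illustrative placeholders, not the paper's method (the paper replaces this encoder with a BiGRU plus attention over the token states).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical embedding table standing in for a learned text
# representation layer (the paper uses BiGRU + attention instead).
VOCAB_SIZE, EMBED_DIM = 1000, 64
embeddings = rng.standard_normal((VOCAB_SIZE, EMBED_DIM))

def encode(token_ids):
    """Map a token-id sequence to one semantic vector.

    Mean pooling is a stand-in here; the BiGRU-DAttention encoder
    would produce this vector in the actual model.
    """
    return embeddings[token_ids].mean(axis=0)

def cosine(u, v):
    """Cosine similarity between two semantic vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def rank_answers(question_ids, candidate_answer_ids):
    """DSSM-style matching: score every candidate answer against the
    question in the shared semantic space and pick the best one."""
    q = encode(question_ids)
    scores = [cosine(q, encode(a)) for a in candidate_answer_ids]
    return int(np.argmax(scores)), scores

# Toy usage with made-up token ids.
question = [1, 2, 3]
answers = [[1, 2, 4], [500, 501, 502]]
best, scores = rank_answers(question, answers)
```

In the full model, training pushes the encoder so that correct question-answer pairs score higher than negatives; only the similarity-and-ranking step is shown here.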
