Article

Topic-level knowledge sub-graphs for multi-turn dialogue generation

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 234

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2021.107499

Keywords

Knowledge-based dialogue system; Multi-turn dialogue generation; Knowledge graph; Topic-level

Funding

  1. National Natural Science Foundation of China [62076100]
  2. Fundamental Research Funds for the Central Universities, China, SCUT [D2210010, D2200150, D2201300]
  3. Science and Technology Planning Project of Guangdong Province, China [2020B0101100002]
  4. Science and Technology Programs of Guangzhou, China [201704030076, 201707010223, 201802010027, 201902010046]
  5. Science and Technology Key Projects of Guangxi Province, China [2020AA21077007]
  6. Hong Kong Research Grants Council, China [PolyU1121417, C1031-18G]
  7. Hong Kong Polytechnic University, China [1.9B0V]

Abstract

A Topic-level Knowledge aware Dialogue Generation model is proposed to capture context-aware topic-level knowledge information and enhance the topic-coherence, fluency, and diversity of generated responses. By decomposing the Knowledge Graph into topic-level sub-graphs and using a Topic-level Sub-graphs Attention Network, the model outperforms existing strong baselines in experiments on DuRecDial and KdConv datasets.
Previous multi-turn dialogue approaches based on global Knowledge Graphs (KGs) still tend to generate generic, uncontrollable, and incoherent responses. Most of them neither consider the local topic-level semantic information of KGs nor effectively merge the information of long dialogue contexts and KGs into dialogue generation. To tackle these issues, we propose a Topic-level Knowledge aware Dialogue Generation model that captures context-aware topic-level knowledge information and thereby improves the topic-coherence, fluency, and diversity of generated responses. Specifically, we first decompose the given KG into a set of topic-level sub-graphs, with each sub-graph capturing a semantic component of the input KG. We then design a Topic-level Sub-graphs Attention Network to calculate a comprehensive representation of both the sub-graphs and the previous turns of dialogue utterances, which is then decoded, together with the current turn, into a response. By using sub-graphs, our model is able to attend to different topical components of the KG and to enhance topic-coherence. We perform extensive experiments on two datasets, DuRecDial and KdConv, to demonstrate the effectiveness of our model. The experimental results demonstrate that our model outperforms existing strong baselines. (C) 2021 Elsevier B.V. All rights reserved.
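
The abstract describes the architecture only at a high level. As a rough illustration, the sketch below shows one way context-conditioned attention over topic-level sub-graph representations could be wired up in PyTorch. It is a minimal sketch under assumed design choices (mean-pooled triple embeddings per sub-graph, a GRU context encoder, scaled dot-product attention), and all names are illustrative; it is not the authors' implementation.

    # Illustrative sketch only: attend over topic-level sub-graph embeddings
    # conditioned on the dialogue context. Design choices here (mean pooling,
    # GRU encoder, dot-product attention) are assumptions, not the paper's code.
    import torch
    import torch.nn as nn


    class TopicSubgraphAttention(nn.Module):
        """Fuse dialogue context with topic-level sub-graph knowledge."""

        def __init__(self, embed_dim: int, hidden_dim: int):
            super().__init__()
            self.context_encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
            self.query_proj = nn.Linear(hidden_dim, hidden_dim)
            self.key_proj = nn.Linear(embed_dim, hidden_dim)

        def forward(self, context_emb, subgraph_triple_emb, subgraph_mask):
            # context_emb:         (batch, ctx_len, embed_dim)  embedded dialogue history
            # subgraph_triple_emb: (batch, n_sub, n_triples, embed_dim)
            # subgraph_mask:       (batch, n_sub)  1 for real sub-graphs, 0 for padding
            _, h_ctx = self.context_encoder(context_emb)        # (1, batch, hidden)
            query = self.query_proj(h_ctx.squeeze(0))           # (batch, hidden)

            # One simple choice: each sub-graph vector = mean of its triple embeddings.
            subgraph_vec = subgraph_triple_emb.mean(dim=2)       # (batch, n_sub, embed)
            keys = self.key_proj(subgraph_vec)                   # (batch, n_sub, hidden)

            # Scaled dot-product attention of the context query over the sub-graphs,
            # so the model can focus on the topical component relevant to this turn.
            scores = torch.einsum("bh,bnh->bn", query, keys) / keys.size(-1) ** 0.5
            scores = scores.masked_fill(subgraph_mask == 0, float("-inf"))
            weights = torch.softmax(scores, dim=-1)              # topic attention weights
            knowledge = torch.einsum("bn,bnh->bh", weights, keys)

            # A downstream decoder would condition on this joint representation,
            # together with the current turn, to generate the response.
            return torch.cat([query, knowledge], dim=-1), weights


    if __name__ == "__main__":
        batch, ctx_len, n_sub, n_tri, d, h = 2, 10, 4, 5, 32, 64
        model = TopicSubgraphAttention(d, h)
        rep, attn = model(torch.randn(batch, ctx_len, d),
                          torch.randn(batch, n_sub, n_tri, d),
                          torch.ones(batch, n_sub))
        print(rep.shape, attn.shape)  # torch.Size([2, 128]) torch.Size([2, 4])

In such a setup, the attention weights indicate which topical sub-graph the generated response draws on, which is one way the topic-coherence described in the abstract could be encouraged; the paper's actual network may differ in its encoders and fusion details.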

