Dynamically Retrieving Knowledge via Query Generation for Informative Dialogue Generation

30 Jul 2022  ·  Zhongtian Hu, Lifang Wang, Yangqi Chen, Yushuang Liu, Ronghan Li, Meng Zhao, Xinyu Lu, Zejun Jiang

Knowledge-driven dialog systems have recently made remarkable breakthroughs. Compared with general dialog systems, knowledge-driven dialog systems can generate more informative and knowledgeable responses when supplied with relevant knowledge in advance. In practical applications, however, the system cannot be given the corresponding knowledge beforehand, since the direction of the conversation is not known ahead of time. To make knowledge-driven dialog systems more practical, it is therefore vital to retrieve relevant knowledge based on the dialogue history. To address this problem, we design a knowledge-driven dialog system named DRKQG (Dynamically Retrieving Knowledge via Query Generation for informative dialog response). The system consists of two modules: a query generation module and a dialog generation module. First, a time-aware mechanism captures context information and generates a query for retrieving knowledge through a search engine. Then, we integrate a copy mechanism with Transformers, which allows the response generation module to produce responses derived from both the context and the retrieved knowledge. Experimental results at LIC2022 (the Language and Intelligence Technology Competition) show that our model outperforms the baseline by a large margin on automatic evaluation metrics, while human evaluation by the Baidu linguistics team shows that our system achieves impressive results on the Factually Correct and Knowledgeable criteria.
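The abstract mentions integrating a copy mechanism with Transformers so the decoder can emit tokens taken directly from the context and retrieved knowledge. The paper's exact formulation is not given here; the following is a minimal sketch of the standard pointer-generator blending step (in the style of See et al.), where a gate `p_gen` interpolates between the decoder's vocabulary distribution and a copy distribution scattered from attention over the source tokens. All names and shapes below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def copy_blend(vocab_dist, attn_weights, src_token_ids, p_gen):
    """Blend generation and copy distributions (pointer-generator sketch).

    vocab_dist:    (V,) softmax over the vocabulary from the decoder.
    attn_weights:  (T,) attention weights over the T source tokens
                   (dialogue context + retrieved knowledge), summing to 1.
    src_token_ids: (T,) vocabulary ids of those source tokens.
    p_gen:         scalar in [0, 1]; probability of generating vs. copying.
    """
    final = p_gen * vocab_dist
    # Scatter-add copy mass onto the vocabulary ids of the source tokens;
    # np.add.at handles repeated ids correctly (duplicates accumulate).
    np.add.at(final, src_token_ids, (1.0 - p_gen) * attn_weights)
    return final

# Toy example: a 10-word vocabulary, two source tokens with ids 2 and 5.
vocab_dist = np.full(10, 0.1)
attn = np.array([0.7, 0.3])
src_ids = np.array([2, 5])
out = copy_blend(vocab_dist, attn, src_ids, p_gen=0.6)
```

With this blend, a source token that the decoder attends to strongly (id 2 above) ends up far more probable than an unattended vocabulary word, which is how such models surface entities from retrieved knowledge verbatim.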
