CoHS-CQG: Context and History Selection for Conversational Question Generation

Title:
CoHS-CQG: Context and History Selection for Conversational Question Generation
Journal Title:
International Conference on Computational Linguistics (COLING 2022)
Publication Date:
13 October 2022
Citation:
Xuan Long Do, Bowei Zou, Liangming Pan, Nancy F. Chen, Shafiq Joty, and Ai Ti Aw. 2022. CoHS-CQG: Context and History Selection for Conversational Question Generation. In Proceedings of the 29th International Conference on Computational Linguistics, pages 580–591, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Abstract:
Conversational question generation (CQG) is a vital task for machines that assist humans through conversation, for example in interactive reading comprehension. Compared to traditional single-turn question generation (SQG), CQG is more challenging in that the generated question must not only be meaningful but also align with the conversation provided. Previous studies mainly focus on how to model the flow and alignment of the conversation, but to date there has been no thorough study of which parts of the context and history the model actually needs. We postulate that shortening the context and history is crucial, as it helps the model focus on optimising the conversational alignment property. To this end, we propose CoHS-CQG, a two-stage CQG framework that adopts a CoHS module to shorten the context and history of the input. In particular, CoHS selects contiguous context sentences and history turns according to their relevance scores using a top-p strategy. Our model achieves state-of-the-art performance on CoQA in both the answer-aware and answer-unaware settings.
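The top-p selection mentioned in the abstract can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the relevance scores are taken as given (in practice they would come from a learned or embedding-based scorer), history turns would be handled analogously to context sentences, and the threshold p and all names below are illustrative.

```python
# Illustrative sketch of top-p (nucleus-style) selection over relevance scores.
# Not the paper's code: the scorer, the exact treatment of history turns, and
# the threshold p are assumptions made for this example.
from typing import List


def top_p_select(scores: List[float], p: float = 0.8) -> List[int]:
    """Return indices of the highest-scoring items whose normalised
    relevance mass first reaches p, in original order."""
    total = sum(scores)
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    chosen, mass = [], 0.0
    for i in ranked:
        chosen.append(i)
        mass += scores[i] / total
        if mass >= p:
            break
    return sorted(chosen)


# Hypothetical usage with pre-computed sentence relevance scores.
context_sents = ["s1", "s2", "s3", "s4"]
sent_scores = [0.1, 0.5, 0.3, 0.1]
kept = top_p_select(sent_scores, p=0.8)            # -> [1, 2]
# Keep a contiguous span covering the selected sentences, since CoHS
# selects contiguous context sentences.
shortened_context = context_sents[kept[0]: kept[-1] + 1]
```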
License type:
Attribution 4.0 International (CC BY 4.0)
Funding Info:
This research is supported by core funding from I2R.
Grant Reference no.: CR-2021-001
Anthology ID:
2022.coling-1.48
Files uploaded:
2022coling-148.pdf (556.03 KB, PDF)