Capturing Conversational Interaction for Question Answering via Global History Reasoning

Title:
Capturing Conversational Interaction for Question Answering via Global History Reasoning
Journal Title:
Findings of the Association for Computational Linguistics: NAACL 2022
Publication Date:
26 July 2022
Citation:
Qian, J., Zou, B., Dong, M., Li, X., Aw, A., & Hong, Y. (2022). Capturing Conversational Interaction for Question Answering via Global History Reasoning. Findings of the Association for Computational Linguistics: NAACL 2022. https://doi.org/10.18653/v1/2022.findings-naacl.159
Abstract:
Conversational Question Answering (ConvQA) requires answering the current question conditioned on the observable paragraph-level context and the conversation history. Previous work has intensively studied history-dependent reasoning, which perceives and absorbs topic-related information from prior utterances during interactive encoding and yields significant improvements over history-independent reasoning. This paper further strengthens the ConvQA encoder by establishing long-distance dependencies among all utterances of a multi-turn conversation. We use multi-layer transformers to resolve these long-distance relationships, which potentially contribute to reweighting the attentive information in historical utterances. Experiments on QuAC show that our method obtains a substantial improvement (1%), yielding an F1 score of 73.7%.
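The approach described in the abstract can be pictured as a turn-level reasoning module stacked on top of a standard ConvQA encoder. The PyTorch sketch below is illustrative only and is not the authors' released implementation: the module name GlobalHistoryReasoner, the layer and head counts, and the gated fusion are assumptions, but it shows the general idea of letting every turn's pooled representation attend to all other turns before reweighting the history representations.

```python
import torch
import torch.nn as nn


class GlobalHistoryReasoner(nn.Module):
    """Hypothetical sketch: re-encode per-turn utterance representations with a
    multi-layer transformer so every turn can attend to every other turn,
    establishing long-distance dependencies across the whole conversation."""

    def __init__(self, hidden_size=768, num_layers=3, num_heads=8):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=hidden_size, nhead=num_heads, batch_first=True
        )
        self.turn_encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        # Gate controlling how much of the globally re-weighted signal is kept.
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, turn_reprs, turn_padding_mask):
        # turn_reprs:        (batch, turns, hidden) pooled representation of each
        #                    history utterance plus the current question
        # turn_padding_mask: (batch, turns) True where a turn is padding
        globally_aware = self.turn_encoder(
            turn_reprs, src_key_padding_mask=turn_padding_mask
        )
        gate = torch.sigmoid(
            self.gate(torch.cat([turn_reprs, globally_aware], dim=-1))
        )
        return gate * globally_aware + (1 - gate) * turn_reprs


if __name__ == "__main__":
    model = GlobalHistoryReasoner()
    reprs = torch.randn(2, 5, 768)              # 2 conversations, 5 turns each
    mask = torch.zeros(2, 5, dtype=torch.bool)  # no padded turns
    print(model(reprs, mask).shape)              # torch.Size([2, 5, 768])
```

In this reading, the fused turn vectors would then be injected back into the token-level encoder, so attention over historical utterances is informed by the whole conversation rather than only adjacent turns.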
License type:
Publisher Copyright
Funding Info:
This research is supported by core funding from: I2R
Grant Reference no.: CR-2021-001
Files uploaded:

2022findings-naacl159-revised.pdf (374.50 KB, PDF)