Dynamic Sliding Window Modeling for Abstractive Meeting Summarization

Title:
Dynamic Sliding Window Modeling for Abstractive Meeting Summarization
Other Titles:
Interspeech 2022
Keywords:
Publication Date:
16 September 2022
Citation:
Liu, Z., & Chen, N. (2022). Dynamic Sliding Window Modeling for Abstractive Meeting Summarization. Interspeech 2022. https://doi.org/10.21437/interspeech.2022-121
Abstract:
Summarizing spoken content with neural approaches has attracted growing research interest, as sequence-to-sequence models have improved abstractive summarization performance. However, summarizing long meeting transcripts remains challenging. Meetings are multi-party spoken discussions in which information is topically diffuse, making it harder for neural models to distill and cover essential content. Such meeting summarization tasks also cannot readily benefit from pre-trained language models, which typically have input length limitations. In this work, we exploit the observation that the topical structure of meetings tends to correlate with the meeting agenda. Inspired by this phenomenon, we propose a dynamic sliding window strategy that decomposes the long source content of meetings into smaller contextualized semantic chunks for more resourceful modeling, and we propose two methods, requiring no additional trainable parameters, for context boundary prediction. Experimental results show that the proposed framework achieves state-of-the-art abstractive summarization performance on the AMI corpus and obtains higher factual consistency than competitive baselines.
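To illustrate the windowing idea behind the abstract, the sketch below chunks a transcript with a fixed-size overlapping window. This is a simplified stand-in only: the paper's strategy is *dynamic*, predicting context boundaries from the meeting's topical structure rather than using the fixed `window_size` and `stride` assumed here.

```python
def sliding_windows(utterances, window_size=3, stride=2):
    """Split a list of utterances into overlapping chunks.

    Fixed-size illustration of sliding-window decomposition; the
    paper instead predicts chunk boundaries dynamically, so real
    windows would vary in length with the meeting's topics.
    """
    windows = []
    for start in range(0, len(utterances), stride):
        chunk = utterances[start:start + window_size]
        if chunk:
            windows.append(chunk)
        # Stop once the final utterance has been covered.
        if start + window_size >= len(utterances):
            break
    return windows


# With window_size=3 and stride=2, adjacent chunks share one
# utterance of overlapping context.
chunks = sliding_windows(["u1", "u2", "u3", "u4", "u5"])
# → [["u1", "u2", "u3"], ["u3", "u4", "u5"]]
```

Each chunk can then be summarized independently by a length-limited pre-trained model, with the overlap preserving local context across boundaries.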
License type:
Publisher Copyright
Funding Info:
This research is supported by core funding from the Institute for Infocomm Research (I2R).
Grant Reference no. : N/A

This research / project is supported by the National Research Foundation, Prime Minister’s Office - Campus for Research Excellence and Technological Enterprise (CREATE) programme
Grant Reference no. : N/A
Description:
DOI:
10.21437/Interspeech.2022-121
Files uploaded: