Soft Syntactic Reinforcement for Neural Event Extraction

Title:
Soft Syntactic Reinforcement for Neural Event Extraction
Journal Title:
Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Keywords:
Publication Date:
07 June 2025
Citation:
Hao, A., Su, J., Sun, S., & Sen, T. Y. (2025). Soft Syntactic Reinforcement for Neural Event Extraction. Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), 9466–9478. https://doi.org/10.18653/v1/2025.naacl-long.479
Abstract:
Recent event extraction (EE) methods rely on pre-trained language models (PLMs) but still suffer from errors caused by a lack of syntactic knowledge. While syntactic information is crucial for EE, effective methods for incorporating it into PLMs are still needed. To address this gap, we present a novel method for injecting syntactic information into PLM-based EE models that does not require an external syntactic parser to produce syntactic features for the task data. Instead, our proposed soft syntactic reinforcement (SSR) mechanism learns to select syntax-related dimensions of the PLM representation during pretraining on a standard dependency corpus. The adapted PLM weights and the syntax-aware representation then support the model’s predictions on the task data. On both sentence-level and document-level EE benchmarks, our method achieves state-of-the-art results, outperforming baseline models and existing syntactic reinforcement methods. To the best of our knowledge, this is the first work in this direction. Our code is available at https://github.com/Anran971/sre-naacl25.
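
As summarized in the abstract, the SSR mechanism amounts to learning a soft, per-dimension gate over the PLM's hidden states, trained under a dependency objective so that the gate favors syntax-related dimensions. The following is a minimal hypothetical sketch of such a gate, not the authors' implementation (see the repository linked above); the class name, the toy arc scorer, and all parameter choices are assumptions for illustration.

```python
# Hypothetical sketch of a soft dimension-selection gate over PLM hidden
# states, in the spirit of the SSR mechanism described in the abstract.
# Not the authors' code; see https://github.com/Anran971/sre-naacl25.
import torch
import torch.nn as nn


class SoftSyntaxGate(nn.Module):
    """Learns a soft mask over the dimensions of a PLM representation.

    During a dependency-pretraining phase, the gated representation would
    be trained to score dependency arcs, pushing the gate toward
    syntax-related dimensions; the adapted weights are then reused on EE data.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        # One learnable logit per hidden dimension; a sigmoid keeps the
        # selection "soft" rather than a hard binary mask.
        self.gate_logits = nn.Parameter(torch.zeros(hidden_size))

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, seq_len, hidden_size)
        gate = torch.sigmoid(self.gate_logits)  # (hidden_size,)
        return hidden_states * gate             # broadcast over all tokens


# Toy usage: gate a stand-in PLM output and score head-dependent arc pairs,
# as a dependency-pretraining objective might.
hidden = torch.randn(2, 16, 768)                     # stand-in PLM output
gated = SoftSyntaxGate(768)(hidden)
arc_scores = torch.einsum("bih,bjh->bij", gated, gated)  # (2, 16, 16)
```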
License type:
Attribution 4.0 International (CC BY 4.0)
Funding Info:
No specific funding was received for this research.
Description:
Anthology ID:
2025.naacl-long.479