Leveraging Contrastive Learning and Knowledge Distillation for Incomplete Modality Rumor Detection

Title:
Leveraging Contrastive Learning and Knowledge Distillation for Incomplete Modality Rumor Detection
Journal Title:
Findings of the Association for Computational Linguistics: EMNLP 2023
Publication Date:
10 December 2023
Citation:
Xu, F., Fu, P., Huang, Q., Zou, B., Aw, A., & Wang, M. (2023). Leveraging Contrastive Learning and Knowledge Distillation for Incomplete Modality Rumor Detection. Findings of the Association for Computational Linguistics: EMNLP 2023. https://doi.org/10.18653/v1/2023.findings-emnlp.900
Abstract:
Rumors spread rapidly and at relatively low cost through online social microblogs, causing substantial economic losses and other negative consequences in daily life. Existing rumor detection models often neglect the underlying semantic coherence between the text and image components of multimodal posts, as well as the challenges posed by incomplete modalities in single-modality posts, such as missing text or images. This paper presents CLKD-IMRD, a novel framework for Incomplete Modality Rumor Detection. CLKD-IMRD employs Contrastive Learning and Knowledge Distillation to capture the semantic consistency between text and image pairs while also improving model generalization to incomplete modalities within individual posts. Extensive experimental results demonstrate that CLKD-IMRD outperforms state-of-the-art methods on two English and two Chinese benchmark datasets for rumor detection on social media.
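
The abstract names two training signals: a contrastive objective that aligns the text and image embeddings of a post, and knowledge distillation from a full-modality teacher to a student that sees incomplete input. The PyTorch sketch below is purely illustrative and is not the authors' implementation; the function names, embedding dimensions, and temperature values are assumptions rather than details from the paper.

import torch
import torch.nn.functional as F


def contrastive_loss(text_emb: torch.Tensor,
                     image_emb: torch.Tensor,
                     temperature: float = 0.07) -> torch.Tensor:
    """Symmetric InfoNCE over a batch of matched (text, image) pairs."""
    text_emb = F.normalize(text_emb, dim=-1)
    image_emb = F.normalize(image_emb, dim=-1)
    logits = text_emb @ image_emb.t() / temperature  # (B, B) similarities
    targets = torch.arange(logits.size(0), device=logits.device)
    # Matched pairs lie on the diagonal; score retrieval in both directions.
    return (F.cross_entropy(logits, targets)
            + F.cross_entropy(logits.t(), targets)) / 2


def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """KL divergence from softened teacher predictions to the student."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(log_p_student, p_teacher,
                    reduction="batchmean") * temperature ** 2


if __name__ == "__main__":
    B, D, C = 8, 256, 2  # batch size, embedding dim, rumor/non-rumor classes
    text_emb, image_emb = torch.randn(B, D), torch.randn(B, D)
    teacher_logits = torch.randn(B, C)  # teacher saw the full text-image pair
    student_logits = torch.randn(B, C)  # student saw an incomplete post
    print(contrastive_loss(text_emb, image_emb).item())
    print(distillation_loss(student_logits, teacher_logits).item())

In this reading, the teacher is trained on complete text-image pairs while the student receives posts with a modality dropped (e.g., text only), and the softened teacher predictions give the student a richer target than hard labels alone.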
License type:
Attribution 4.0 International (CC BY 4.0)
Funding Info:
This research was supported by the National Natural Science Foundation of China under Grant 62162031, the Natural Science Fund project in Jiangxi province under Grant 20224ACB202010, and the National Natural Science Foundation of China under Grant 62266023.
ACL Anthology ID:
2023.findings-emnlp.900
Files uploaded:

File                        Size     Format
2023findings-emnlp900.pdf   1.18 MB  PDF