Zhou, R., Li, X., He, R., Bing, L., Cambria, E., Si, L., & Miao, C. (2022). MELM: Data Augmentation with Masked Entity Language Modeling for Low-Resource NER. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). https://doi.org/10.18653/v1/2022.acl-long.160
Abstract:
Data augmentation is an effective solution to data scarcity in low-resource scenarios. However, when applied to token-level tasks such as NER, data augmentation methods often suffer from token-label misalignment, which leads to unsatisfactory performance. In this work, we propose Masked Entity Language Modeling (MELM) as a novel data augmentation framework for low-resource NER. To alleviate the token-label misalignment issue, we explicitly inject NER labels into the sentence context, and thus the fine-tuned MELM is able to predict masked entity tokens by explicitly conditioning on their labels. Thereby, MELM generates high-quality augmented data with novel entities, which provides rich entity regularity knowledge and boosts NER performance. When training data from multiple languages are available, we also integrate MELM with code-mixing for further improvement. We demonstrate the effectiveness of MELM on monolingual, cross-lingual and multilingual NER across various low-resource levels. Experimental results show that our MELM presents substantial improvement over the baseline methods.
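The label-injection idea described in the abstract can be illustrated with a minimal sketch: each entity token is wrapped with its NER label so that a masked language model can condition on that label when filling in a masked entity position. The marker format (e.g. "<B-PER> ... </B-PER>"), the mask token string, and the helper function below are illustrative assumptions for exposition, not the authors' released implementation.

```python
# Minimal sketch of label injection + entity masking for MELM-style augmentation.
# Assumptions: BIO tags, "<TAG> token </TAG>" markers, and a "<mask>" placeholder
# compatible with whatever masked language model is later fine-tuned.
from typing import List

MASK = "<mask>"

def inject_labels_and_mask(tokens: List[str], tags: List[str],
                           mask_entities: bool = True) -> List[str]:
    """Wrap each entity token with its NER tag; optionally replace it with a mask."""
    out = []
    for tok, tag in zip(tokens, tags):
        if tag == "O":
            out.append(tok)
        else:
            body = MASK if mask_entities else tok
            out.append(f"<{tag}> {body} </{tag}>")
    return out

if __name__ == "__main__":
    sent = ["Alice", "visited", "Berlin", "yesterday", "."]
    tags = ["B-PER", "O", "B-LOC", "O", "O"]
    # Fine-tuning input keeps the original entity tokens; at generation time the
    # entity positions are masked so the model can propose novel entities that fit
    # the injected labels, keeping token-label alignment intact.
    print(" ".join(inject_labels_and_mask(sent, tags, mask_entities=False)))
    print(" ".join(inject_labels_and_mask(sent, tags, mask_entities=True)))
```

Because the label markers stay fixed around each masked position, any entity the model generates inherits the original label, which is how the framework avoids the token-label misalignment noted above.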
License type:
Publisher Copyright
Funding Info:
This research/project is supported by the Agency for Science, Technology and Research (A*STAR) - AME Programmatic Grant
Grant Reference no.: A18A2b0046