Zhan, H., Kim, J.-J., & Liu, G. (2024, April 14). Contrastive Learning with Bidirectional Transformers for Knowledge Tracing. ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). https://doi.org/10.1109/icassp48485.2024.10446887
Abstract:
Knowledge tracing aims to predict the probability that a student will correctly answer the next question based on their interaction history. Previous methods, especially contrastive learning approaches, employ left-to-right uni-directional transformers to encode historical behaviors into hidden representations. A uni-directional model can learn the hidden representation of an interaction only from the items that precede it, which limits its representational power. Inspired by the success of BERT in text understanding, we propose a novel Bidirectional Transformer encoder guided Contrastive Learning framework for deep Knowledge Tracing, named Bi-CL4KT, to generate accurate response predictions for the next question. We incorporate the Cloze task and carefully design data augmentation methods to generate high-quality positive and negative instances for contrastive learning. Extensive experiments conducted on three real-world education datasets show that the proposed method significantly outperforms state-of-the-art methods.
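For illustration only, the sketch below shows the general shape of the approach the abstract describes: a bidirectional (non-causal) transformer encoder over student interactions, a Cloze-style masked-response objective, and an InfoNCE contrastive loss between two augmented views of the same sequence. All module names, dimensions, and augmentation choices here are assumptions for the sake of a runnable example, not the authors' Bi-CL4KT implementation.

```python
# Hypothetical sketch, not the Bi-CL4KT code from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BiEncoder(nn.Module):
    def __init__(self, num_questions: int, d_model: int = 64, n_heads: int = 4,
                 n_layers: int = 2, max_len: int = 200):
        super().__init__()
        # An interaction is (question id, correctness); response id 2 acts as [MASK].
        self.q_emb = nn.Embedding(num_questions, d_model)
        self.r_emb = nn.Embedding(3, d_model)          # 0=wrong, 1=right, 2=masked
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        # No causal mask: every position attends to both past and future items.
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.cloze_head = nn.Linear(d_model, 1)        # P(correct) at masked slots

    def forward(self, q, r):
        pos = torch.arange(q.size(1), device=q.device).unsqueeze(0)
        h = self.q_emb(q) + self.r_emb(r) + self.pos_emb(pos)
        return self.encoder(h)                         # (B, L, d_model)


def info_nce(z1, z2, temperature: float = 0.1):
    """Contrastive loss between sequence-level representations of two views."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature                 # (B, B) similarity matrix
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)


def mask_responses(r, mask_prob: float = 0.15):
    """Cloze-style augmentation: hide some responses (label 2 = masked)."""
    masked = r.clone()
    mask = torch.rand_like(r, dtype=torch.float) < mask_prob
    masked[mask] = 2
    return masked, mask


# Toy usage with random data (B=8 students, L=50 interactions, 100 questions).
B, L, Q = 8, 50, 100
q = torch.randint(0, Q, (B, L))
r = torch.randint(0, 2, (B, L))
model = BiEncoder(Q)

# Two augmented views -> contrastive objective on mean-pooled representations.
r1, _ = mask_responses(r)
r2, _ = mask_responses(r)
h1, h2 = model(q, r1), model(q, r2)
loss_cl = info_nce(h1.mean(dim=1), h2.mean(dim=1))

# Cloze objective: recover the correctness of the masked interactions.
rm, mask = mask_responses(r)
logits = model.cloze_head(model(q, rm)).squeeze(-1)
loss_cloze = F.binary_cross_entropy_with_logits(logits[mask], r[mask].float())

loss = loss_cloze + loss_cl
loss.backward()
```

How the two losses are weighted, and how the positive and negative instances are actually constructed, are design choices the paper itself specifies; the uniform random masking above is only a stand-in.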
License type:
Publisher Copyright
Funding Info:
This research/project is supported by the Ministry of Education - Science of Learning Grant
Grant Reference no.: MOE-MOESOL2021-0006