Computer Science ›› 2026, Vol. 53 ›› Issue (2): 31-38. DOI: 10.11896/jsjkx.250700196

• Educational Data Mining Based on Graph Machine Learning •

  • Corresponding author: LIN Ronghua (rhlin@m.scnu.edu.cn)
  • First author: CHEN Xiaolan (xlchen@m.scnu.edu.cn)

Robust Knowledge Tracing Model Based on Two-level Contrastive Learning

CHEN Xiaolan1,6, MAO Shun2, LI Weisheng3, LIN Ronghua4, TANG Yong4,5   

  1. 1 School of Information Technology in Education,South China Normal University,Guangzhou 510631,China
    2 School of Artificial Intelligence,Guangzhou Maritime University,Guangzhou 510725,China
    3 School of Artificial Intelligence,Guangdong Open University,Guangzhou 510091,China
    4 School of Computer Science,South China Normal University,Guangzhou 510631,China
    5 Institute of Data Intelligence,Guangdong University of Science and Technology,Dongguan,Guangdong 523083,China
    6 Liyuan Primary School,Huangpu District,Guangzhou 510700,China
  • Received:2025-07-31 Revised:2025-10-29 Online:2026-02-10
  • About author:CHEN Xiaolan,born in 1997,postgraduate.Her main research interests include knowledge tracing,educational data analysis and applications,and AI-empowered primary education.
    LIN Ronghua,born in 1994,Ph.D,associate research fellow,is a member of CCF(No.C5693M).His main research interests include educational data mining,recommender systems and social network analysis.
  • Supported by:
    National Natural Science Foundation of China(62407016).


Abstract: Knowledge tracing is key to adaptive learning,aiming to assess students’ knowledge states and predict their future performance.Currently,data sparsity limits existing knowledge tracing models in both question embedding learning and student knowledge state modeling.To address this,some studies have introduced contrastive learning.However,existing contrastive learning methods rely on random perturbations of graph structures(for question embedding) or modifications of learning interaction sequences(for knowledge state modeling) to generate contrastive views,introducing noise and erroneous self-supervised signals.This results in question embeddings that are poorly suited for downstream tasks in learning systems.To overcome these limitations,this study proposes an innovative Dual-level Contrastive Learning Framework(DCLF) to simultaneously enhance question embedding learning and student knowledge state modeling in knowledge tracing.DCLF adopts a more effective contrastive paradigm that avoids altering the original data.Instead,it generates contrastive views through relational transformations of the original data or by leveraging the outputs of different neural networks on the same data.Specifically,for embedding learning,the proposed method obtains contrastive views through relational transformations of the data.For student knowledge state modeling,it encodes learning interactions with different neural networks to obtain knowledge states under various encoders.This approach extracts rich self-supervised signals from multiple contrastive views,preserves the intrinsic semantic information of the data,and effectively avoids introducing noise.Experiments on three commonly used knowledge tracing datasets demonstrate that DCLF outperforms existing knowledge tracing models.
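The knowledge-state half of the paradigm described above can be illustrated with a minimal sketch (not the authors' code): the same interaction data is passed through two different encoders, and an InfoNCE-style contrastive loss treats the two encodings of the same student as a positive pair and encodings of different students as negatives. The encoders here (two random linear maps), the batch size, and the feature dimensions are illustrative assumptions standing in for, e.g., an LSTM and an attention encoder.

```python
# Sketch of encoder-based contrastive views: no data augmentation,
# just two different encoders applied to the SAME input.
import numpy as np

rng = np.random.default_rng(0)

def l2norm(x):
    """Row-wise L2 normalization, so dot products become cosine similarities."""
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def info_nce(view_a, view_b, tau=0.5):
    """InfoNCE loss: row i of view_a and row i of view_b form a positive pair;
    all other rows in the batch act as negatives."""
    a, b = l2norm(view_a), l2norm(view_b)
    logits = a @ b.T / tau                       # cosine similarity / temperature
    idx = np.arange(len(a))
    # log-softmax over each row, then pick out the positive-pair entries
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[idx, idx].mean()

# Toy "learning interactions": 8 students x 16-dim features
x = rng.normal(size=(8, 16))

# Two different encoders over the same data yield the two knowledge-state views
enc1 = rng.normal(size=(16, 8))
enc2 = rng.normal(size=(16, 8))
loss = info_nce(x @ enc1, x @ enc2)
print(float(loss))
```

Because the input itself is never perturbed, both views retain the original semantics of the interaction sequence; the self-supervised signal comes entirely from agreement between encoders.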

Key words: Knowledge tracing, Adaptive learning, Contrastive learning, Contrastive paradigm, Self-supervised learning

CLC number:

  • TP181