Computer Science ›› 2026, Vol. 53 ›› Issue (2): 78-88. doi: 10.11896/jsjkx.250700188

• Educational Data Mining Based on Graph Machine Learning •

  • Corresponding author: LIU Dong (liudong@htu.edu.cn)
  • Author email: 2208183032@stu.htu.edu.cn

GTKT: Knowledge Tracing Model Integrating Connectivism Learning and Multi-layer Temporal Graph Transformer

LI Jiahao1, JING Junchang1, XU Qian1, LIU Dong1,2   

  1. 1 School of Computer and Information Engineering, Henan Normal University, Xinxiang, Henan 453007, China
    2 Key Laboratory of Educational Artificial Intelligence and Personalized Learning of Henan Province, Xinxiang, Henan 453007, China
  • Received: 2025-07-30  Revised: 2025-10-25  Online: 2026-02-10
  • About author: LI Jiahao, born in 1999, postgraduate. His main research interests include big data in education and knowledge tracing.
    LIU Dong, born in 1976, Ph.D., professor, Ph.D. supervisor, senior member of CCF (No. 21678S). His main research interests include big data mining in education, social network analysis, etc.
  • Supported by:
    National Natural Science Foundation of China(62072160).



Abstract: Knowledge Tracing (KT) aims to model learners' knowledge states based on their historical exercise records and predict their future performance. Traditional KT research primarily focuses on modeling learners' behavioral sequences while overlooking the topological structure among knowledge concepts. Although recent methods using static knowledge graphs have shown progress, they fail to adequately capture the dynamic graph-structured relationships among learners, questions, and knowledge concepts, thereby ignoring potential correlations in the knowledge acquisition process and limiting model generalizability and interpretability. To address these limitations, this paper proposes a Graph Transformer Knowledge Tracing (GTKT) model that integrates connectivism learning theory with a multi-layer temporal graph Transformer. Firstly, guided by connectivism learning theory, it constructs temporal learner subgraphs to represent historical exercise sequences, proposing a time-aware hierarchical subgraph sampling strategy and a neighbor co-occurrence encoder to discover latent node relationships. Secondly, based on learning and forgetting theories, it designs a multi-band temporal encoder to capture temporal characteristics in learning behaviors and builds a multi-feature fusion module integrating learner-question-knowledge-concept interactions. Thirdly, it develops a multi-layer temporal graph Transformer module for dynamic knowledge state modeling and prediction. Experimental results on six public datasets demonstrate that GTKT outperforms mainstream knowledge tracing models in predicting learner performance.
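The abstract does not detail the multi-band temporal encoder. As a rough sketch of the general idea only — a bank of geometrically spaced frequencies applied to elapsed time, so that both short-term learning gains and long-term forgetting leave distinct signatures in the encoding, in the spirit of functional time encodings used in temporal graph networks — one plausible form is shown below; the function name `multi_band_time_encoding`, the band count, and the decade-based frequency spacing are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def multi_band_time_encoding(delta_t, num_bands=8):
    """Encode elapsed times (seconds) into `num_bands` frequency bands.

    Geometrically spaced frequencies (one band per decade of time) cover
    scales from seconds up to years, so short intervals and long gaps
    between interactions produce distinguishable encodings.
    Illustrative sketch only; the paper's encoder may differ.
    """
    delta_t = np.asarray(delta_t, dtype=np.float64).reshape(-1, 1)
    # frequencies 1/10^0, 1/10^1, ..., 1/10^(num_bands-1)
    freqs = 1.0 / np.power(10.0, np.arange(num_bands, dtype=np.float64))
    return np.cos(delta_t * freqs)

# Two interactions: one 30 s after the previous answer, one 3 days after
enc = multi_band_time_encoding([30.0, 3 * 86400.0])
# enc has shape (2, 8); each row is one interaction's temporal features
```

A row of this encoding would then be concatenated with the other node and edge features before the multi-feature fusion step.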

Key words: Knowledge tracing, Connectivism learning, Educational theory, Graph Transformer, Multi-feature fusion

CLC number: 

  • TP391.6