Computer Science ›› 2026, Vol. 53 ›› Issue (2): 39-47. doi: 10.11896/jsjkx.250600005

• Educational Data Mining Based on Graph Machine Learning •

  • Corresponding author: HE Chaobo (hechaobo@foxmail.com)
  • First author's e-mail: 2023023231@m.scnu.edu.cn

Direction-aware Siamese Network for Knowledge Concept Prerequisite Relation Prediction

YANG Ming, HE Chaobo, YANG Jiaqi   

  1. School of Computer Science,South China Normal University,Guangzhou 510631,China
  • Received:2025-05-31 Revised:2025-09-28 Online:2026-02-10
  • About author:YANG Ming,born in 1999,master candidate.His main research interests include educational knowledge graphs and graph neural networks.
    HE Chaobo,born in 1981,professor,Ph.D. supervisor,is a distinguished member of CCF(No.13911D).His main research interests include graph data mining and intelligent education.
  • Supported by:
    National Natural Science Foundation of China(62477016) and Guangdong Basic and Applied Basic Research Foundation(2024A1515011758,2024A1515140144).


Abstract: Prerequisite relation prediction for knowledge concepts seeks to enhance the curriculum knowledge graph by exploring semantic and topological dependencies among concepts, thereby improving downstream applications such as large-scale resource organization and personalized learning path planning. Existing methods, which mainly rely on feature engineering and deep learning, still struggle to effectively model entity-level semantics and the directional nature of prerequisite relations, leaving room for further improvement. To address this problem, this paper proposes a direction-aware Siamese network for knowledge concept prerequisite relation learning (DSN-PRL). Firstly, DSN-PRL employs a contrastive learning-based pre-trained language model, BERT, to capture fine-grained semantic representations of knowledge concepts. It then applies a graph neural network to incorporate multi-hop topological features and enhance hierarchical structure modeling. Finally, a direction-aware Siamese network is designed to learn the directional distinctions between concept pairs for accurate prerequisite relation prediction. Experiments conducted on three benchmark datasets demonstrate that DSN-PRL outperforms existing baseline methods across multiple key evaluation metrics. In particular, compared with the best baseline model DGPL, DSN-PRL improves precision by 7.3, 2.7, and 11.4 percentage points, and F1 by 1.6, 1.3, and 4.3 percentage points, respectively.
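The core "direction-aware" idea in the abstract — a shared (Siamese) encoder whose pair-scoring head is deliberately asymmetric, so that score(a, b) and score(b, a) can differ — can be caricatured in a few lines. This is a minimal illustrative sketch, not the paper's implementation: the weights are random stand-ins, and all dimensions and names are hypothetical (in DSN-PRL the inputs would come from a contrastively fine-tuned BERT plus GNN aggregation).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 768-d BERT-style concept embeddings -> 32-d hidden.
D_IN, D_HID = 768, 32

# Shared ("Siamese") encoder weights: both concepts pass through the same projection.
W_enc = rng.normal(scale=0.02, size=(D_IN, D_HID))

# Direction-aware head: scores the ordered pair via the asymmetric feature
# [h_a ; h_b ; h_a - h_b]; swapping the inputs flips the difference term,
# so in general score(a, b) != score(b, a).
w_head = rng.normal(scale=0.02, size=(3 * D_HID,))

def encode(x):
    """Shared encoder applied identically to each concept embedding."""
    return np.tanh(x @ W_enc)

def prereq_score(x_a, x_b):
    """Probability-like score that concept a is a prerequisite of concept b."""
    h_a, h_b = encode(x_a), encode(x_b)
    feat = np.concatenate([h_a, h_b, h_a - h_b])  # order-sensitive pair feature
    return 1.0 / (1.0 + np.exp(-feat @ w_head))   # sigmoid

# Two stand-in concept embeddings (random, for illustration only).
c_a, c_b = rng.normal(size=D_IN), rng.normal(size=D_IN)
s_forward = prereq_score(c_a, c_b)   # score for the ordered pair (a, b)
s_backward = prereq_score(c_b, c_a)  # score for the reversed pair (b, a)
```

A plain symmetric head (e.g. scoring only the concatenation [h_a ; h_b ; |h_a - h_b|]) would assign both orderings the same score; the signed difference term is what lets the network separate "a before b" from "b before a".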

Key words: Prerequisite relation prediction of knowledge concepts, Pre-trained language model, Contrastive learning, Graph neural network, Siamese network
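The contrastive-learning keyword refers to the SimCLR-style objective ([22]) used for pre-training the semantic encoder. As a generic illustration (not the paper's training code; batch shapes and the temperature are assumptions), the NT-Xent loss over a batch of paired embeddings can be sketched as:

```python
import numpy as np

def nt_xent(z1, z2, tau=0.5):
    """SimCLR-style NT-Xent loss. z1[i] and z2[i] are two views of the same
    item (a positive pair); every other row in the batch acts as a negative."""
    z = np.concatenate([z1, z2], axis=0)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # unit vectors -> cosine sims
    sim = z @ z.T / tau                               # temperature-scaled similarities
    n = len(z1)
    np.fill_diagonal(sim, -np.inf)                    # a sample never pairs with itself
    # The positive for row i is row i+n (and row i-n for the second half).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(0, n)])
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    loss = -(sim[np.arange(2 * n), pos] - logsumexp)  # cross-entropy toward the positive
    return loss.mean()

rng = np.random.default_rng(1)
a = rng.normal(size=(8, 64))  # 8 items, 64-d embeddings (arbitrary toy shapes)
b = rng.normal(size=(8, 64))
loss_aligned = nt_xent(a, a)  # perfectly aligned views -> low loss
loss_random = nt_xent(a, b)   # unrelated "views" -> higher loss
```

Pulling the two views of each concept together while pushing other concepts away is what yields the fine-grained semantic representations the abstract describes.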

CLC number: TP391
[1]ZHOU K Y,MENG Q X,CAO Y Y,et al.Exploration and practice of problem-oriented knowledge graph in college physics curriculum[J].College Physics,2025,44(1):66.
[2]ZHANG C X,PENG C,LUO M Q,et al.Construction of mathematics course knowledge graph and its reasoning[J].Computer Science,2020,47(S2):573-578.
[3]WANG S,LIU L.Prerequisite concept maps extraction for automatic assessment[C]//Proceedings of the 25th International Conference Companion on World Wide Web.2016:519-521.
[4]LIN Y,ZHANG W,LIN F,et al.Knowledge-aware reasoning with self-supervised reinforcement learning for explainable recommendation in MOOCs[J].Neural Computing and Applications,2024,36(8):4115-4132.
[5]GONG J B,WANG S,WANG J L,et al.Attentional graph convolutional networks for knowledge concept recommendation in MOOCs in a heterogeneous view[C]//Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval.2020:79-88.
[6]GUAN Q,XIAO F,CHENG X,et al.KG4Ex:An explainable knowledge graph-based approach for exercise recommendation[C]//Proceedings of the 32nd ACM International Conference on Information and Knowledge Management.2023:597-607.
[7]BERGAN J R,JESKA P.An examination of prerequisite relations,positive transfer among learning tasks,and variations in instruction for a seriation hierarchy[J].Contemporary Educational Psychology,1980,5(3):203-215.
[8]TANG X S,CHENG L Y,ZHANG C H,et al.Application of large language models in the automated construction of disciplinary knowledge graphs[J].Journal of Beijing University of Posts and Telecommunications(Social Sciences Edition),2024,26(1):125.
[9]SUN H,LI Y,ZHANG Y.ConLearn:contextual-knowledge-aware concept prerequisite relation learning with graph neural network[C]//Proceedings of the 2022 SIAM International Conference on Data Mining.2022:118-126.
[10]XU G L,BAI R J.Learning with double graph for concept prerequisite discovering[J].Data Analysis and Knowledge Discovery,2024,8(5):38-45.
[11]DEVLIN J,CHANG M W,LEE K,et al.BERT:Pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies.2019:4171-4186.
[12]PAN L,LI C,LI J,et al.Prerequisite relation learning for concepts in MOOCs[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics.2017:1447-1456.
[13]LIANG C,WU Z,HUANG W,et al.Measuring prerequisite relations among concepts[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing.2015:1668-1674.
[14]LIANG C,YE J,WANG S,et al.Investigating active learning for concept prerequisite learning[C]//Proceedings of the AAAI Conference on Artificial Intelligence.2018:7913-7919.
[15]ROY S,MADHYASTHA M,LAWRENCE S,et al.Inferring concept prerequisite relations from online educational resources[C]//Proceedings of the AAAI Conference on Artificial Intelligence.2019:9589-9594.
[16]LI I,FABBRI A R,TUNG R R,et al.What should I learn first:Introducing LectureBank for NLP education and prerequisite chain learning[C]//Proceedings of the AAAI Conference on Artificial Intelligence.2019:6674-6681.
[17]JIA C,SHEN Y,TANG Y,et al.Heterogeneous graph neural networks for concept prerequisite relation learning in educational data[C]//Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies.2021:2036-2047.
[18]ZHANG J,LIN N,ZHANG X,et al.Learning concept prerequisite relations from educational data via multi-head attention variational graph auto-encoders[C]//Proceedings of the 15th ACM International Conference on Web Search and Data Mining.2022:1377-1385.
[19]MAZUMDER D,PAIK J H,BASU A.A graph neural network model for concept prerequisite relation extraction[C]//Proceedings of the 32nd ACM International Conference on Information and Knowledge Management.2023:1787-1796.
[20]QU X,SHANG X,ZHANG Y.Concept prerequisite relationprediction by using permutation-equivariant directed graph neural networks[C]//Proceedings of Machine Learning Research.2024:39-47.
[21]DU F,HU W,QIN C J,et al.A prompt-based approach for discovering prerequisite relations among concepts[C]//Proceedings of the 3rd International Conference on Internet Technology and Educational Informatization.2024:1-13.
[22]CHEN T,KORNBLITH S,NOROUZI M,et al.A simple framework for contrastive learning of visual representations[C]//Proceedings of the 37th International Conference on Machine Learning.2020:1597-1607.
[23]KHOSLA P,TETERWAK P,WANG C,et al.Supervised contrastive learning[C]//Proceedings of Advances in Neural Information Processing Systems.2020:18661-18673.
[24]WANG Z,SHEN Y,ZHANG Z,et al.Relative contrastive learning for sequential recommendation with similarity-based positive sample selection[C]//Proceedings of the 33rd ACM International Conference on Information and Knowledge Management.2024:2493-2502.
[25]WU S W,SHEN X Q,XIA R.Commonsense knowledge graph completion via contrastive pretraining and node clustering[C]//Findings of the Association for Computational Linguistics.2023:13977-13989.
[26]LIANG C,YE J B,WU Z H,et al.Recovering concept prerequisite relations from university course dependencies[C]//Proceedings of the 31st AAAI Conference on Artificial Intelligence.2017:4786-4791.
[27]YU J,LUO G,XIAO T,et al.MOOCCube:A large-scale data repository for NLP applications in MOOCs[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics.2020:3135-3142.