Computer Science ›› 2025, Vol. 52 ›› Issue (11): 82-89. doi: 10.11896/jsjkx.240900134

• Database & Big Data & Data Science •

Self-attention-based Graph Contrastive Learning for Recommendation

HU Jintao, XIAN Guangming   

  1. School of Artificial Intelligence,South China Normal University,Foshan,Guangdong 528225,China
  • Received:2024-09-23 Revised:2024-12-09 Online:2025-11-15 Published:2025-11-06
  • Corresponding author:XIAN Guangming(16649995@qq.com)
  • About author:HU Jintao(hujintao@m.scnu.edu.cn),born in 1999,postgraduate.His main research interests include recommendation systems and information retrieval.
    XIAN Guangming,born in 1975,Ph.D,associate professor.His main research interests include artificial intelligence,machine learning,big data and data mining.
  • Supported by:
    Scientific Research Innovation Project of Graduate School of South China Normal University(21RJKC15).

Abstract: With the explosive growth of Internet data,recommender systems have become crucial for addressing the problem of information overload.Graph contrastive learning-based recommendation models have demonstrated significant advantages in enhancing model performance by augmenting the user-item interaction graph.Although these models have achieved some success,most existing methods rely on perturbing the graph structure for data augmentation,which struggles to preserve the inherent semantic structure and is vulnerable to noise.To further improve recommendation performance,this paper proposes a novel self-attention-based graph contrastive learning recommendation algorithm(AttGCL).On the one hand,the integrated self-attention mechanism strengthens the connections between users and items,allowing the model to capture user preferences and individual differences more accurately.On the other hand,the ICL loss function effectively controls the relative importance of positive and negative samples,leading to better alignment between global and local representations.The method retains the essential semantics of user-item interactions,so the model not only reflects user preferences more accurately but also yields better recommendations.Experimental results on three real-world datasets show that AttGCL significantly outperforms existing graph contrastive learning methods,demonstrating its advantages in efficiency and robustness.
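
The abstract does not give AttGCL's equations, so the following PyTorch-style sketch is only one plausible reading of "graph convolutional propagation plus an integrated self-attention mechanism": here attention is used to fuse the embeddings produced at different propagation depths. All names (AttGCLEncoder, n_layers, etc.) are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch only, not the authors' released code.
    import torch
    import torch.nn as nn

    class AttGCLEncoder(nn.Module):
        def __init__(self, n_users, n_items, dim=64, n_layers=3, n_heads=4):
            super().__init__()
            self.user_emb = nn.Embedding(n_users, dim)
            self.item_emb = nn.Embedding(n_items, dim)
            self.n_layers = n_layers
            # Self-attention fuses the per-layer embeddings instead of a plain mean.
            self.attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)

        def forward(self, norm_adj):
            # norm_adj: sparse, symmetrically normalized (users+items) x (users+items) adjacency.
            x = torch.cat([self.user_emb.weight, self.item_emb.weight], dim=0)
            layer_outputs = [x]
            for _ in range(self.n_layers):
                x = torch.sparse.mm(norm_adj, x)   # LightGCN-style propagation: no weights, no nonlinearity
                layer_outputs.append(x)
            seq = torch.stack(layer_outputs, dim=1)  # (n_nodes, n_layers+1, dim)
            fused, _ = self.attn(seq, seq, seq)      # attend over propagation depths
            final = fused.mean(dim=1)
            n_users = self.user_emb.num_embeddings
            return final[:n_users], final[n_users:]

Attending over propagation depths is only one way to inject self-attention; the paper may instead apply attention over user-item neighborhoods, which the abstract alone cannot settle.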

Key words: Recommendation system, Graph contrastive learning, Self-attention mechanism, Graph convolutional network, Contrastive learning loss
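
The "Contrastive learning loss" keyword and the abstract's ICL loss are described only at a high level (weighting positive against negative samples to align global and local views); the expansion of "ICL" is not given on this page. As a hedged sketch, assuming the loss belongs to the InfoNCE family, where a temperature controls how strongly hard negatives are weighted:

    # Minimal InfoNCE-style contrastive loss; an assumption about the ICL loss family,
    # not the paper's exact objective.
    import torch
    import torch.nn.functional as F

    def info_nce(view1, view2, temperature=0.2):
        # view1, view2: (batch, dim) embeddings of the same users/items under two views.
        z1 = F.normalize(view1, dim=1)
        z2 = F.normalize(view2, dim=1)
        logits = z1 @ z2.t() / temperature                   # pairwise similarity matrix
        labels = torch.arange(z1.size(0), device=z1.device)  # diagonal = positive pairs
        return F.cross_entropy(logits, labels)               # off-diagonal columns act as negatives

Lowering the temperature sharpens the softmax and up-weights hard negatives, which is one concrete mechanism for "controlling the importance of positive and negative samples" as described in the abstract.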

CLC Number:

  • TP391