Computer Science ›› 2021, Vol. 48 ›› Issue (1): 226-232.doi: 10.11896/jsjkx.191200098

• Artificial Intelligence •

Deep Interest Factorization Machine Network Based on DeepFM

WANG Rui-ping, JIA Zhen, LIU Chang, CHEN Ze-wei, LI Tian-rui   

  1. School of Information Science and Technology,Southwest Jiaotong University,Chengdu 611756,China
  • Received:2019-12-16 Revised:2020-05-17 Online:2021-01-15 Published:2021-01-15
  • About author:WANG Rui-ping,born in 1995,postgraduate.Her main research interests include recommendation algorithm and natural language processing.
    LI Tian-rui,born in 1969,Ph.D,professor,Ph.D supervisor,is a distinguished member of China Computer Federation.His main research interests include big data intelligence,rough sets and granular computing.
  • Supported by:
    National Key R&D Program of China(2017YFB1401400).

Abstract: A recommendation system sorts and surfaces, from a mass of information, the items that may interest a user according to the user's preferences.As deep learning has achieved good results in many research fields,it has also been applied to recommendation systems.However,current deep-learning-based recommendation ranking algorithms often adopt the Embedding & MLP paradigm, which captures only high-order feature interactions.To address this limitation,DeepFM adds an FM component to this paradigm so that low-order and high-order feature interactions can be learned end-to-end.However,DeepFM cannot express the diversity of user interests.In view of this,this paper proposes a Deep Interest Factorization Machine Network(DIFMN) by introducing the multi-head attention mechanism into DeepFM.DIFMN adaptively learns the user representation according to the item to be recommended,thereby expressing the diversity of user interests.In addition,the model adds preference representations according to the type of the user's historical behaviors,so that it applies not only to tasks that record only the historical behaviors a user likes,but also to tasks that record both liked and disliked behaviors.The algorithm is implemented with tensorflow-gpu and evaluated on two public datasets,Amazon(Electronics) and MovieLens-20M.Experimental results show that RelaImpr improves by 17.70% and 35.24%,respectively,compared with DeepFM,which validates the feasibility and effectiveness of the proposed method.
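The two mechanisms the abstract relies on can be sketched briefly. Below is a minimal NumPy sketch, not the authors' implementation: the projection matrices are random stand-ins for learned parameters, and the function names are illustrative. It shows (1) candidate-aware multi-head attention pooling over a user's behavior-embedding history, which yields a different user vector for each candidate item, and (2) the RelaImpr metric as commonly defined in CTR work, i.e. the relative improvement of AUC over a base model after subtracting the 0.5 contributed by random guessing.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_interest(behaviors, candidate, num_heads=2, seed=0):
    """Attention-pool a user's behavior embeddings into a single
    candidate-aware interest vector, one projection set per head.

    behaviors: (T, d) embeddings of the user's historical items.
    candidate: (d,) embedding of the item to be recommended (the query).
    Returns a (d,) user representation that varies with the candidate.
    """
    T, d = behaviors.shape
    assert d % num_heads == 0
    dk = d // num_heads
    rng = np.random.default_rng(seed)  # random stand-ins for learned weights
    heads = []
    for _ in range(num_heads):
        Wq = rng.normal(scale=d ** -0.5, size=(d, dk))
        Wk = rng.normal(scale=d ** -0.5, size=(d, dk))
        Wv = rng.normal(scale=d ** -0.5, size=(d, dk))
        q = candidate @ Wq                 # (dk,)  query from candidate item
        k = behaviors @ Wk                 # (T, dk) keys from history
        v = behaviors @ Wv                 # (T, dk) values from history
        w = softmax(k @ q / np.sqrt(dk))   # (T,) attention over the history
        heads.append(w @ v)                # (dk,) pooled interest per head
    return np.concatenate(heads)           # (d,) candidate-aware user vector

def rela_impr(auc_model, auc_base):
    """RelaImpr = ((AUC_model - 0.5) / (AUC_base - 0.5) - 1) * 100%."""
    return ((auc_model - 0.5) / (auc_base - 0.5) - 1.0) * 100.0
```

For example, a model lifting AUC from 0.70 to 0.75 achieves a RelaImpr of 25%, which is why RelaImpr differences read much larger than raw AUC gaps.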

Key words: Recommendation algorithm, DeepFM, Multi-head attention mechanism, Deep learning, CTR prediction, User interest modeling

CLC Number: TP391
[1] MARZ N,WARREN J.Big Data:Principles and best practices of scalable realtime data systems[M].Manning Publications,2015.
[2] RICCI F,ROKACH L,SHAPIRA B.Introduction to Recom-mender Systems Handbook[M]//Recommender Systems Handbook.Boston:Springer,2011:1-35.
[3] YU C J,ZHUANG Y,WEI S C,et al.Field-aware factorization machines for CTR prediction[C]//Proceedings of the 10th ACM Conference on Recommender Systems.ACM,2016:43-50.
[4] COVINGTON P,ADAMS J,SARGIN E.Deep neural networks for youtube recommendations[C]//Proceedings of the 10th ACM Conference on Recommender Systems.ACM,2016:191-198.
[5] ZHOU G R,ZHU X Q,SONG C R,et al.Deep interest network for click-through rate prediction[C]//Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining.ACM,2018:1059-1068.
[6] CHEN Q,ZHAO H,LI W,et al.Behavior Sequence Transformer for E-commerce Recommendation in Alibaba[J].arXiv:1905.06874,2019.
[7] ZHOU G R,MOU N,FAN Y,et al.Deep interest evolution network for click-through rate prediction[C]//Proceedings of the AAAI Conference on Artificial Intelligence.2019:5941-5948.
[8] CHENG H T,KOC L,HARMSEN J,et al.Wide & deep learning for recommender systems[C]//Proceedings of the 1st Workshop on Deep Learning for Recommender Systems.ACM,2016:7-10.
[9] GUO H F,TANG R M,YE Y M,et al.DeepFM:a factorization-machine based neural network for CTR prediction[C]//Proceedings of the 26th International Joint Conference on Artificial Intelligence.AAAI Press,2017:1725-1731.
[10] RENDLE S.Factorization machines[C]//Proceedings of 2010 IEEE International Conference on Data Mining.IEEE,2010:995-1000.
[11] VASWANI A,SHAZEER N,PARMAR N,et al.Attention is all you need[C]//Advances in Neural Information Processing Systems.2017:5998-6008.
[12] MCAULEY J,TARGETT C,SHI Q,et al.Image-based recommendations on styles and substitutes[C]//Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval.ACM,2015:43-52.
[13] HE R,MCAULEY J.Ups and downs:Modeling the visual evolution of fashion trends with one-class collaborative filtering[C]//Proceedings of the 25th International Conference on World Wide Web.2016:507-517.
[14] HARPER F M,KONSTAN J A.The movielens datasets:His-tory and context[J].ACM Transactions on Interactive Intelligent Systems,2015,5(4):19.
[15] QU Y,CAI H,REN K,et al.Product-based neural networks for user response prediction[C]//Proceedings of 2016 IEEE 16th International Conference on Data Mining.2016:1149-1154.
[16] ZHU H,JIN J,TAN C,et al.Optimized cost per click in taobao display advertising[C]//Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.ACM,2017:2191-2200.
[17] YAN L,LI W J,XUE G R,et al.Coupled group lasso for web-scale ctr prediction in display advertising[C]//Proceedings of International Conference on Machine Learning.2014:802-810.
[18] RICHARDSON M,DOMINOWSKA E,RAGNO R.Predicting clicks:estimating the click-through rate for new ads[C]//Proceedings of the 16th International Conference on World Wide Web.ACM,2007:521-530.
[19] FENG Y F,LV F Y,SHEN W C,et al.Deep session interest network for click-through rate prediction[C]//Proceedings of the 28th International Joint Conference on Artificial Intelligence.2019.
[20] ZHU H,LI X,ZHANG P,et al.Learning tree-based deep model for recommender systems[C]//Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining.2018:1079-1088.