Computer Science ›› 2019, Vol. 46 ›› Issue (10): 32-38.doi: 10.11896/jsjkx.180901801

• Big Data & Data Science •

Predicting User’s Dynamic Preference Based on Embedding Learning

WEN Wen1, LIN Ze-tian1, CAI Rui-chu1, HAO Zhi-feng1,2, WANG Li-juan1   

  1. School of Computers, Guangdong University of Technology, Guangzhou 510000, China
  2. School of Mathematics and Big Data, Foshan University, Foshan, Guangdong 528000, China
  • Received: 2018-09-27  Revised: 2019-03-14  Online: 2019-10-15  Published: 2019-10-21

Abstract: Traditional methods for capturing user preferences mainly focus on users' long-term preferences. However, in real-world applications user interests change over time, so capturing users' dynamic preferences remains a major challenge. This paper proposes an embedding-based approach for predicting a user's dynamic preferences. First, an improved embedding method is used to learn low-dimensional vector representations of items from users' click sequences. Then, based on the learned item vectors and the user's short-term click behavior, the user's dynamic preference is derived and used to predict the next click. Experiments were conducted on two real-world datasets, comparing the proposed method with state-of-the-art methods. The results show that the proposed method achieves significantly higher prediction accuracy than the competing algorithms.
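The pipeline summarized above (item embeddings learned from click sequences, followed by a short-term preference vector used to rank the next click) can be illustrated with a minimal sketch. The snippet below is not the authors' algorithm: it uses gensim's standard skip-gram Word2Vec as a stand-in for the paper's improved embedding step, and assumes a simple exponential time decay over the user's most recent clicks to form the dynamic preference vector; all item ids and parameter values are illustrative.

# Minimal sketch (assumptions: gensim>=4 Word2Vec instead of the paper's improved
# embedding method; exponential time decay for the short-term preference vector).
import numpy as np
from gensim.models import Word2Vec

# Each user's click history is an ordered list of item ids.
click_sequences = [
    ["item_3", "item_7", "item_7", "item_12"],
    ["item_5", "item_3", "item_9"],
    ["item_12", "item_9", "item_5", "item_3"],
]

# Step 1: learn low-dimensional item vectors from click sequences
# (skip-gram with negative sampling, treating each sequence as a "sentence").
model = Word2Vec(
    sentences=click_sequences,
    vector_size=32,   # embedding dimension
    window=3,         # context window over the click sequence
    sg=1,             # skip-gram
    negative=5,       # negative sampling
    min_count=1,
    epochs=20,
)

def dynamic_preference(recent_clicks, decay=0.8):
    """Short-term preference: time-decayed average of recent item embeddings.
    The most recent click gets weight 1, the one before gets `decay`, and so on."""
    weights = np.array([decay ** i for i in range(len(recent_clicks))])[::-1]
    vectors = np.stack([model.wv[item] for item in recent_clicks])
    pref = (weights[:, None] * vectors).sum(axis=0) / weights.sum()
    return pref.astype(np.float32)

def predict_next(recent_clicks, top_k=3):
    """Step 2: rank candidate items by similarity to the user's dynamic preference."""
    pref = dynamic_preference(recent_clicks)
    return model.wv.similar_by_vector(pref, topn=top_k)

print(predict_next(["item_3", "item_7"]))

In this sketch the decay factor controls how quickly older clicks are forgotten; setting it close to 1 recovers a long-term average, while a small value makes the prediction depend almost entirely on the latest click.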

Key words: Behavior prediction, Embedding, Temporal behaviors, User preferences, Word2vec

CLC Number: TP181