Computer Science ›› 2019, Vol. 46 ›› Issue (10): 55-62. doi: 10.11896/jsjkx.190300390

• Big Data & Data Science •

Deep Matrix Factorization Network for Matrix Completion

KUANG Shen-fen1,2, HUANG Ye-wen3, SONG Jie1, LI Qia2   

  1. School of Mathematics and Statistics, Shaoguan University, Shaoguan, Guangdong 512005, China
    2. School of Data and Computer Science, Sun Yat-sen University, Guangzhou 510006, China
    3. School of Computer Engineering, Guangzhou College of South China University of Technology, Guangzhou 510800, China
  • Received: 2019-02-01  Revised: 2019-04-26  Online: 2019-10-15  Published: 2019-10-21

Abstract: Matrix factorization is a popular technique for matrix completion, but most existing methods are based on linear or shallow models. When the data matrix becomes large and the observed entries are very sparse, these models are prone to overfitting and their performance degrades significantly. To address this problem, this paper proposes a Deep Matrix Factorization Network (DMFN), which not only overcomes the shortcomings of traditional matrix factorization but can also model complex non-linear data. First, the row and column vectors corresponding to the observed entries of the input matrix are taken as input and projected into row (column) latent vectors. A multilayer perceptron network is then built on top of each latent vector, and finally the row and column output vectors are fused by a bilinear pooling layer. The proposed algorithm was evaluated on standard recommender-system datasets (MovieLens and Netflix). Experimental results show that, under the same parameter settings, the root mean square error (RMSE) and mean absolute error (MAE) of the proposed method improve significantly over those of currently popular methods. On the MovieLens 1M dataset, the RMSE is 0.830 and the MAE is 0.652, improvements of about 2% and 6%, respectively. On the Netflix dataset, the RMSE is 0.840 and the MAE is 0.661, improvements of approximately 1% and 4%, respectively, which are the best results among the compared methods.
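For illustration, the following is a minimal PyTorch sketch of the architecture described above: the row and column rating vectors of an observed entry are projected to latent vectors, passed through separate multilayer perceptrons, and fused into a predicted rating. All layer sizes and names are assumptions, and nn.Bilinear stands in for the paper's bilinear pooling layer; this is not the authors' implementation.

import torch
import torch.nn as nn

class DMFN(nn.Module):
    """Sketch of a deep matrix factorization network (assumed dimensions)."""
    def __init__(self, n_rows, n_cols, latent_dim=64, hidden_dim=128):
        super().__init__()
        # Project the sparse row/column rating vectors to latent vectors.
        self.row_proj = nn.Linear(n_cols, latent_dim)
        self.col_proj = nn.Linear(n_rows, latent_dim)
        # Separate multilayer perceptrons on the row and column latent vectors.
        self.row_mlp = nn.Sequential(nn.Linear(latent_dim, hidden_dim), nn.ReLU(),
                                     nn.Linear(hidden_dim, hidden_dim), nn.ReLU())
        self.col_mlp = nn.Sequential(nn.Linear(latent_dim, hidden_dim), nn.ReLU(),
                                     nn.Linear(hidden_dim, hidden_dim), nn.ReLU())
        # Bilinear layer fuses the two output vectors into a scalar rating.
        self.fuse = nn.Bilinear(hidden_dim, hidden_dim, 1)

    def forward(self, row_vec, col_vec):
        u = self.row_mlp(self.row_proj(row_vec))
        v = self.col_mlp(self.col_proj(col_vec))
        return self.fuse(u, v).squeeze(-1)  # predicted rating for the (row, col) entry

# Toy usage: a 100 x 50 rating matrix with one observed (user, item) pair.
model = DMFN(n_rows=100, n_cols=50)
row_vec = torch.zeros(1, 50)   # the user's rating vector over all items
col_vec = torch.zeros(1, 100)  # the item's rating vector over all users
pred = model(row_vec, col_vec)
loss = nn.MSELoss()(pred, torch.tensor([4.0]))  # fit only observed entries
loss.backward()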

Key words: Bilinear pooling, Deep learning, Matrix completion, Matrix factorization, Multilayer perceptron

CLC Number: TP18