Computer Science ›› 2019, Vol. 46 ›› Issue (10): 55-62.doi: 10.11896/jsjkx.190300390

• Big Data & Data Science •

Deep Matrix Factorization Network for Matrix Completion

KUANG Shen-fen1,2, HUANG Ye-wen3, SONG Jie1, LI Qia2   

  1. School of Mathematics and Statistics, Shaoguan University, Shaoguan, Guangdong 512005, China
    2. School of Data and Computer Science, Sun Yat-sen University, Guangzhou 510006, China
    3. School of Computer Engineering, Guangzhou College of South China University of Technology, Guangzhou 510800, China
  • Received: 2019-02-01  Revised: 2019-04-26  Online: 2019-10-15  Published: 2019-10-21

Abstract: Matrix factorization is a popular technique for matrix completion, but most existing methods rely on linear or shallow models. When the data matrix is large and the observed entries are very sparse, such models are prone to overfitting and their performance degrades significantly. To address this problem, this paper proposes a Deep Matrix Factorization Network (DMFN), which not only overcomes the shortcomings of traditional matrix factorization but can also model complex non-linear data. First, the row and column vectors corresponding to the observed entries of the input matrix are taken as input and projected into row (column) latent vectors. A multi-layer perceptron network is then built on each latent vector, and finally the row and column output vectors are fused by a bilinear pooling layer. The proposed algorithm was evaluated on standard recommender system datasets (MovieLens and Netflix). Experimental results show that, under the same parameter settings, the root mean square error (RMSE) and mean absolute error (MAE) of the proposed method improve significantly over current popular methods. On the MovieLens 1M dataset, the RMSE is 0.830 and the MAE is 0.652, improvements of about 2% and 6%, respectively. On the Netflix dataset, the RMSE is 0.840 and the MAE is 0.661, improvements of approximately 1% and 4%, respectively, achieving the best results.
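To make the architecture described in the abstract concrete, the following is a minimal PyTorch sketch of a DMFN-style model based only on that description; the layer sizes, activation choices, the use of torch.nn.Bilinear as the bilinear pooling layer, and all names (DMFNSketch, row_proj, col_mlp, etc.) are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of a DMFN-style model. Assumptions: latent/hidden sizes,
# ReLU MLPs, and nn.Bilinear as the fusion layer are illustrative only.
import torch
import torch.nn as nn

class DMFNSketch(nn.Module):
    def __init__(self, n_rows, n_cols, latent_dim=64, hidden_dims=(128, 64)):
        super().__init__()
        # Project the observed row/column rating vectors into latent vectors.
        self.row_proj = nn.Linear(n_cols, latent_dim)
        self.col_proj = nn.Linear(n_rows, latent_dim)
        # Separate multi-layer perceptrons on the row and column latent vectors.
        self.row_mlp = self._mlp(latent_dim, hidden_dims)
        self.col_mlp = self._mlp(latent_dim, hidden_dims)
        # Bilinear pooling layer fuses the two MLP outputs into one rating score.
        self.bilinear = nn.Bilinear(hidden_dims[-1], hidden_dims[-1], 1)

    @staticmethod
    def _mlp(in_dim, hidden_dims):
        layers, d = [], in_dim
        for h in hidden_dims:
            layers += [nn.Linear(d, h), nn.ReLU()]
            d = h
        return nn.Sequential(*layers)

    def forward(self, row_vec, col_vec):
        # row_vec: (batch, n_cols) observed ratings in a user's row
        # col_vec: (batch, n_rows) observed ratings in an item's column
        p = self.row_mlp(self.row_proj(row_vec))
        q = self.col_mlp(self.col_proj(col_vec))
        return self.bilinear(p, q).squeeze(-1)  # predicted rating

# Usage sketch: predict ratings for a batch of two (user, item) pairs.
model = DMFNSketch(n_rows=6040, n_cols=3706)  # MovieLens-1M-like shape (assumed)
rows = torch.rand(2, 3706)                    # observed row vectors
cols = torch.rand(2, 6040)                    # observed column vectors
pred = model(rows, cols)                      # shape: (2,)
```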

Key words: Matrix completion, Matrix factorization, Deep learning, Bilinear pooling, Multilayer perceptron

CLC Number: TP18

[1]KOREN Y,BELL R M,VOLINSKY C,et al.Matrix factorization techniques for recommender systems[J].IEEE Computer,2009,42(8):30-37.
[2]PAN T T,WEN F,LIU Q R.Collaborative Filtering recommendation algorithm based on rating matrix filling and item predictability[J].Acta Automatica Sinica,2017,43(9):1597-1606.(in Chinese)
潘涛涛,文锋,刘勤让.基于矩阵填充和物品可预测性的协同过滤算法[J].自动化学报,2017,43(9):1597-1606.
[3]HU Y,ZHANG D,YE J,et al.Fast and Accurate Matrix completion via truncated nuclear norm regularization[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2013,35(9):2117-2130.
[4]WANG Y,LEE C M,CHEONG L F,et al.Practical Matrix completion and corruption recovery using proximal alternating robust subspace minimization[J].International Journal of Computer Vision,2015,111(3):315-344.
[5]CABRAL R S,LA TORRE F D,COSTEIRA J P,et al.Matrix completion for weakly-supervised multi-label Image classification[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2015,37(1):121-135.
[6]WEN Z,YIN W,ZHANG Y,et al.Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm[J].Mathematical Programming Computation,2012,4(4):333-361.
[7]JAIN P,NETRAPALLI P,SANGHAVI S,et al.Low-rank matrix completion using alternating minimization[C]//ACM Symposium on Theory of Computing.ACM,2013:665-674.
[8]MNIH A,SALAKHUTDINOV R.Probabilistic matrix Factorization[M]//Advances in Neural Information Processing Systems.Berlin:Springer,2007:1257-1264.
[9]CANDES E J,RECHT B.Exact matrix completion via convex optimization[J].Foundations of Computational Mathematics,2009,9(6):717-772.
[10]KRIZHEVSKY A,SUTSKEVER I,HINTON G E,et al.ImageNet classification with deep convolutional neural networks[C]//Advances in Neural Information Processing Systems,2012,141(5):1097-1105.
[11]FU W B,SUN T,LIANG J,et al.Review of principle and application of deep learning[J].Computer Science,2018,45(S1):11-15.(in Chinese)
付文博,孙涛,梁籍,等.深度学习原理及应用综述[J].计算机科学,2018,45(S1):11-15.
[12]WANG S,SUN G M,ZOU J Z,et al.Parallel collaborative filtering algorithm based on user recommended Influence[J].Computer Science,2017,44(9):250-255.(in Chinese)
王硕,孙光明,邹静昭,等.基于用户推荐影响度的并行协同过滤算法[J].计算机科学,2017,44(9):250-255.
[13]SALAKHUTDINOV R,MNIH A,HINTON G.Restricted Boltzmann machines for collaborative filtering[C]//Proceedings of the 24th International Conference on Machine Learning.ACM,2007:791-798.
[14]SEDHAIN S,MENON A K,SANNER S,et al.Autorec:Autoencoders meet collaborative filtering[C]//Proceedings of the 24th International Conference on World Wide Web.ACM,2015:111-112.
[15]LI X,SHE J.Collaborative variational autoencoder for recommender systems[C]//Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.ACM,2017:305-314.
[16]WU C Y,AHMED A,BEUTEL A,et al.Recurrent recommender networks[C]//Proceedings of the Tenth ACM International Conference on Web Search and Data Mining.ACM,2017:495-503.
[17]KIM D,PARK C,OH J,et al.Convolutional matrix factorization for document context-aware recommendation[C]//Proceedings of the 10th ACM Conference on Recommender Systems.ACM,2016:233-240.
[18]XUE H J,DAI X,ZHANG J,et al.Deep Matrix Factorization Models for Recommender Systems[C]//Twenty-Sixth International Joint Conference on Artificial Intelligence.AAAI Press,2017:3203-3209.
[19]ZHANG S,YAO L,SUN A,et al.Deep learning based recommender system:A survey and new perspectives[J].ACM Computing Surveys,2019,52(1):1-38.
[20]CANDÈS E J,TAO T.The power of convex relaxation:Near-optimal matrix completion[J].IEEE Transactions on Information Theory,2010,56(5):2053-2080.
[21]SI S,CHIANG K Y,HSIEH C J,et al.Goal-directed inductive matrix completion[C]//Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.ACM,2016:1165-1174.
[22]FAN J,CHOW T.Deep learning based matrix completion[J].Neurocomputing,2017,266(11):540-549.
[23]DZIUGAITE G K,ROY D M.Neural network matrix factorization[J].arXiv:1511.06443,2015.
[24]FAZEL M.Matrix rank minimization with applications[D].Palo Alto:Stanford University,2002.
[25]CAI J F,CANDÈS E J,SHEN Z.A singular value thresholding algorithm for matrix completion[J].SIAM Journal on Optimization,2010,20(4):1956-1982.
[26]LU C,TANG J,YAN S,et al.Generalized nonconvex nonsmooth low-rank minimization[C]//IEEE Conference on Computer Vision and Pattern Recognition.IEEE,2014:4130-4137.
[27]SREBRO N,RENNIE J,JAAKKOLA T S.Maximum-margin matrix factorization[M]//Advances in Neural Information Processing Systems.Berlin:Springer,2005:1329-1336.
[28]LIN Z,XU C,ZHA H.Robust matrix factorization by majorization minimization[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2018,40(1):208-220.
[29]BOUMAL N,ABSIL P.RTRMC:A Riemannian trust-region method for low-rank matrix completion[M]//Advances in Neural Information Processing Systems.Berlin:Springer,2011:406-414.
[30]TRIGEORGIS G,BOUSMALIS K,ZAFEIRIOU S,et al.A deep matrix factorization method for learning attribute representations[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2017,39(3):417-429.
[31]FAN J,CHENG J.Matrix completion by deep matrix factorization[J].Neural Networks,2018,98(2):34-41.
[32]HE X,LIAO L,ZHANG H,et al.Neural collaborative filtering[C]//Proceedings of the 26th International Conference on World Wide Web.International World Wide Web Conferences Steering Committee,2017:173-182.