Journal of ZheJiang University (Engineering Science)  2023, Vol. 57 Issue (2): 243-251    DOI: 10.3785/j.issn.1008-973X.2023.02.004
    
Graph convolution collaborative filtering model combining graph enhancement and sampling strategies
Jing-jing ZHANG, Zhao-gong ZHANG*, Xin XU
School of Computer Science and Technology, Heilongjiang University, Harbin 150080, China

Abstract  

Two significant problems exist in collaborative filtering (CF) models based on graph convolutional networks (GCNs). Most original graphs suffer from noise and data sparsity, which can seriously impair model performance. In addition, for large user-item graphs, the explicit message passing in traditional GCNs slows down convergence during training and weakens the training efficiency of the model. A graph convolution collaborative filtering model combining graph enhancement and sampling strategies (EL-GCCF) was proposed to address these two problems. In the graph initialization learning module, the structural and feature information in the graph were integrated by generating two graph structures, which enhanced the original graph and effectively mitigated the noise problem. Explicit message passing was skipped by a multi-task constrained graph convolution, and an auxiliary sampling strategy effectively mitigated the over-smoothing problem in training and improved the training efficiency of the model. Experimental results on two real datasets show that the EL-GCCF model outperforms many mainstream models and achieves higher training efficiency.
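The abstract only outlines the graph initialization learning module. As a rough illustration of the general idea of combining structural and feature information, the following minimal sketch builds a feature-similarity kNN graph alongside the observed user-item interaction graph and blends the two; the function names, the blending weight lam and the choice of cosine similarity are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the paper's implementation): complement the observed
# user-item interaction graph (structural information) with a k-nearest-
# neighbour graph built from node features (feature information), then blend.
import numpy as np

def knn_graph(features: np.ndarray, k: int) -> np.ndarray:
    """Binary kNN adjacency from cosine similarity of node features."""
    norm = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)
    sim = norm @ norm.T
    np.fill_diagonal(sim, -np.inf)            # exclude self-loops from top-k
    adj = np.zeros_like(sim)
    idx = np.argsort(-sim, axis=1)[:, :k]     # top-k most similar nodes per row
    rows = np.repeat(np.arange(sim.shape[0]), k)
    adj[rows, idx.ravel()] = 1.0
    return np.maximum(adj, adj.T)             # symmetrise

def enhanced_adjacency(interactions: np.ndarray, features: np.ndarray,
                       k: int = 10, lam: float = 0.5) -> np.ndarray:
    """Blend the bipartite structural graph with the feature-similarity graph.

    `features` stacks user feature rows followed by item feature rows.
    `lam` is an illustrative blending weight, not a value from the paper.
    """
    n_users, n_items = interactions.shape
    n = n_users + n_items
    structural = np.zeros((n, n))
    structural[:n_users, n_users:] = interactions     # user -> item edges
    structural[n_users:, :n_users] = interactions.T   # item -> user edges
    feature_graph = knn_graph(features, k)
    return lam * structural + (1.0 - lam) * feature_graph

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    R = (rng.random((50, 80)) < 0.05).astype(float)   # toy interaction matrix
    X = rng.normal(size=(130, 16))                    # toy user+item features
    A = enhanced_adjacency(R, X, k=5, lam=0.6)
    print(A.shape, A.max())
```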



Key words: recommendation system; collaborative filtering; graph enhancement; graph convolutional network; graph neural network
Received: 31 July 2022      Published: 28 February 2023
CLC:  TP 399  
Fund: National Natural Science Foundation of China (61972135)
Corresponding Authors: Zhao-gong ZHANG     E-mail: 2201851@s.hlju.edu.cn;2013010@hlju.edu.cn
Cite this article:

Jing-jing ZHANG, Zhao-gong ZHANG, Xin XU. Graph convolution collaborative filtering model combining graph enhancement and sampling strategies. Journal of ZheJiang University (Engineering Science), 2023, 57(2): 243-251.

URL:

https://www.zjujournals.com/eng/10.3785/j.issn.1008-973X.2023.02.004     OR     https://www.zjujournals.com/eng/Y2023/V57/I2/243


Fig.1 EL-GCCF framework diagram
Fig.2 Graph initialization learning module
Dataset  n_u  n_i  n_c  spa/%
MovieLens-1M 6022 3043 995154 5.431
Amazon-Books 52643 91599 2984108 0.062
Tab.1 Parameters of MovieLens-1M and Amazon-Books datasets
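The spa column is consistent with the usual density measure, spa = n_c / (n_u · n_i) expressed as a percentage, where n_u, n_i and n_c appear to be the numbers of users, items and interactions. A quick check reproduces the reported values:

```python
# Sanity check of the spa column, assuming spa = n_c / (n_u * n_i) in percent.
for name, n_u, n_i, n_c in [("MovieLens-1M", 6022, 3043, 995154),
                            ("Amazon-Books", 52643, 91599, 2984108)]:
    print(f"{name}: {100 * n_c / (n_u * n_i):.3f}%")
# -> MovieLens-1M: 5.431%, Amazon-Books: 0.062%
```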
Model  Amazon-Books  MovieLens-1M
Recall@10  NDCG@10  Recall@20  NDCG@20  Recall@10  NDCG@10  Recall@20  NDCG@20
MF-BPR 0.0607 0.0430 0.0956 0.0536 0.1704 0.2044 0.2153 0.2175
NeuMF 0.0507 0.0351 0.0823 0.0447 0.1657 0.1953 0.2106 0.2067
DeepWalk 0.0286 0.02511 0.0346 0.0264 0.1248 0.1025 0.1348 0.1057
Node2Vec 0.0301 0.2936 0.0402 0.0309 0.1347 0.1095 0.1475 0.1186
NGCF 0.0617 0.0427 0.0978 0.0547 0.1846 0.2328 0.2513 0.2511
LightGCN 0.0797 0.0565 0.1206 0.0689 0.1876 0.2314 0.2576 0.2427
LR-GCCF 0.0591 0.0504 0.1135 0.0558 0.1785 0.2051 0.2231 0.2124
EL-GCCF 0.0973 0.0643 0.1363 0.0768 0.1925 0.2636 0.2657 0.2882
Imp/% 64.64 27.58 20.01 37.63 7.84 28.52 19.09 35.69
Tab.2 Performance comparison of EL-GCCF and other methods on two datasets
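Recall@K and NDCG@K in Tab.2 follow the standard top-K definitions (the paper's exact normalization may differ slightly), and the Imp row is consistent with the relative improvement of EL-GCCF over the LR-GCCF row (e.g. 0.0973/0.0591 − 1 ≈ 64.6%). A minimal sketch of the two metrics for a single user:

```python
# Standard top-K Recall and NDCG; `ranked` is a model's ranked item list for
# one user, `relevant` the set of held-out ground-truth items.
import math

def recall_at_k(ranked: list, relevant: set, k: int) -> float:
    hits = sum(1 for item in ranked[:k] if item in relevant)
    return hits / len(relevant) if relevant else 0.0

def ndcg_at_k(ranked: list, relevant: set, k: int) -> float:
    dcg = sum(1.0 / math.log2(i + 2) for i, item in enumerate(ranked[:k])
              if item in relevant)
    ideal_hits = min(len(relevant), k)
    idcg = sum(1.0 / math.log2(i + 2) for i in range(ideal_hits))
    return dcg / idcg if idcg > 0 else 0.0

# Example: a user with three held-out items, evaluated at K = 10.
ranked = [12, 7, 99, 3, 42, 5, 8, 21, 60, 11]
relevant = {7, 42, 100}
print(recall_at_k(ranked, relevant, 10))   # 2/3 ≈ 0.667
print(ndcg_at_k(ranked, relevant, 10))     # ≈ 0.478
```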
Model  t_e/s  E_train  T_train
MF-BPR 31 23 12 min
NeuMF 125 79 2 h 45 min
LightGCN 51 780 11 h 3 min
LR-GCCF 67 165 3 h 5 min
EL-GCCF 36 70 43 min
Tab.3 Training efficiency comparison of EL-GCCF and baseline models on Ml-1M dataset
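The columns of Tab.3 are consistent with t_e being the per-epoch training time in seconds, E_train the number of training epochs and T_train the total training time; assuming T_train ≈ t_e × E_train reproduces the reported totals to within a couple of minutes:

```python
# Consistency check for Tab.3, assuming T_train ≈ t_e * E_train.
rows = [("MF-BPR", 31, 23), ("NeuMF", 125, 79), ("LightGCN", 51, 780),
        ("LR-GCCF", 67, 165), ("EL-GCCF", 36, 70)]
for model, t_e, e_train in rows:
    total_s = t_e * e_train
    print(f"{model}: {total_s // 3600} h {round(total_s % 3600 / 60)} min")
# -> roughly matches the reported 12 min, 2 h 45 min, 11 h 3 min, 3 h 5 min, 43 min
```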
Fig.3 Performance of EL-GCCF and its variant models on Ml-1M dataset
Model  Amazon-Books  MovieLens-1M
Recall@20 NDCG@20 Recall@20 NDCG@20
EL-GCCF(null) 0.1135 0.0558 0.2231 0.2124
EL-GCCF(α) 0.1162 0.0583 0.2390 0.2325
EL-GCCF(β) 0.0942 0.0373 0.2187 0.2037
EL-GCCF(n_s) 0.1278 0.0688 0.2633 0.2742
EL-GCCF 0.1363 0.0768 0.2657 0.2882
Tab.4 Performance of EL-GCCF and its variant models on two datasets
Fig.4 Effect of different number of neighbors on EL-GCCF model performance
Fig.5 Effect of different hyperparameters on EL-GCCF model performance
[1]   WANG X, HE X, WANG M, et al. Neural graph collaborative filtering [C]// Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval. Paris: ACM, 2019: 165-174.
[2]   SUN J, ZHANG Y, GUO W, et al. Neighbor interaction aware graph convolution networks for recommendation [C]// Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval. Xi'an: ACM, 2020: 1289-1298.
[3]   VERMA V, QU M, KAWAGUCHI K, et al. GraphMix: improved training of GNNs for semi-supervised learning [C]// Proceedings of the AAAI Conference on Artificial Intelligence. Vancouver: ACM, 2021, 35(11): 10024-10032.
[4]   HE X, DENG K, WANG X, et al. LightGCN: simplifying and powering graph convolution network for recommendation [C]// Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval. Xi'an: ACM, 2020: 639-648.
[5]   CHEN L, WU L, HONG R, et al. Revisiting graph based collaborative filtering: a linear residual graph convolutional network approach [C]// Proceedings of the AAAI Conference on Artificial Intelligence. New York: ACM, 2020, 34(1): 27-34.
[6]   DONG X, THANOU D, RABBAT M, et al. Learning graphs from data: a signal representation perspective [J]. IEEE Signal Processing Magazine, 2019, 36(3): 44-63. doi: 10.1109/MSP.2018.2887284
[7]   KONG Xin-xin, SU Ben-chang, WANG Hong-zhi, et al. Research on recommendation model and algorithm based on tag weight scoring [J]. Chinese Journal of Computers, 2017, 40(6): 1440-1452. doi: 10.11897/SP.J.1016.2017.01440
[8]   FRANCESCHI L, NIEPERT M, PONTIL M, et al. Learning discrete structures for graph neural networks [C]// International Conference on Machine Learning. Los Angeles: ACM, 2019: 1972-1982.
[9]   ZHANG Y, PAL S, COATES M, et al. Bayesian graph convolutional neural networks for semi-supervised classification [C]// Proceedings of the AAAI Conference on Artificial Intelligence. Hawaii: ACM, 2019: 5829-5836.
[10]   WANG W, LUO J, SHEN C, et al. A graph convolutional matrix completion method for miRNA-disease association prediction [C]// International Conference on Intelligent Computing. Bari: IEEE, 2020: 201-215.
[11]   YU W, QIN Z. Graph convolutional network for recommendation with low-pass collaborative filters [C]// International Conference on Machine Learning. Vienna: ACM, 2020: 10936-10945.
[12]   CHEN M, WEI Z, HUANG Z, et al. Simple and deep graph convolutional networks [C]// International Conference on Machine Learning. Vienna: ACM, 2020: 1725-1735.
[13]   YING R, HE R, CHEN K, et al. Graph convolutional neural networks for web-scale recommender systems [C]// Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. London: ACM, 2018: 974-983.
[14]   WU F, SOUZA A, ZHANG T, et al. Simplifying graph convolutional networks [C]// International Conference on Machine Learning. Los Angeles: ACM, 2019: 6861-6871.
[15]   ZEILER M D, KRISHNAN D, TAYLOR G W, et al. Deconvolutional networks [C]// Computer Society Conference on Computer Vision and Pattern Recognition. San Francisco: IEEE, 2010: 2528-2535.
[16]   LUO D, CHENG W, YU W, et al. Learning to drop: robust graph neural network via topological denoising [C]// Proceedings of the 14th ACM International Conference on Web Search and Data Mining. Jerusalem: ACM, 2021: 779-787.
[17]   CHEN Y, WU L. Graph neural networks: graph structure learning [M]// Graph neural networks: foundations, frontiers, and applications. Singapore: Springer, 2022: 297-321.
[18]   LERCHE L, JANNACH D. Using graded implicit feedback for bayesian personalized ranking [C]// Proceedings of the 8th ACM Conference on Recommender Systems. Silicon Valley: ACM, 2014: 353-356.
[19]   HE X, LIAO L, ZHANG H, et al. Neural collaborative filtering [C]// Proceedings of the 26th International Conference on World Wide Web. Perth: ACM, 2017: 173-182.
[20]   PEROZZI B, AL-RFOU R, SKIENA S. DeepWalk: online learning of social representations [C]// Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: ACM, 2014: 701-710.