Journal of ZheJiang University (Engineering Science)  2022, Vol. 56 Issue (5): 1025-1034    DOI: 10.3785/j.issn.1008-973X.2022.05.020
    
Knowledge graph link prediction based on relational generative graph attention network
Cheng CHEN1, Hao ZHANG2, Yong-qiang LI1,*, Yuan-jing FENG1
1. College of Information Engineering, Zhejiang University of Technology, Hangzhou 310023, China
2. China Mobile Zhejiang Limited Company Hangzhou Branch Company, Hangzhou 310006, China

Abstract  

A knowledge graph link prediction method based on a relational generative graph attention network (RGGAT) was proposed to address the problem of missing links among entity neighborhood triples. Different types of relations were used to generate the corresponding attention-mechanism parameters, and each neighborhood triple computed its attention coefficient with the parameters of its relation type. An entity obtained a richer embedding vector by aggregating the information of relation-dominated neighborhood triples. The encoder and the decoder were trained jointly, with the entity and relation vectors updated by the encoder fed directly into the decoder to ensure that the training objectives of the two were consistent. Link prediction experiments were carried out on three public datasets, with five current mainstream models selected as baselines for comparison. The Hits@10 of the RGGAT method on the three datasets reached 0.5198, 0.5104 and 0.9739, higher than those of the traditional graph attention network embedding method. In the comparison of neighborhood aggregation orders, the one-hop neighborhood aggregation method improved Hits@10 by 3.59% over the two-hop method.
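The core idea of the abstract, relation-generated attention parameters, can be sketched minimally as follows. This is not the authors' implementation; the generator `W_gen`, the message function, and all dimensions are illustrative assumptions. Each relation embedding is mapped to its own scoring vector, so triples are weighted by parameters specific to their relation type before aggregation:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4
ent = rng.normal(size=(5, dim))   # illustrative entity embeddings
rel = rng.normal(size=(3, dim))   # illustrative relation embeddings

# Hypothetical generator: map each relation embedding to its own
# attention-scoring vector, so every relation type has its own parameters.
W_gen = rng.normal(size=(3 * dim, dim))

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def aggregate(head, neighbors):
    """Aggregate neighborhood triples (relation_id, tail_id) around `head`."""
    logits, msgs = [], []
    for r, t in neighbors:
        a_r = W_gen @ rel[r]                               # relation-generated parameters
        feat = np.concatenate([ent[head], rel[r], ent[t]])
        logits.append(leaky_relu(a_r @ feat))              # attention logit for this triple
        msgs.append(ent[t] + rel[r])                       # triple message (sketch)
    logits = np.array(logits)
    alpha = np.exp(logits - logits.max())
    alpha /= alpha.sum()                                   # softmax attention coefficients
    return sum(a * m for a, m in zip(alpha, msgs))         # weighted aggregation

print(aggregate(0, [(0, 1), (1, 2), (2, 3)]).shape)  # (4,)
```

In the paper this aggregated vector would update the entity embedding fed into the decoder; here it only demonstrates how per-relation parameters replace a single shared attention vector.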



Key words: knowledge graph; graph attention network; entity neighborhood; relational generative parameter; link prediction
Received: 07 June 2021      Published: 31 May 2022
CLC:  TP 391  
Fund: National Natural Science Foundation of China (62073294); Natural Science Foundation of Zhejiang Province (LZ21F030003)
Corresponding Authors: Yong-qiang LI     E-mail: cauchychen@126.com;yqli@zjut.edu.cn
Cite this article:

Cheng CHEN, Hao ZHANG, Yong-qiang LI, Yuan-jing FENG. Knowledge graph link prediction based on relational generative graph attention network. Journal of ZheJiang University (Engineering Science), 2022, 56(5): 1025-1034.

URL:

https://www.zjujournals.com/eng/10.3785/j.issn.1008-973X.2022.05.020     OR     https://www.zjujournals.com/eng/Y2022/V56/I5/1025


Fig.1 Process diagram of entity neighborhood aggregation
Fig.2 Framework illustration of relational generative attention network model
Fig.3 Illustration of entity multi-hop neighborhood triples
Fig.4 Attention mechanism of relational generative graph
Fig.5 Process diagram of link prediction training
Dataset	#Entities	#Relations	#Train	#Valid	#Test	Avg. degree	Median degree
WN18RR	40,493	11	86,835	3,034	3,134	4.24	3
FB15K-237	14,541	237	272,115	17,535	20,466	37.43	22
Kinship	104	25	8,544	1,068	1,074	164.31	164
Tab.1 Knowledge graph public dataset statistics
Model	MRR	MR	Hits@1	Hits@3	Hits@10
TransE	0.2430	2,300	–	–	0.5010
R-GCN	0.1230	6,700	0.0800	0.1370	0.2070
ConvKB	0.2480	2,554	0.0427	0.4450	0.5250
A2N	0.4500	–	0.4200	0.4600	0.5100
KBGAT	0.4151	1,954	0.3381	0.4573	0.5540
RGGAT	0.4141	2,628	0.3537	0.4496	0.5198
Tab.2 Link prediction results on WN18RR dataset
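The metrics in the tables above are standard link prediction measures computed from the rank of the true entity among all scored candidates. A minimal sketch (the scoring model itself is out of scope here; `ranks` would come from sorting candidate scores per test triple):

```python
def metrics(ranks):
    """Compute MR, MRR and Hits@k from 1-based ranks of the true entities."""
    n = len(ranks)
    return {
        "MR": sum(ranks) / n,                          # mean rank (lower is better)
        "MRR": sum(1.0 / r for r in ranks) / n,        # mean reciprocal rank
        "Hits@1": sum(r <= 1 for r in ranks) / n,      # fraction ranked first
        "Hits@3": sum(r <= 3 for r in ranks) / n,
        "Hits@10": sum(r <= 10 for r in ranks) / n,    # fraction in top ten
    }

print(metrics([1, 2, 5, 40]))
# {'MR': 12.0, 'MRR': 0.43125, 'Hits@1': 0.25, 'Hits@3': 0.5, 'Hits@10': 0.75}
```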
Model	MRR	MR	Hits@1	Hits@3	Hits@10
TransE	0.2790	323	0.1980	0.3760	0.4410
R-GCN	0.1640	600	0.1000	0.1810	0.3000
ConvKB	0.2890	216	0.1980	0.3240	0.4710
A2N	0.3170	–	0.2320	0.3480	0.4860
KBGAT	0.2083	264	0.1295	0.2220	0.3746
RGGAT	0.3326	235	0.2447	0.3642	0.5104
Tab.3 Link prediction results on FB15K-237 dataset
Model	MRR	MR	Hits@1	Hits@3	Hits@10
TransE	0.3090	6.80	0.0090	0.6430	0.8410
R-GCN	0.1090	25.90	0.0300	0.0880	0.2390
ConvKB	0.6140	3.30	0.4362	0.7550	0.9530
KBGAT	0.7271	2.70	0.5843	0.8464	0.9660
RGGAT	0.8145	2.37	0.7160	0.9008	0.9739
Tab.4 Link prediction results on Kinship dataset
Hops	MRR	MR	Hits@1	Hits@3	Hits@10
1	0.4141	2,628	0.3537	0.4496	0.5198
2	0.3916	3,740	0.3288	0.4314	0.5018
3	0.3519	4,026	0.2725	0.4073	0.4831
Tab.5 Link prediction results on WN18RR dataset using multi-hop relation
Inverse relations	MRR	MR	Hits@1	Hits@3	Hits@10
With	0.4141	2,628	0.3537	0.4496	0.5198
Without	0.3789	3,109	0.3350	0.4040	0.4539
Tab.6 Effect of inverse relation on link prediction results
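Table 6 shows that using inverse relations improves every metric. A common way to realize this (an assumption here, since the paper's exact construction is not shown on this page) is to augment the triple set: for each triple (h, r, t), add (t, r_inv, h), with inverse relation ids offset by the number of original relations:

```python
def add_inverses(triples, num_relations):
    """Augment (h, r, t) triples with inverse triples (t, r + num_relations, h)."""
    return triples + [(t, r + num_relations, h) for (h, r, t) in triples]

triples = [(0, 0, 1), (1, 1, 2)]
print(add_inverses(triples, num_relations=2))
# [(0, 0, 1), (1, 1, 2), (1, 2, 0), (2, 3, 1)]
```

This doubles the relation vocabulary but lets every entity aggregate information from triples in which it appears as the tail, not only the head.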
Model	MRR	Hits@1	Hits@3	Hits@10
KBGAT+ConvKB	0.7271	0.5843	0.8464	0.9660
KBGAT-ConvKB	0.7742	0.6713	0.8622	0.9656
RGGAT-ConvKB	0.7818	0.6704	0.8687	0.9758
RGGAT-ConvE	0.8145	0.7160	0.9008	0.9739
Tab.7 Link prediction results of different encoder-decoder combinations on Kinship dataset
[1]   DALTON J, DIETA L, ALLAN J. Entity query feature expansion using knowledge base links [C]// Proceedings of the 37th International ACM SIGIR Conference on Research and Development in Information Retrieval. Gold Coast: ACM, 2014: 365-374.
[2]   FERRUCCI D, BROWN E, CHU-CARROLL J, et al. Building Watson: an overview of the DeepQA project [J]. AI Magazine, 2010, 31(3): 59-79. doi: 10.1609/aimag.v31i3.2303
[3]   MINTZ M, BILLS S, SNOW R, et al. Distant supervision for relation extraction without labeled data [C]// Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing. Singapore: ACL, 2009: 1003-1011.
[4]   BOLLACKER K, EVANS C, PARITOSH P, et al. Freebase: a collaboratively created graph database for structuring human knowledge [C]// Proceedings of the 2008 ACM SIGMOD International Conference on Management of Data. Vancouver: ACM, 2008: 1247-1250.
[5]   GUAN Sai-ping, JIN Xiao-long, JIA Yan-tao, et al. Knowledge reasoning over knowledge graph: a survey [J]. Journal of Software, 2018, 29(10): 2966-2994.
[6]   BORDES A, USUNIER N, GARCIA-DURAN A, et al. Translating embeddings for modeling multi-relational data [J]. Advances in Neural Information Processing Systems, 2013, 26: 2787-2795.
[7]   NICKEL M, TRESP V, KRIEGEL H P. A three-way model for collective learning on multi-relational data [C]// Proceedings of the 28th International Conference on Machine Learning. Bellevue: ACM, 2011.
[8]   DETTMERS T, MINERVINI P, STENETORP P, et al. Convolutional 2D knowledge graph embeddings [C]// Proceedings of the 32nd AAAI Conference on Artificial Intelligence. New Orleans: AAAI, 2018: 1811-1818.
[9]   ZHANG Zhong-wei, CAO Lei, CHEN Xi-liang, et al. Survey of knowledge reasoning based on neural network [J]. Computer Engineering and Applications, 2019, 55(12): 8-19, 36. doi: 10.3778/j.issn.1002-8331.1901-0358
[10]   SCHLICHTKRULL M, KIPF T N, BLOEM P, et al. Modeling relational data with graph convolutional networks [C]// Proceedings of the 15th European Semantic Web Conference . Heraklion: Springer, 2018: 593-607.
[11]   MARCHEGGIANI D, TITOV I. Encoding sentences with graph convolutional networks for semantic role labeling [C]// Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Copenhagen: ACL, 2017: 1506-1515.
[12]   SHANG C, TANG Y, HUANG J, et al. End-to-end structure-aware convolutional networks for knowledge base completion [C]// Proceedings of the 33rd AAAI Conference on Artificial Intelligence. Honolulu: AAAI, 2019: 3060-3067.
[13]   NATHANI D, CHAUHAN J, SHARMA C, et al. Learning attention-based embeddings for relation prediction in knowledge graphs [C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence: ACL, 2019: 4710-4723.
[14]   SOCHER R, CHEN D, MANNING C D, et al. Reasoning with neural tensor networks for knowledge base completion [C]// Proceedings of the 27th Conference on Neural Information Processing Systems. Lake Tahoe: MIT Press, 2013: 926-934.
[15]   NICKEL M, ROSASCO L, POGGIO T. Holographic embeddings of knowledge graphs [C]// Proceedings of the 30th AAAI Conference on Artificial Intelligence. Phoenix: AAAI, 2016: 1955-1961.
[16]   BALAZEVIC I, ALLEN C, HOSPEDALES T. TuckER: tensor factorization for knowledge graph completion [C]// Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Hong Kong: ACL, 2019: 5188-5197.
[17]   WANG Z, ZHANG J, FENG J, et al. Knowledge graph embedding by translating on hyperplanes [C]// Proceedings of the 28th AAAI Conference on Artificial Intelligence. Quebec: AAAI, 2014: 1112-1119.
[18]   LIN Y, LIU Z, SUN M, et al. Learning entity and relation embeddings for knowledge graph completion [C]// Proceedings of the 29th AAAI Conference on Artificial Intelligence. Austin: AAAI, 2015: 2181-2187.
[19]   JI G, HE S, XU L, et al. Knowledge graph embedding via dynamic mapping matrix [C]// Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing. Beijing: ACL, 2015: 687-696.
[20]   FAN M, ZHOU Q, CHANG E, et al. Transition-based knowledge graph embedding with relational mapping properties [C]// Proceedings of the 28th Pacific Asia Conference on Language, Information and Computing. Hong Kong: [s.n.], 2014: 328-337.
[21]   XIAO H, HUANG M, HAO Y, et al. TransA: an adaptive approach for knowledge graph embedding [EB/OL]. [2021-05-10]. https://arxiv.org/pdf/1509.05490v1.pdf.
[22]   EBISU T, ICHISE R. Toruse: knowledge graph embedding on a lie group [C]// Proceedings of the 32nd AAAI Conference on Artificial Intelligence. New Orleans: AAAI, 2018: 1819-1826.
[23]   SUN Z, DENG Z H, NIE J Y, et al. RotatE: knowledge graph embedding by relational rotation in complex space [C]// Proceedings of the 6th International Conference on Learning Representations. Vancouver: [s. n. ], 2018.
[24]   ZHANG Z, CAI J, ZHANG Y, et al. Learning hierarchy-aware knowledge graph embeddings for link prediction [C]// Proceedings of the 34th AAAI Conference on Artificial Intelligence. New York: AAAI, 2020: 3065-3072.
[25]   ZHANG S, TAY Y, YAO L, et al. Quaternion knowledge graph embeddings [C]// Proceedings of the 33rd Conference on Neural Information Processing Systems. Vancouver: MIT Press, 2019: 2735-2745.
[26]   NGUYEN T D, NGUYEN D Q, PHUNG D. A novel embedding model for knowledge base completion based on convolutional neural network [C]// Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. New Orleans: NAACL, 2018: 327-333.
[27]   VASHISHTH S, SANYAL S, NITIN V, et al. InteractE: improving convolution-based knowledge graph embeddings by increasing feature interactions [C]// Proceedings of the 34th AAAI Conference on Artificial Intelligence. New York: AAAI, 2020: 3009-3016.
[28]   STOICA G, STRETCU O, PLATANIOS E A, et al. Contextual parameter generation for knowledge graph link prediction [C]// Proceedings of the 34th AAAI Conference on Artificial Intelligence. New York: AAAI, 2020: 3000-3008.
[29]   YANG B, YIH W, HE X, et al. Embedding entities and relations for learning and inference in knowledge bases [EB/OL]. [2021-05-10]. https://arxiv.org/pdf/1412.6575v4.pdf.
[30]   VASHISHTH S, SANYAL S, NITIN V, et al. Composition-based multi-relational graph convolutional networks [C]// Proceedings of the 7th International Conference on Learning Representations. New Orleans: [s. n. ], 2019.
[31]   WANG R, LI B, HU S, et al. Knowledge graph embedding via graph attenuated attention networks [J]. IEEE Access, 2019, 8: 5212-5224.
[32]   VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need [C]// Proceedings of the 31st Conference on Neural Information Processing Systems. Long Beach: MIT Press, 2017: 5998-6008.
[33]   VELICKOVIC P, CUCURULL G, CASANOVA A, et al. Graph attention networks [C]// International Conference on Learning Representations. Vancouver: [s. n. ], 2018.
[34]   SUN Z, VASHISHTH S, SANYAL S, et al. A re-evaluation of knowledge graph completion methods [C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Seattle: ACL, 2020: 5516-5522.
[35]   BANSAL T, JUAN D C, RAVI S, et al. A2N: attending to neighbors for knowledge graph inference [C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence: ACL, 2019: 4387-4392.
[36]   TOUTANOVA K, CHEN D, PANTEL P, et al. Representing text for joint embedding of text and knowledge bases [C]// Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Lisbon: ACL, 2015: 1499-1509.