Journal of Zhejiang University (Engineering Science)  2022, Vol. 56, Issue (5): 1025-1034    DOI: 10.3785/j.issn.1008-973X.2022.05.020
Computer and Control Engineering
Knowledge graph link prediction based on relational generative graph attention network
Cheng CHEN1, Hao ZHANG2, Yong-qiang LI1,*, Yuan-jing FENG1
1. College of Information Engineering, Zhejiang University of Technology, Hangzhou 310023, China
2. Hangzhou Branch, China Mobile Group Zhejiang Co., Ltd., Hangzhou 310006, China
Abstract:

A knowledge graph link prediction method based on a relational generative graph attention network (RGGAT) was proposed to address the lack of connections among the triples in an entity's neighborhood. Different types of relations were used to generate the corresponding attention-mechanism parameters, and each neighborhood triple computed its attention coefficient with the parameters of its relation type. Entities obtained richer embedding vectors by aggregating the information of the relation-dominated neighborhood triples. The encoder and the decoder were trained jointly, with the entity and relation vectors updated by the encoder fed directly into the decoder, so that the training objectives of the encoder and the decoder remained consistent. Link prediction experiments were conducted on three public datasets, with five mainstream models selected as baselines for comparison. The Hits@10 of the RGGAT method on the three datasets reached 0.5198, 0.5104 and 0.9739, higher than those of the traditional graph attention network embedding method. In the comparison of neighborhood aggregation orders, one-hop relation aggregation improved Hits@10 by 3.59% compared with two-hop aggregation.

Key words: knowledge graph    graph attention network    entity neighborhood    relational generative parameter    link prediction
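The mechanism described in the abstract, where each relation type generates its own attention parameters and an entity aggregates its relation-dominated neighborhood triples, can be illustrated with a short sketch. The following is a minimal PyTorch sketch written under our own assumptions; the module and parameter names (RelationGenerativeAttention, param_gen, attn_vec) are illustrative and do not come from the authors' code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationGenerativeAttention(nn.Module):
    """Minimal sketch of one relation-generative attention layer:
    the attention parameters are generated from each relation embedding,
    so neighborhood triples of the same relation type share parameters."""

    def __init__(self, dim):
        super().__init__()
        self.dim = dim
        # Generates a per-relation projection matrix (dim x dim) from the
        # relation embedding; this plays the role of the relation-generated parameter.
        self.param_gen = nn.Linear(dim, dim * dim)
        # Scores a (head, relation, projected-tail) triple for attention.
        self.attn_vec = nn.Linear(3 * dim, 1, bias=False)

    def forward(self, ent_emb, rel_emb, triples):
        # ent_emb: [num_entities, dim]; rel_emb: [num_relations, dim]
        # triples: LongTensor [n, 3] of (head, relation, tail) indices
        h, r, t = triples[:, 0], triples[:, 1], triples[:, 2]
        # Relation-specific projection of the tail entity (the triple's message).
        w_r = self.param_gen(rel_emb[r]).view(-1, self.dim, self.dim)
        msg = torch.bmm(w_r, ent_emb[t].unsqueeze(-1)).squeeze(-1)
        # Unnormalised attention score of each neighborhood triple.
        score = F.leaky_relu(
            self.attn_vec(torch.cat([ent_emb[h], rel_emb[r], msg], dim=-1))
        ).squeeze(-1)
        # Normalise over the triples that share the same head entity.
        alpha = torch.zeros_like(score)
        for e in h.unique():
            mask = h == e
            alpha[mask] = F.softmax(score[mask], dim=0)
        # Aggregate the relation-dominated neighborhood messages.
        out = torch.zeros_like(ent_emb)
        out.index_add_(0, h, alpha.unsqueeze(-1) * msg)
        return F.elu(out)
```

Stacking such a layer and feeding the updated entity and relation embeddings directly into a decoder such as ConvKB or ConvE would correspond to the jointly trained encoder-decoder setup described in the abstract.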
Received: 2021-06-07    Published: 2022-05-31
CLC:  TP 391  
Funding: National Natural Science Foundation of China (62073294); Natural Science Foundation of Zhejiang Province (LZ21F030003)
Corresponding author: Yong-qiang LI    E-mail: cauchychen@126.com; yqli@zjut.edu.cn
About the first author: Cheng CHEN (b. 1996), male, master's student, research interest: knowledge graph reasoning. orcid.org/0000-0002-1051-6436. E-mail: cauchychen@126.com

Cite this article:


Cheng CHEN, Hao ZHANG, Yong-qiang LI, Yuan-jing FENG. Knowledge graph link prediction based on relational generative graph attention network. Journal of Zhejiang University (Engineering Science), 2022, 56(5): 1025-1034.

Link to this article:

https://www.zjujournals.com/eng/CN/10.3785/j.issn.1008-973X.2022.05.020        https://www.zjujournals.com/eng/CN/Y2022/V56/I5/1025

Fig. 1  Entity neighborhood aggregation process
Fig. 2  Framework of the relational generative graph attention network model
Fig. 3  Schematic of multi-hop neighborhood triples of an entity
Fig. 4  Relational generative graph attention mechanism
Fig. 5  Training process for link prediction
Dataset | #Entities | #Relations | #Train | #Valid | #Test | Avg. degree | Median degree
WN18RR | 40,493 | 11 | 86,835 | 3,034 | 3,134 | 4.24 | 3
FB15K-237 | 14,541 | 237 | 272,115 | 17,535 | 20,466 | 37.43 | 22
Kinship | 104 | 25 | 8,544 | 1,068 | 1,074 | 164.31 | 164
Table 1  Statistics of the public knowledge graph datasets
Model | MRR | MR | Hits@1 | Hits@3 | Hits@10
TransE | 0.2430 | 2300 | – | – | 0.5010
R-GCN | 0.1230 | 6700 | 0.0800 | 0.1370 | 0.2070
ConvKB | 0.2480 | 2554 | 0.0427 | 0.4450 | 0.5250
A2N | 0.4500 | – | 0.4200 | 0.4600 | 0.5100
KBGAT | 0.4151 | 1954 | 0.3381 | 0.4573 | 0.5540
RGGAT | 0.4141 | 2628 | 0.3537 | 0.4496 | 0.5198
Table 2  Link prediction results on the WN18RR dataset
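Tables 2 to 4 rank the correct entity of every test triple against all candidate entities and report mean rank (MR), mean reciprocal rank (MRR) and Hits@k. As a reference point, here is a minimal sketch of how these ranking metrics are commonly computed from per-triple ranks; the helper name is ours, not from the paper:

```python
def ranking_metrics(ranks, ks=(1, 3, 10)):
    """Compute MR, MRR and Hits@k from the rank of the correct entity
    for each test triple; lower MR and higher MRR/Hits@k are better."""
    n = len(ranks)
    mr = sum(ranks) / n                       # mean rank
    mrr = sum(1.0 / r for r in ranks) / n     # mean reciprocal rank
    hits = {k: sum(r <= k for r in ranks) / n for k in ks}
    return mr, mrr, hits

# Example with three test triples whose correct entities were ranked 1, 3 and 20.
print(ranking_metrics([1, 3, 20]))
```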
Model | MRR | MR | Hits@1 | Hits@3 | Hits@10
TransE | 0.2790 | 323 | 0.1980 | 0.3760 | 0.4410
R-GCN | 0.1640 | 600 | 0.1000 | 0.1810 | 0.3000
ConvKB | 0.2890 | 216 | 0.1980 | 0.3240 | 0.4710
A2N | 0.3170 | – | 0.2320 | 0.3480 | 0.4860
KBGAT | 0.2083 | 264 | 0.1295 | 0.2220 | 0.3746
RGGAT | 0.3326 | 235 | 0.2447 | 0.3642 | 0.5104
Table 3  Link prediction results on the FB15K-237 dataset
Model | MRR | MR | Hits@1 | Hits@3 | Hits@10
TransE | 0.3090 | 6.80 | 0.0090 | 0.6430 | 0.8410
R-GCN | 0.1090 | 25.90 | 0.0300 | 0.0880 | 0.2390
ConvKB | 0.6140 | 3.30 | 0.4362 | 0.7550 | 0.9530
KBGAT | 0.7271 | 2.70 | 0.5843 | 0.8464 | 0.9660
RGGAT | 0.8145 | 2.37 | 0.7160 | 0.9008 | 0.9739
Table 4  Link prediction results on the Kinship dataset
Hops | MRR | MR | Hits@1 | Hits@3 | Hits@10
1 | 0.4141 | 2628 | 0.3537 | 0.4496 | 0.5198
2 | 0.3916 | 3740 | 0.3288 | 0.4314 | 0.5018
3 | 0.3519 | 4026 | 0.2725 | 0.4073 | 0.4831
Table 5  Link prediction results on the WN18RR dataset with multi-hop relation neighborhoods
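Table 5 shows that aggregating only the one-hop relation neighborhood outperforms two-hop and three-hop aggregation on WN18RR. Purely for illustration, a hypothetical breadth-first routine that collects the triples in an entity's n-hop neighborhood is sketched below; it follows outgoing edges only and omits the auxiliary multi-hop relations used by graph-attention encoders such as KBGAT:

```python
from collections import defaultdict

def n_hop_triples(triples, entity, n_hops):
    """Collect the (head, relation, tail) triples reachable within n hops of an entity."""
    out_edges = defaultdict(list)
    for h, r, t in triples:
        out_edges[h].append((h, r, t))
    frontier, visited, collected = {entity}, {entity}, []
    for _ in range(n_hops):
        next_frontier = set()
        for e in frontier:
            for h, r, t in out_edges[e]:
                collected.append((h, r, t))
                if t not in visited:
                    visited.add(t)
                    next_frontier.add(t)
        frontier = next_frontier
    return collected
```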
Inverse relations | MRR | MR | Hits@1 | Hits@3 | Hits@10
Used | 0.4141 | 2628 | 0.3537 | 0.4496 | 0.5198
Not used | 0.3789 | 3109 | 0.3350 | 0.4040 | 0.4539
Table 6  Effect of inverse relations on link prediction results
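Table 6 shows that augmenting the training graph with inverse relations improves every metric. A common way to build such an augmentation, given here as an assumption rather than the authors' exact preprocessing, is to add a reversed triple with a distinct inverse-relation id for every original triple:

```python
def add_inverse_triples(triples, num_relations):
    """For every (h, r, t) add (t, r + num_relations, h), so that each
    relation r gets a distinct inverse relation with id r + num_relations."""
    augmented = list(triples)
    for h, r, t in triples:
        augmented.append((t, r + num_relations, h))
    return augmented
```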
Model | MRR | Hits@1 | Hits@3 | Hits@10
KBGAT+ConvKB | 0.7271 | 0.5843 | 0.8464 | 0.9660
KBGAT-ConvKB | 0.7742 | 0.6713 | 0.8622 | 0.9656
RGGAT-ConvKB | 0.7818 | 0.6704 | 0.8687 | 0.9758
RGGAT-ConvE | 0.8145 | 0.7160 | 0.9008 | 0.9739
Table 7  Link prediction results of different encoder-decoder combinations on the Kinship dataset
1 DALTON J, DIETZ L, ALLAN J. Entity query feature expansion using knowledge base links [C]// Proceedings of the 37th International ACM SIGIR Conference on Research and Development in Information Retrieval. Gold Coast: ACM, 2014: 365-374.
2 FERRUCCI D, BROWN E, CHU-CARROLL J, et al. Building Watson: an overview of the DeepQA project [J]. AI Magazine, 2010, 31(3): 59-79. doi: 10.1609/aimag.v31i3.2303
3 MINTZ M, BILLS S, SNOW R, et al. Distant supervision for relation extraction without labeled data [C]// Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing. Singapore: ACL, 2009: 1003-1011.
4 BOLLACKER K, EVANS C, PARITOSH P, et al. Freebase: a collaboratively created graph database for structuring human knowledge [C]// Proceedings of the 2008 ACM SIGMOD International Conference on Management of Data. Vancouver: ACM, 2008: 1247-1250.
5 GUAN Sai-ping, JIN Xiao-long, JIA Yan-tao, et al. Knowledge reasoning over knowledge graph: a survey [J]. Journal of Software, 2018, 29(10): 2966-2994. (in Chinese)
6 BORDES A, USUNIER N, GARCIA-DURAN A, et al Translating embeddings for modeling multi-relational data[J]. Advances in Neural Information Processing Systems, 2013, 26: 2787- 2795
7 NICKEL M, TRESP V, KRIEGEL H P. A three-way model for collective learning on multi-relational data [C]// Proceedings of the 28th International Conference on Machine Learning. Bellevue: ACM, 2011.
8 DETTMERS T, MINERVINI P, STENETORP P, et al. Convolutional 2D knowledge graph embeddings [C]// Proceedings of the 32nd AAAI Conference on Artificial Intelligence. New Orleans: AAAI, 2018: 1811-1818.
9 ZHANG Zhong-wei, CAO Lei, CHEN Xi-liang, et al. Survey of knowledge reasoning based on neural network [J]. Computer Engineering and Applications, 2019, 55(12): 8-19, 36. (in Chinese) doi: 10.3778/j.issn.1002-8331.1901-0358
10 SCHLICHTKRULL M, KIPF T N, BLOEM P, et al. Modeling relational data with graph convolutional networks [C]// Proceedings of the 15th European Semantic Web Conference . Heraklion: Springer, 2018: 593-607.
11 MARCHEGGIANI D, TITOV I. Encoding sentences with graph convolutional networks for semantic role labeling [C]// Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Copenhagen: ACL, 2017: 1506-1515.
12 SHANG C, TANG Y, HUANG J, et al. End-to-end structure-aware convolutional networks for knowledge base completion [C]// Proceedings of the 33rd AAAI Conference on Artificial Intelligence. Honolulu: AAAI, 2019: 3060-3067.
13 NATHANI D, CHAUHAN J, SHARMA C, et al. Learning attention-based embeddings for relation prediction in knowledge graphs [C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence: ACL, 2019: 4710-4723.
14 SOCHER R, CHEN D, MANNING C D, et al. Reasoning with neural tensor networks for knowledge base completion [C]// Proceedings of the 27th Conference on Neural Information Processing Systems. Lake Tahoe: MIT Press, 2013: 926-934.
15 NICKEL M, ROSASCO L, POGGIO T. Holographic embeddings of knowledge graphs [C]// Proceedings of the 30th AAAI Conference on Artificial Intelligence. Phoenix: AAAI, 2016: 1955-1961.
16 BALAZEVIC I, ALLEN C, HOSPEDALES T. TuckER: tensor factorization for knowledge graph completion [C]// Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Hong Kong: ACL, 2019: 5188-5197.
17 WANG Z, ZHANG J, FENG J, et al. Knowledge graph embedding by translating on hyperplanes [C]// Proceedings of the 28th AAAI Conference on Artificial Intelligence. Quebec: AAAI, 2014: 1112-1119.
18 LIN Y, LIU Z, SUN M, et al. Learning entity and relation embeddings for knowledge graph completion [C]// Proceedings of the 29th AAAI Conference on Artificial Intelligence. Austin: AAAI, 2015: 2181-2187.
19 JI G, HE S, XU L, et al. Knowledge graph embedding via dynamic mapping matrix [C]// Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing. Beijing: ACL, 2015: 687-696.
20 FAN M, ZHOU Q, CHANG E, et al. Transition-based knowledge graph embedding with relational mapping properties [C]// Proceedings of the 28th Pacific Asia Conference on Language, Information and Computing. Hong Kong: [s.n.], 2014: 328-337.
21 XIAO H, HUANG M, HAO Y, et al. TransA: an adaptive approach for knowledge graph embedding [EB/OL]. [2021-05-10]. https://arxiv.org/pdf/1509.05490v1.pdf.
22 EBISU T, ICHISE R. Toruse: knowledge graph embedding on a lie group [C]// Proceedings of the 32nd AAAI Conference on Artificial Intelligence. New Orleans: AAAI, 2018: 1819-1826.
23 SUN Z, DENG Z H, NIE J Y, et al. RotatE: knowledge graph embedding by relational rotation in complex space [C]// Proceedings of the 6th International Conference on Learning Representations. Vancouver: [s. n. ], 2018.
24 ZHANG Z, CAI J, ZHANG Y, et al. Learning hierarchy-aware knowledge graph embeddings for link prediction [C]// Proceedings of the 34th AAAI Conference on Artificial Intelligence. New York: AAAI, 2020: 3065-3072.
25 ZHANG S, TAY Y, YAO L, et al. Quaternion knowledge graph embeddings [C]// Proceedings of the 33rd Conference on Neural Information Processing Systems. Vancouver: MIT Press, 2019: 2735-2745.
26 NGUYEN T D, NGUYEN D Q, PHUNG D. A novel embedding model for knowledge base completion based on convolutional neural network [C]// Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. New Orleans: NAACL, 2018: 327-333.
27 VASHISHTH S, SANYAL S, NITIN V, et al. InteractE: improving convolution-based knowledge graph embeddings by increasing feature interactions [C]// Proceedings of the 34th AAAI Conference on Artificial Intelligence. New York: AAAI, 2020: 3009-3016.
28 STOICA G, STRETCU O, PLATANIOS E A, et al. Contextual parameter generation for knowledge graph link prediction [C]// Proceedings of the 34th AAAI Conference on Artificial Intelligence. New York: AAAI, 2020: 3000-3008.
29 YANG B, YIH W, HE X, et al. Embedding entities and relations for learning and inference in knowledge bases [EB/OL]. [2021-05-10]. https://arxiv.org/pdf/1412.6575v4.pdf.
30 VASHISHTH S, SANYAL S, NITIN V, et al. Composition-based multi-relational graph convolutional networks [C]// Proceedings of the 8th International Conference on Learning Representations. Addis Ababa: [s.n.], 2020.
31 WANG R, LI B, HU S, et al Knowledge graph embedding via graph attenuated attention networks[J]. IEEE Access, 2019, 8: 5212- 5224
32 VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need [C]// Proceedings of the 31st Conference on Neural Information Processing Systems. Long Beach: MIT Press, 2017: 5998-6008.
33 VELICKOVIC P, CUCURULL G, CASANOVA A, et al. Graph attention networks [C]// International Conference on Learning Representations. Vancouver: [s. n. ], 2018.
34 SUN Z, VASHISHTH S, SANYAL S, et al. A re-evaluation of knowledge graph completion methods [C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Seattle: ACL, 2020: 5516-5522.
35 BANSAL T, JUAN D C, RAVI S, et al. A2N: attending to neighbors for knowledge graph inference [C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence: ACL, 2019: 4387-4392.
36 TOUTANOVA K, CHEN D, PANTEL P, et al. Representing text for joint embedding of text and knowledge bases [C]// Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Lisbon: ACL, 2015: 1499-1509.