Journal of Zhejiang University (Engineering Science), 2023, Vol. 57, Issue 2: 235-242    DOI: 10.3785/j.issn.1008-973X.2023.02.003
Computer Technology
Temporal knowledge graph representation learning based on relational aggregation
Feng-long SU, Ning JING*
School of Electronic Science, National University of Defense Technology, Changsha 410073, China
Abstract:

Aiming at the limitation that static knowledge graph representation methods cannot model time, a temporal knowledge graph representation learning method based on relational aggregation was designed, starting from the practical requirements of temporal graph applications, to describe and reason about the temporal information of dynamic knowledge graphs. Unlike discrete snapshot-based temporal networks, temporal information was treated as a link attribute between entities, and a time-aware relational graph attention encoder was proposed to learn entity representations of temporal knowledge graphs. The neighborhood relations and timestamps of each central node were incorporated into the graph structure and assigned different attention weights, so that temporal knowledge was aggregated efficiently. Results on public temporal knowledge graph datasets showed that, compared with traditional temporal graph encoding frameworks, the attention-aggregation-based representation learning method was strongly competitive on both the completion and alignment tasks, with especially pronounced gains on highly time-sensitive entities, demonstrating the superiority and robustness of the algorithm.

Key words: graph attention network; temporal knowledge graph; representation learning; time-awareness; relational aggregation
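The abstract describes an encoder that folds each central node's neighborhood relations and timestamps into attention weights before aggregation. The PyTorch sketch below only illustrates that idea; it is not the authors' released code, and the layer dimensions, the concatenation-based attention scorer and the residual self-connection are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TimeAwareRelationalAttention(nn.Module):
    """One aggregation layer: neighbors send messages built from their entity,
    relation and timestamp embeddings; the central node weighs them by attention."""

    def __init__(self, num_entities, num_relations, num_timestamps, dim=64):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        self.time = nn.Embedding(num_timestamps, dim)
        self.att = nn.Linear(4 * dim, 1)   # scores [head; tail; relation; timestamp]
        self.out = nn.Linear(dim, dim)

    def forward(self, quads):
        """quads: LongTensor of shape (E, 4) holding (head, relation, tail, timestamp)."""
        h, r, t, tau = quads.unbind(dim=1)
        msg = self.ent(t) + self.rel(r) + self.time(tau)               # message carried by each edge
        score = F.leaky_relu(self.att(torch.cat(
            [self.ent(h), self.ent(t), self.rel(r), self.time(tau)], dim=-1))).squeeze(-1)
        alpha = torch.zeros_like(score)
        for node in h.unique():                                        # softmax within each neighborhood
            mask = h == node
            alpha[mask] = F.softmax(score[mask], dim=0)
        agg = torch.zeros_like(self.ent.weight)
        agg.index_add_(0, h, alpha.unsqueeze(-1) * msg)                # weighted aggregation per head node
        return torch.tanh(self.out(agg + self.ent.weight))             # updated entity representations


if __name__ == "__main__":
    layer = TimeAwareRelationalAttention(num_entities=5, num_relations=2, num_timestamps=3, dim=16)
    quads = torch.tensor([[0, 0, 1, 0], [0, 1, 2, 1], [3, 0, 4, 2], [3, 1, 0, 1]])
    print(layer(quads).shape)  # torch.Size([5, 16])
```

In practice the per-node softmax loop would be replaced by a scatter-based softmax and several such layers would be stacked; the sketch keeps only the aggregation idea stated in the abstract.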
Received: 2022-08-01    Published: 2023-02-28
CLC:  TP 391  
Corresponding author: Ning JING. E-mail: xueshu2021@qq.com; jingningnudt@163.com
About the author: Feng-long SU (born 1988), male, Ph.D., engaged in knowledge graph research. orcid.org/0000-0002-7595-7516. E-mail: xueshu2021@qq.com

Cite this article:

苏丰龙,景宁. 基于关系聚合的时序知识图谱表示学习[J]. 浙江大学学报(工学版), 2023, 57(2): 235-242.

Feng-long SU, Ning JING. Temporal knowledge graph representation learning based on relational aggregation. Journal of Zhejiang University (Engineering Science), 2023, 57(2): 235-242.

Link to this article:

https://www.zjujournals.com/eng/CN/10.3785/j.issn.1008-973X.2023.02.003        https://www.zjujournals.com/eng/CN/Y2023/V57/I2/235

Fig. 1  Framework of the TA-GAT model for TKG completion and time-aware entity alignment (EA)
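Fig. 1 presents a single TA-GAT encoder feeding two task heads, one for TKG completion and one for time-aware EA. As a rough illustration of how one set of encoder outputs can serve both tasks, the sketch below pairs a TransE-style completion score with a margin-based alignment objective; both heads, the negative-sampling scheme and the hyper-parameters are assumptions, since the paper's actual decoders and losses are not reproduced on this page.

```python
import torch
import torch.nn.functional as F


def completion_score(ent, rel, time, quads):
    """Plausibility of (head, relation, tail, timestamp); higher is better.
    A TransE-style translational score is assumed here for illustration."""
    h, r, t, tau = quads.unbind(dim=1)
    return -torch.norm(ent[h] + rel[r] + time[tau] - ent[t], p=1, dim=-1)


def alignment_loss(src_emb, tgt_emb, seed_pairs, margin=1.0):
    """Margin loss over seed entity pairs: pull aligned entities together,
    push randomly sampled negatives apart (the sampling scheme is an assumption)."""
    s, t = seed_pairs.unbind(dim=1)
    neg = torch.randint(len(tgt_emb), (len(t),))
    pos_dist = 1.0 - F.cosine_similarity(src_emb[s], tgt_emb[t], dim=-1)
    neg_dist = 1.0 - F.cosine_similarity(src_emb[s], tgt_emb[neg], dim=-1)
    return F.relu(pos_dist - neg_dist + margin).mean()
```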
Dataset      Nodes   Relations  Time points  Train    Valid   Test    Quadruples  Time span
ICEWS14      7128    230        365          72826    8941    8963    90730       2014
ICEWS05-15   10488   251        4017         386962   46275   46092   479329      2005-2015
GDELT-500    500     20         366          2735685  341961  341961  3419607     2015.04.01-2016.03.31
Table 1  Statistics of the TKGC datasets
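Table 1 above and Table 2 below count facts as (head, relation, tail, timestamp) quadruples per split. A minimal loading sketch, assuming the tab-separated quadruple files and train/valid/test file names commonly used for the ICEWS/GDELT benchmarks (the exact file layout is not given on this page):

```python
from pathlib import Path


def load_quadruples(path, ent2id, rel2id, ts2id):
    """Read tab-separated (head, relation, tail, time) lines, assigning ids on first sight."""
    quads = []
    for line in Path(path).read_text(encoding="utf-8").splitlines():
        head, rel, tail, ts = line.split("\t")[:4]
        quads.append((ent2id.setdefault(head, len(ent2id)),
                      rel2id.setdefault(rel, len(rel2id)),
                      ent2id.setdefault(tail, len(ent2id)),
                      ts2id.setdefault(ts, len(ts2id))))
    return quads


if __name__ == "__main__":
    ent2id, rel2id, ts2id = {}, {}, {}
    splits = {name: load_quadruples(f"ICEWS14/{name}.txt", ent2id, rel2id, ts2id)
              for name in ("train", "valid", "test")}
    # these counts should reproduce the ICEWS14 row of Table 1
    print("nodes:", len(ent2id), "relations:", len(rel2id), "time points:", len(ts2id))
    print({k: len(v) for k, v in splits.items()}, "quadruples:", sum(len(v) for v in splits.values()))
```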
Dataset                  Nodes 1  Nodes 2  Relations 1  Relations 2  Timestamps  Quadruples 1  Quadruples 2  Entity pairs  Seeds
DICEWS-1K / DICEWS-200   9517     9537     247          246          4017        307552        307553        8566          1000 / 200
YAGO-WIKI50K             49629    49222    11           30           245         221050        317814        49172         5000
YAGO-WIKI20K             19493    19929    32           130          405         83583         142568        19462         400
Table 2  Statistics of the time-aware entity alignment datasets
Model         ICEWS14                          ICEWS05-15                       GDELT-500
              MRR     Hit@1   Hit@3   Hit@10   MRR     Hit@1   Hit@3   Hit@10   MRR     Hit@1   Hit@3   Hit@10
TTransE       0.255   0.074   –       0.601    0.271   0.084   –       0.616    0.115   0.000   0.160   0.318
HyTE          0.297   0.108   0.416   0.655    0.316   0.116   0.445   0.681    0.118   0.000   0.165   0.326
TA-DistMult   0.477   0.363   –       0.686    0.474   0.346   –       0.728    0.206   0.124   0.219   0.365
DE-SimplE     0.526   0.418   0.592   0.725    0.513   0.392   0.578   0.748    0.230   0.141   0.248   0.403
TeMP-SA       0.607   0.484   0.684   *0.840   *0.680  0.553   *0.769  *0.913   0.232   0.152   0.245   0.377
TNTComplEx    0.620   0.520   0.660   0.760    0.670   0.590   0.710   0.810    –       –       –       –
ChronoR       0.625   0.547   0.669   0.773    0.675   0.596   0.723   0.820    –       –       –       –
TA-GAT        *0.637  *0.556  *0.685  0.790    0.676   *0.597  0.720   0.816    *0.241  *0.153  *0.252  *0.421
Table 3  Completion results of TA-GAT on temporal knowledge graphs (* marks the best result in each column; – means not reported)
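The MRR and Hit@k figures in Table 3 come from ranking the true tail entity against all candidate entities for every test quadruple. A minimal sketch of that evaluation follows; `score_fn` stands in for any trained scoring model (it is assumed to take batched head, relation, tail and timestamp ids), and the filtering of other known true facts from the candidate set, which each cited system applies in its own way, is omitted.

```python
import torch


def evaluate_completion(score_fn, test_quads, num_entities, ks=(1, 3, 10)):
    """test_quads: LongTensor (N, 4) of (head, relation, tail, timestamp) test facts."""
    ranks = []
    for h, r, t, tau in test_quads.tolist():
        cand = torch.arange(num_entities)
        scores = score_fn(torch.full_like(cand, h), torch.full_like(cand, r),
                          cand, torch.full_like(cand, tau))
        ranks.append((scores > scores[t]).sum().item() + 1)   # rank of the true tail, 1 = best
    ranks = torch.tensor(ranks, dtype=torch.float)
    return {"MRR": (1.0 / ranks).mean().item(),
            **{f"Hit@{k}": (ranks <= k).float().mean().item() for k in ks}}
```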
Model       DICEWS-1K                   DICEWS-200                  YAGO-WIKI50K
            MRR     Hits@1   Hits@10    MRR     Hits@1   Hits@10    MRR     Hits@1   Hits@10
MTransE     0.150   0.101    0.241      0.104   0.067    0.175      0.322   0.242    0.477
JAPE        0.198   0.144    0.298      0.138   0.098    0.210      0.345   0.271    0.488
AlignE      0.593   0.508    0.751      0.303   0.222    0.457      0.800   0.756    0.883
GCN-Align   0.291   0.204    0.466      0.231   0.165    0.363      0.581   0.512    0.711
MuGNN       0.617   0.525    0.794      0.412   0.367    0.583      0.808   0.762    0.890
MRAEA       0.745   0.675    0.870      0.564   0.476    0.733      0.848   0.806    0.913
RREA        0.780   0.722    0.883      0.719   0.659    0.824      0.868   0.828    0.938
TU-GAT      0.748   0.681    0.870      0.576   0.489    0.739      0.815   0.767    0.902
TA-GAT      *0.900  *0.876   *0.942     *0.849  *0.815   *0.909     *0.903  *0.871   *0.959
Imp/%       15.38   21.33    6.68       18.08   23.67    10.31      4.03    5.19     2.23
Table 4  Entity alignment results of TA-GAT on temporal knowledge graphs (* marks the best result in each column; Imp/% is the relative improvement of TA-GAT over the strongest baseline)
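Table 4 reports ranking metrics over the held-out test entity pairs, and the Imp/% row is the relative gain of TA-GAT over the strongest baseline in each column. A minimal sketch, assuming alignment is read off cosine similarity between the two graphs' entity embeddings (the paper's exact matching procedure is not restated here):

```python
import torch
import torch.nn.functional as F


def evaluate_alignment(src_emb, tgt_emb, ks=(1, 10)):
    """Row i of src_emb and tgt_emb is assumed to be the i-th aligned test pair."""
    sim = F.normalize(src_emb, dim=1) @ F.normalize(tgt_emb, dim=1).T   # cosine similarities
    ranks = (sim >= sim.diagonal().unsqueeze(1)).sum(dim=1).float()     # rank of the true match
    return {"MRR": (1.0 / ranks).mean().item(),
            **{f"Hits@{k}": (ranks <= k).float().mean().item() for k in ks}}


def relative_improvement(ours, best_baseline):
    """The Imp/% row of Table 4."""
    return (ours - best_baseline) / best_baseline * 100


# DICEWS-1K MRR: TA-GAT 0.900 vs the best baseline (RREA, 0.780) -> 15.38 %
print(round(relative_improvement(0.900, 0.780), 2))
```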
Model    High time sensitivity       Low time sensitivity        Overall
         MRR     Hits@1   Hits@10    MRR     Hits@1   Hits@10    MRR     Hits@1   Hits@10
TA-GAT   0.805   0.797    0.892      0.331   0.284    0.419      0.503   0.470    0.590
TU-GAT   0.700   0.639    0.818      0.314   0.264    0.411      0.454   0.400    0.558
Table 5  EA results on YAGO-WIKI20K subsets with different time sensitivity
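Table 5 reruns the alignment evaluation separately on high and low time-sensitivity subsets of YAGO-WIKI20K. The exact splitting criterion is not restated on this page; the sketch below only illustrates one plausible proxy, namely the fraction of an entity's facts that carry a timestamp, to show how such a high/low split could be produced.

```python
def time_sensitivity(facts):
    """facts: list of (relation, neighbor, timestamp_or_None) tuples for one entity."""
    return sum(ts is not None for _, _, ts in facts) / len(facts) if facts else 0.0


def split_by_sensitivity(entity_facts, threshold=0.5):
    """Bucket entities into high / low time sensitivity by the proxy above."""
    high = {e for e, facts in entity_facts.items() if time_sensitivity(facts) >= threshold}
    return high, set(entity_facts) - high
```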
1 SUN Z, ZHANG Q, HU W, et al. A benchmarking study of embedding-based entity alignment for knowledge graphs [J]. Proceedings of the VLDB Endowment, 2020, 13(11): 2326-2340
2 MAO X, WANG W T, LAN M, et al. MRAEA: an efficient and robust entity alignment approach for cross-lingual knowledge graph [C]// The 13th ACM International Conference on Web Search and Data Mining. Houston: ACM, 2020: 420-428.
3 WANG Z, LV Q, LAN X, et al. Cross-lingual knowledge graph alignment via graph convolutional networks [C]// Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels: Association for Computational Linguistics, 2018: 349-357.
4 WU Y, LIU X, FENG Y, et al. Jointly learning entity and relation representations for entity alignment [C]// Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing. Hong Kong: Association for Computational Linguistics, 2019: 240-249.
5 DASGUPTA S S, RAY S N, TALUKDAR P P. HyTE: hyperplane-based temporally aware knowledge graph embedding [C]// Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels: Association for Computational Linguistics, 2018: 2001-2011.
6 BORDES A, USUNIER N, GARCÍA-DURÁN A, et al. Translating embeddings for modeling multi-relational data [C]// 27th Annual Conference on Neural Information Processing Systems. Lake Tahoe: [s.n.], 2013: 2787-2795.
7 GARCÍA-DURÁN A, DUMANCIC S, NIEPERT M. Learning sequence encoders for temporal knowledge graph completion [C]// Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels: Association for Computational Linguistics, 2018: 4816-4821.
8 LACROIX T, OBOZINSKI G, USUNIER N. Tensor decompositions for temporal knowledge base completion [C]// 8th International Conference on Learning Representations. Addis Ababa: [s.n.], 2020.
9 GOEL R, KAZEMI S M, BRUBAKER M A, et al. Diachronic embedding for temporal knowledge graph completion [C]// The 34th AAAI Conference on Artificial Intelligence. New York: AAAI Press, 2020: 3988-3995.
10 XU C, NAYYERI M, ALKHOURY F, et al. Temporal knowledge graph embedding model based on additive time series decomposition [EB/OL]. [2022-07-15]. https://arXiv.org/abs/1911.07893v1.
11 WU J, CAO M, CHEUNG J C K, et al. TeMP: temporal message passing for temporal knowledge graph completion [C]// Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. [s.l.]: Association for Computational Linguistics, 2020: 5730-5746.
12 CHEN M, TIAN Y, YANG M, et al. Multilingual knowledge graph embeddings for cross-lingual knowledge alignment [C]// Proceedings of the 26th International Joint Conference on Artificial Intelligence. Melbourne: [s.n.], 2017: 1511-1517.
13 ZHU H, XIE R, LIU Z, et al. Iterative entity alignment via joint knowledge embeddings [C]// Proceedings of the 26th International Joint Conference on Artificial Intelligence. Melbourne: [s.n.], 2017: 4258-4264.
14 GUO L, SUN Z, HU W. Learning to exploit long-term relational dependencies in knowledge graphs [C]// Proceedings of the 36th International Conference on Machine Learning. Long Beach: PMLR, 2019: 2505-2514.
15 CHEN M, TIAN Y, CHANG K, et al. Co-training embeddings of knowledge graphs and entity descriptions for cross-lingual entity alignment [C]// Proceedings of the 27th International Joint Conference on Artificial Intelligence. Stockholm: [s.n.], 2018: 3998-4004.
16 LIU F, CHEN M, ROTH D, et al. Visual pivoting for (unsupervised) entity alignment [C]// 35th AAAI Conference on Artificial Intelligence. [s.l.]: AAAI Press, 2021: 4257-4266.
17 LAMPLE G, CONNEAU A, RANZATO M, et al. Word translation without parallel data [C]// 6th International Conference on Learning Representations. Vancouver: [s.n.], 2018.
18 SADEGHIAN A, ARMANDPOUR M, COLAS A, et al. ChronoR: rotation based temporal knowledge graph embedding [C]// 35th AAAI Conference on Artificial Intelligence. [s.l.]: AAAI Press, 2021: 6471-6479.
19 SUN Z, HU W, LI C. Cross-lingual entity alignment via joint attribute-preserving embedding [C]// 16th International Semantic Web Conference. Vienna: Springer, 2017: 628-644.
20 SUN Z, HU W, ZHANG Q, et al. Bootstrapping entity alignment with knowledge graph embedding [C]// Proceedings of the 27th International Joint Conference on Artificial Intelligence. Stockholm: [s.n.], 2018: 4396-4402.
21 CAO Y, LIU Z, LI C, et al. Multi-channel graph neural network for entity alignment [C]// Proceedings of the 57th Conference of the Association for Computational Linguistics. Florence: Association for Computational Linguistics, 2019: 1452-1461.