Structure-aware model for few-shot knowledge completion
Rongtai YANG, Yubin SHAO*, Qingzhi DU
Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China |
Abstract A new few-shot knowledge completion model was proposed to address the problem that existing knowledge completion models fail to adequately perceive the neighborhood topological structure during knowledge representation. A structure-aware encoder based on an attention mechanism was employed to encode triples during knowledge representation. The encoder partitioned neighboring nodes into clusters according to the topological structure and fused the structural information of each cluster in the neighborhood to strengthen the model's structure-aware ability. An adaptive prediction network was adopted to compute the scores of the triples to be predicted, enhancing the stability of the model's predictions. Comparative experiments were carried out on the NELL-One and Wiki-One datasets. Results show that, compared with the baseline models, the proposed model improves the mean reciprocal rank and the hit rates of correct results ranked in the top 10, top 5, and top 1 by 0.018, 0.021, 0.024, and 0.016 on NELL-One and by 0.019, 0.055, 0.039, and 0.038 on Wiki-One, respectively. The proposed model effectively leverages topological information from neighborhoods, thereby enhancing the accuracy of knowledge completion.
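The abstract describes the core idea of the encoder: partition an entity's neighbors into clusters by topological structure, attend over each cluster, and fuse the cluster summaries into a structure-aware entity representation. The following is a minimal illustrative sketch of that idea, not the authors' implementation; all function names, the clustering input, and the dimensions are assumptions introduced here.

```python
# Illustrative sketch of cluster-wise attention over an entity's neighborhood.
# NOT the paper's implementation: cluster assignments are taken as given, and
# the fusion is a simple attention-weighted sum.
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def structure_aware_encode(entity, neighbors, cluster_ids):
    """entity: (d,) embedding; neighbors: (n, d); cluster_ids: n cluster labels."""
    summaries = []
    for c in sorted(set(cluster_ids)):
        idx = [i for i, cid in enumerate(cluster_ids) if cid == c]
        members = neighbors[idx]                    # (k, d) members of cluster c
        weights = softmax(members @ entity)         # attention of entity over cluster
        summaries.append(weights @ members)         # (d,) cluster summary
    summaries = np.stack(summaries)                 # (num_clusters, d)
    cluster_weights = softmax(summaries @ entity)   # attention over cluster summaries
    return entity + cluster_weights @ summaries     # structure-aware representation

rng = np.random.default_rng(0)
h = rng.normal(size=8)                              # head-entity embedding
nbrs = rng.normal(size=(5, 8))                      # five neighbor embeddings
rep = structure_aware_encode(h, nbrs, cluster_ids=[0, 0, 1, 1, 2])
print(rep.shape)  # (8,)
```

In the paper the cluster partition is derived from the topological structure itself and the scoring is done by an adaptive prediction network; this sketch only shows the two-level attend-then-fuse pattern the abstract outlines.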
Received: 04 May 2024
Published: 25 July 2025
Fund: Yunnan Provincial Key Laboratory of Media Convergence Project (220235205).
Corresponding author:
Yubin SHAO
E-mail: rongtaiyangmse@163.com;shaoyubin999@qq.com
Keywords: few-shot knowledge completion; knowledge representation; structure awareness; topological structure; attention mechanism