Journal of ZheJiang University (Engineering Science)  2025, Vol. 59 Issue (7): 1394-1402    DOI: 10.3785/j.issn.1008-973X.2025.07.007
    
Structure-aware model for few-shot knowledge completion
Rongtai YANG(),Yubin SHAO*(),Qingzhi DU
Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China

Abstract  

A new model for few-shot knowledge completion was proposed to address the problem that existing knowledge completion models fail to adequately perceive the neighborhood topological structure during knowledge representation. A structure-aware encoder based on an attention mechanism was employed to encode triples in the process of knowledge representation. The encoder partitioned neighboring nodes into clusters according to the topological structure, and the structural information from each cluster in the neighborhood was fused to reinforce the model's structure-aware ability. An adaptive prediction network was adopted to compute the scores of the triples to be predicted, enhancing the stability of the model's predictions. Experiments were carried out on the NELL-One and Wiki-One datasets. Results show that, compared with the baseline models, the proposed model improves the mean reciprocal rank (MRR) and the hit rates of correct results ranked in the top 10, top 5 and top 1 (Hits@10, Hits@5, Hits@1) by 0.018, 0.021, 0.024 and 0.016 on NELL-One, and by 0.019, 0.055, 0.039 and 0.038 on Wiki-One, respectively. The proposed model effectively leverages topological information from neighborhoods, thereby enhancing the accuracy of knowledge completion.
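The abstract describes a two-step neighborhood encoding: neighbors are partitioned into topology-based clusters, each cluster is pooled, and the pooled cluster vectors are fused by attention against the center entity. The following is a minimal illustrative sketch of that idea, not the paper's exact encoder; the function names, the mean pooling, and the dot-product attention are assumptions for illustration.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def structure_aware_encode(center, neighbors, cluster_ids):
    """Illustrative sketch: pool each topology-based neighbor cluster,
    then fuse cluster representations by attention against the center
    entity's embedding."""
    reps, scores = [], []
    for c in sorted(set(cluster_ids)):
        idx = [i for i, cid in enumerate(cluster_ids) if cid == c]
        rep = neighbors[idx].mean(axis=0)   # pool one cluster's structure
        reps.append(rep)
        scores.append(rep @ center)         # attention score vs. the entity
    weights = softmax(np.array(scores))     # one weight per cluster
    context = (weights[:, None] * np.array(reps)).sum(axis=0)
    return center + context                 # structure-enhanced embedding

# toy usage: 4 neighbors in 2 clusters, 3-dimensional embeddings
center = np.array([1.0, 0.0, 0.0])
neighbors = np.array([[1.0, 0.0, 0.0], [0.9, 0.1, 0.0],
                      [0.0, 1.0, 0.0], [0.0, 0.9, 0.1]])
emb = structure_aware_encode(center, neighbors, [0, 0, 1, 1])
```

The cluster whose pooled representation best aligns with the entity receives the larger attention weight, so structurally coherent neighborhoods dominate the fused context vector.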



Key words: few-shot knowledge completion; knowledge representation; structure-aware; topological structure; attention mechanism
Received: 04 May 2024      Published: 25 July 2025
CLC:  TP 391  
Fund: Yunnan Provincial Key Laboratory of Media Convergence Project (220235205).
Corresponding Authors: Yubin SHAO     E-mail: rongtaiyangmse@163.com;shaoyubin999@qq.com
Cite this article:

Rongtai YANG,Yubin SHAO,Qingzhi DU. Structure-aware model for few-shot knowledge completion. Journal of ZheJiang University (Engineering Science), 2025, 59(7): 1394-1402.

URL:

https://www.zjujournals.com/eng/10.3785/j.issn.1008-973X.2025.07.007     OR     https://www.zjujournals.com/eng/Y2025/V59/I7/1394


Fig.1 Semantic distribution and structural characteristics of neighborhood nodes for an entity
Fig.2 Framework of structure-aware model for few-shot knowledge completion
Dataset      Entities    Relations   Triples
NELL-One     68,545      358         181,109
Wiki-One     4,838,244   822         5,859,240
Tab.1 Statistics of the experimental datasets ($N_{\mathrm{S}}$: counts of entities, relations and triples)
Model type       Model      NELL-One                            Wiki-One
                            MRR     Hits@10  Hits@5   Hits@1    MRR     Hits@10  Hits@5   Hits@1
Metric learning  MFEN       0.226   0.401    0.287    0.179     0.268   0.372    0.322    0.201
                 TransAM    0.215   0.348    0.245    0.164     0.302   0.346    0.341    0.264
                 APINet     0.305   0.496    0.405    0.208     0.342   0.473    0.419    0.283
Meta learning    GANA       0.311   0.481    0.413    0.221     0.322   0.418    0.379    0.276
                 ADK-KG     0.302   0.379    0.301    0.224     0.265   0.346    0.283    0.231
                 DARL       0.213   0.374    0.326    0.138     0.345   0.446    0.400    0.290
Metric learning  SAM        0.329   0.517    0.437    0.240     0.364   0.528    0.458    0.328
Tab.2 Few-shot link prediction results of different models on two datasets
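The metrics reported above are computed from the 1-based rank that each correct entity receives among the candidate entities of a query. A minimal sketch (the function name and input format are illustrative assumptions):

```python
def mrr_and_hits(ranks, ks=(10, 5, 1)):
    """Compute mean reciprocal rank (MRR) and Hits@k from the 1-based
    ranks assigned to the correct entities in link prediction queries."""
    n = len(ranks)
    mrr = sum(1.0 / r for r in ranks) / n              # mean of 1/rank
    hits = {k: sum(r <= k for r in ranks) / n for k in ks}
    return mrr, hits

# toy usage: correct entities ranked 1st, 2nd and 10th in three queries
mrr, hits = mrr_and_hits([1, 2, 10])
# mrr = (1 + 0.5 + 0.1) / 3 ≈ 0.533; hits[10] = 1.0
```

Hits@k counts a query as a success when the correct entity appears within the top k candidates, so Hits@1 is the strictest of the three reported rates.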
M1   M2   MRR     Hits@10   Hits@5   Hits@1
          0.364   0.528     0.458    0.328
          0.338   0.442     0.406    0.281
          0.354   0.489     0.443    0.317
Tab.3 Impact of core modules on knowledge completion
Fig.3 Density of topology graph with different maximum numbers of sampled neighbors
Fig.4 Knowledge completion performance under different maximum numbers of sampled neighbors
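Fig.3 and Fig.4 study the effect of capping the number of neighbors sampled per entity. Capped sampling can be sketched as below; uniform sampling is an illustrative choice here, not necessarily the paper's strategy, and the adjacency format is an assumption.

```python
import random

def sample_neighbors(adjacency, entity, max_n, seed=0):
    """Return at most max_n neighbors of an entity. If the neighborhood
    fits within the cap, keep it whole; otherwise sample uniformly
    (an illustrative choice) with a fixed seed for reproducibility."""
    nbrs = adjacency.get(entity, [])
    if len(nbrs) <= max_n:
        return list(nbrs)
    return random.Random(seed).sample(nbrs, max_n)

# toy usage: an entity with 100 neighbors, capped at 20
adj = {"e1": [f"n{i}" for i in range(100)]}
kept = sample_neighbors(adj, "e1", 20)
```

A larger cap keeps the topology graph denser (Fig.3) at higher cost, so the cap trades structural coverage against efficiency.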