Computer Technology and Control Engineering
|
Structure-aware model for few-shot knowledge completion

Rongtai YANG, Yubin SHAO*, Qingzhi DU

Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China
[1] YANG Donghua, HE Tao, WANG Hongzhi, et al. Survey on knowledge graph embedding learning [J]. Journal of Software, 2022, 33(9): 3370-3390.

[2] ZHANG Tiancheng, TIAN Xue, SUN Xianghui, et al. Overview on knowledge graph embedding technology research [J]. Journal of Software, 2023, 34(1): 277-311.

[3] ZHONG L, WU J, LI Q, et al. A comprehensive survey on automatic knowledge graph construction [J]. ACM Computing Surveys, 2024, 56(4): 1-62.

[4] ZHANG N, DENG S, SUN Z, et al. Long-tail relation extraction via knowledge graph embeddings and graph convolution networks [C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics. [S.l.]: ACL, 2019: 3016-3025.

[5] WU T, MA H, WANG C, et al. Heterogeneous representation learning and matching for few-shot relation prediction [J]. Pattern Recognition, 2022, 131: 108830. doi: 10.1016/j.patcog.2022.108830.

[6] ZHAO Kailin, JIN Xiaolong, WANG Yuanzhuo. Survey on few-shot learning [J]. Journal of Software, 2021, 32(2): 349-369.

[7] LU J, GONG P, YE J, et al. A survey on machine learning from few samples [J]. Pattern Recognition, 2023, 139: 109480. doi: 10.1016/j.patcog.2023.109480.

[8] HUANG Q, REN H, LESKOVEC J. Few-shot relational reasoning via connection subgraph pretraining [C]// Proceedings of the 36th International Conference on Neural Information Processing Systems. New York: Curran Associates, Inc., 2022: 6397-6409.

[9] YUAN X, XU C, LI P, et al. Relational learning with hierarchical attention encoder and recoding validator for few-shot knowledge graph completion [C]// Proceedings of the 37th ACM/SIGAPP Symposium on Applied Computing. [S.l.]: ACM, 2022: 786-794.

[10] LIANG Y, ZHAO S, CHENG B, et al. TransAM: transformer appending matcher for few-shot knowledge graph completion [J]. Neurocomputing, 2023, 537: 61-72. doi: 10.1016/j.neucom.2023.03.049.

[11] LI Y, YU K, ZHANG Y, et al. Adaptive prototype interaction network for few-shot knowledge graph completion [J]. IEEE Transactions on Neural Networks and Learning Systems, 2024, 35(11): 15237-15250. doi: 10.1109/TNNLS.2023.3283545.

[12] NIU G, LI Y, TANG C, et al. Relational learning with gated and attentive neighbor aggregator for few-shot knowledge graph completion [C]// Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval. [S.l.]: ACM, 2021: 213-222.

[13] ZHANG Y, QIAN Y, YE Y, et al. Adapting distilled knowledge for few-shot relation reasoning over knowledge graphs [C]// Proceedings of the 2022 SIAM International Conference on Data Mining. Philadelphia: Society for Industrial and Applied Mathematics, 2022: 666-674.

[14] CAI L, WANG L, YUAN R, et al. Meta-learning based dynamic adaptive relation learning for few-shot knowledge graph completion [J]. Big Data Research, 2023, 33: 100394. doi: 10.1016/j.bdr.2023.100394.

[15] DWIVEDI V P, BRESSON X. A generalization of transformer networks to graphs [EB/OL]. (2021-01-24) [2024-05-03]. https://arxiv.org/pdf/2012.09699.

[16] YING C, CAI T, LUO S, et al. Do transformers really perform badly for graph representation? [C]// Proceedings of the 35th Conference on Neural Information Processing Systems. Sydney: [s.n.], 2021: 1-19.

[17] CHEN D, O'BRAY L, BORGWARDT K. Structure-aware transformer for graph representation learning [C]// Proceedings of the 39th International Conference on Machine Learning. Baltimore: PMLR, 2022: 3469-3489.

[18] LUO Y, THOST V, SHI L. Transformers over directed acyclic graphs [C]// Proceedings of the 37th Conference on Neural Information Processing Systems. [S.l.]: Curran Associates, Inc., 2023: 47764-47782.

[19] LI Y, YU K, HUANG X, et al. Learning inter-entity-interaction for few-shot knowledge graph completion [C]// Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing. [S.l.]: ACL, 2022: 7691-7700.

[20] SHENG J, GUO S, CHEN Z, et al. Adaptive attentional network for few-shot knowledge graph completion [C]// Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. [S.l.]: ACL, 2020: 1681-1691.

[21] CHEN Cheng, ZHANG Hao, LI Yongqiang, et al. Knowledge graph link prediction based on relational generative graph attention network [J]. Journal of Zhejiang University: Engineering Science, 2022, 56(5): 1025-1034.

[22] VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph attention networks [C]// International Conference on Learning Representations. Vancouver: [s.n.], 2018: 1-12.

[23] WANG X, HE X, CAO Y, et al. KGAT: knowledge graph attention network for recommendation [C]// Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. [S.l.]: ACM, 2019: 950-958.

[24] YE Y, JI S. Sparse graph attention networks [J]. IEEE Transactions on Knowledge and Data Engineering, 2023, 35(1): 905-916.

[25] SU G, WANG H, ZHANG Y, et al. Simple and deep graph attention networks [J]. Knowledge-Based Systems, 2024, 293: 111649. doi: 10.1016/j.knosys.2024.111649.

[26] YUAN H, JI S. StructPool: structured graph pooling via conditional random fields [C]// International Conference on Learning Representations. Addis Ababa: [s.n.], 2020: 1-12.

[27] BIANCHI F M, GRATTAROLA D, ALIPPI C. Spectral clustering with graph neural networks for graph pooling [C]// Proceedings of the 37th International Conference on Machine Learning. Vienna: [s.n.], 2020: 1-13.

[28] BAEK J, KANG M, HWANG S J. Accurate learning of graph representations with graph multiset pooling [EB/OL]. (2021-06-28) [2024-05-03]. https://arxiv.org/pdf/2102.11533.
|
|