
Meta-Learning Based Few-Shot Link Prediction for Emerging Knowledge Graph

Yu-Feng Zhang, Wei Chen, Peng-Peng Zhao, Jia-Jie Xu, Jun-Hua Fang, Lei Zhao

Zhang YF, Chen W, Zhao PP et al. Meta-learning based few-shot link prediction for emerging knowledge graph. JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY 39(5): 1058−1077 Sept. 2024. DOI: 10.1007/s11390-024-2863-8. CSTR: 32374.14.s11390-024-2863-8.

Funds: This work was supported by the National Natural Science Foundation of China under Grant No. 62272332 and the Major Program of the Natural Science Foundation of Jiangsu Higher Education Institutions of China under Grant No. 22KJA520006.
    Author Bio:

    Yu-Feng Zhang received his B.S. degree in software engineering from Soochow University, Suzhou, in 2020. He is currently a Ph.D. candidate at the School of Computer Science and Technology, Soochow University, Suzhou. His main research interests include knowledge graphs, graph representation learning, and graph databases.

    Wei Chen received his Ph.D. degree in computer science from Soochow University, Suzhou, in 2018. He is currently an associate professor at the School of Computer Science and Technology, Soochow University, Suzhou. His research interests include heterogeneous information network analysis, cross-platform linkage and recommendation, spatio-temporal databases, and knowledge graph embedding and refinement.

    Peng-Peng Zhao received his Ph.D. degree in computer science from Soochow University, Suzhou, in 2008. He is now a professor at the School of Computer Science and Technology, Soochow University, Suzhou. His current research interests include data mining, deep learning, big data analysis, and recommender systems.

    Jia-Jie Xu received his M.S. degree from the University of Queensland, Brisbane, in 2006, and his Ph.D. degree from the Swinburne University of Technology, Melbourne, in 2011. He is currently a professor at the School of Computer Science and Technology, Soochow University, Suzhou. His research interests include spatiotemporal database systems, big data analytics, and recommendation systems.

    Jun-Hua Fang received his Ph.D. degree in computer science from East China Normal University, Shanghai, in 2017. He is currently an associate professor at the School of Computer Science and Technology, Soochow University, Suzhou. His research interests mainly include spatio-temporal databases, cloud computing, and distributed stream processing.

    Lei Zhao received his Ph.D. degree in computer science from Soochow University, Suzhou, in 2006. He is now a professor at the School of Computer Science and Technology, Soochow University, Suzhou. His recent research focuses on analyzing large graph databases in an effective, efficient, and secure way.

    Corresponding author:

    Wei Chen: robertchen@suda.edu.cn

    Wei Chen is the principal investigator of the two funding projects; Lei Zhao is the designer of the research framework.

  • Extended Abstract:
    Research Background

    Knowledge graphs represent massive amounts of structured, semi-structured, and unstructured information as semantic triplets, forming knowledge bases that describe the objective world with powerful capabilities for semantic expression and storage. Although a knowledge graph usually contains a large number of knowledge entries, it is generally still incomplete and in need of completion. To address this problem, the link prediction task was proposed, which aims to predict the missing triplets in a graph and thereby complete the whole graph. In recent years, knowledge graph representation learning methods based on neural networks have achieved promising results on the link prediction task.

    Objective

    Real-world knowledge graphs are not static but grow dynamically. Traditional link prediction methods often ignore this fact: they require that every entity and relation appear during training, since otherwise no feature representation can be obtained for it, and thus they cannot predict links involving entities or relations that were never seen before. Moreover, when new entities emerge in a graph, their associated graph structure and semantic information are often scarce, providing only a few training samples, which severely limits the representation ability of existing methods. We therefore aim to propose a new link prediction method that works in the few-shot setting on emerging knowledge graphs, so that knowledge graphs can be better applied in real-world scenarios.

    Methods

    This paper designs a meta-learning based knowledge graph representation learning method. It extracts high-order semantic features from the whole graph to represent newly emerging entities, introduces memory cells into the graph neural network to store the semantic information of paths, and uses meta-learning to simulate few-shot scenarios during training, so that the model can accomplish the few-shot link prediction task on emerging knowledge graphs.

    Results

    We compare the proposed model with all baselines on emerging knowledge graph datasets constructed from the public datasets FB15k-237 and NELL-995. The experiments show that the model outperforms all baselines, reaching MRR of 0.397 and 0.291 on the two datasets, respectively. Ablation studies verify the effectiveness of the proposed graph feature transferring module and the global semantic features. In addition, we study and analyze the loss convergence efficiency of the model as well as its training and testing efficiency.

    Conclusions

    For the problem of few-shot link prediction on emerging knowledge graphs, this paper proposes a meta-learning based knowledge graph representation learning method that introduces memory cells into the graph neural network, extracts global semantic information from the graph, and simulates few-shot scenarios with a meta-learning framework during training. Experiments show that the model achieves good performance on the link prediction task for emerging knowledge graphs under few-shot conditions and remains applicable when training samples are sufficient. The method can help knowledge graphs be better applied in real-world scenarios, for example alleviating the cold-start problem in recommender systems and the topic-switching problem in multi-turn dialogue.

    Abstract:

    Inductive knowledge graph embedding (KGE) aims to embed unseen entities in emerging knowledge graphs (KGs). Most recent studies of inductive KGE embed unseen entities by aggregating information from their neighboring entities and relations with graph neural networks (GNNs). However, these methods rely on the existing neighbors of unseen entities and suffer from two common problems: data sparsity and feature smoothing. Firstly, the data sparsity problem means that unseen entities usually emerge with only a few triplets, which contain insufficient information. Secondly, the effectiveness of the features extracted from original KGs degrades when these features are repeatedly propagated to represent unseen entities in emerging KGs, which is termed the feature smoothing problem. To tackle the two problems, we propose a novel model named Meta-Learning Based Memory Graph Convolutional Network (MMGCN), consisting of three components: 1) the two-layer information transforming module (TITM), developed to effectively transform information from original KGs to emerging KGs; 2) the hyper-relation feature initializing module (HFIM), proposed to extract type-level features shared between KGs and obtain a coarse-grained representation for each entity with these features; and 3) the meta-learning training module (MTM), designed to simulate few-shot emerging KGs and train the model in a meta-learning framework. Extensive experiments on the few-shot link prediction task for emerging KGs demonstrate the superiority of the proposed MMGCN over state-of-the-art methods.
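    The meta-learning training module (MTM) is described above only at a high level. As a rough illustration of the episode construction it implies, the Python sketch below simulates a few-shot emerging KG by treating a handful of entities from the original KG as "unseen" and splitting their triplets into a K-shot support set and a query set. The function names, sampling strategy, and shot count are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the paper's code) of simulating a few-shot emerging KG:
# sample entities from the original KG, treat them as unseen, and split their
# triplets into a K-shot support set and a query set.
import random
from collections import defaultdict

def build_meta_task(triplets, num_unseen=5, k_shot=3, seed=0):
    """triplets: list of (head, relation, tail) tuples from the original KG."""
    rng = random.Random(seed)
    entities = {e for h, _, t in triplets for e in (h, t)}
    unseen = set(rng.sample(sorted(entities), num_unseen))  # pretend these are new

    # Collect every triplet that touches a simulated unseen entity.
    by_entity = defaultdict(list)
    for h, r, t in triplets:
        if h in unseen or t in unseen:
            key = h if h in unseen else t
            by_entity[key].append((h, r, t))

    support, query = [], []
    for ent, trips in by_entity.items():
        rng.shuffle(trips)
        support.extend(trips[:k_shot])   # K triplets used to embed the entity
        query.extend(trips[k_shot:])     # remaining triplets to be scored
    return unseen, support, query

# Toy usage
kg = [("a", "r1", "b"), ("a", "r2", "c"), ("b", "r1", "c"),
      ("d", "r3", "a"), ("d", "r2", "b"), ("c", "r3", "d")]
unseen, support, query = build_meta_task(kg, num_unseen=1, k_shot=2)
print(unseen, support, query)
```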

  • Figure  1.   Example of emerging knowledge graph of NBA 2016 playoffs.

    Figure  2.   Overview of the proposed model MMGCN. Blue shades of different depths in the coarse-grained representations indicate entries with different values.

    Figure  3.   Model details of MGCN, where T and S denote the Tanh and Sigmoid activation functions, ⊕ denotes embedding concatenation, and × and + denote element-wise product and sum, respectively.

    Figure  4.   Overview of HFIM. (a) Constructing two abstract graphs. (b) Embedding lookup. (c) Aggregating features for entity e_i.

    Figure  5.   Hit@10 results of ablation studies on FB15k-237 and NELL-995. (a) TITM. (b) HFIM.

    Figure  6.   Robustness (MRR) studies of different shots. (a) FB15k-237. (b) NELL-995.

    Figure  7.   Learning efficiency (loss convergence) study on the first 1000 epochs. (a) FB15k-237. (b) NELL-995.

    Figure  8.   Training and testing time of MMGCN and the compared baselines.
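    The caption of Figure 3 mentions Tanh (T) and Sigmoid (S) activations combined through concatenation, element-wise product, and sum, which suggests a gated update of the memory cell. The PyTorch sketch below shows one plausible form of such a gated update; the dimensions, layer names, and exact gating structure are assumptions rather than the MGCN architecture itself.

```python
# Illustrative sketch only: a single gated update of the kind Figure 3 suggests,
# combining a memory cell with an aggregated neighbor message via Sigmoid (S)
# and Tanh (T) activations, concatenation, element-wise product (x), and sum (+).
import torch
import torch.nn as nn

class GatedMemoryUpdate(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)       # S: how much of the memory to keep
        self.candidate = nn.Linear(2 * dim, dim)  # T: candidate new content

    def forward(self, memory, message):
        # memory:  (batch, dim) stored path/semantic state
        # message: (batch, dim) aggregated neighbor/relation features
        z = torch.sigmoid(self.gate(torch.cat([memory, message], dim=-1)))
        c = torch.tanh(self.candidate(torch.cat([memory, message], dim=-1)))
        return z * memory + (1.0 - z) * c         # product and sum, as in Fig. 3

# Toy usage
layer = GatedMemoryUpdate(dim=8)
mem, msg = torch.randn(4, 8), torch.randn(4, 8)
print(layer(mem, msg).shape)  # torch.Size([4, 8])
```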

    Algorithm 1. Hyper-Relation Feature Modeling
    Input:
      original KG G
      hyper-relation features P
      feature aggregating network Ω
    Output: P, Ω
    1: Initialize P, Ω randomly;
    2: for each triplet (e_i, r_k, e_j) in G_pos do
    3:  Sample the negative triplets for (e_i, r_k, e_j);
    4:  Encode the entities and relations in both positive and negative triplets using Ω;
    5:  Calculate scores for the triplets using (5);
    6:  Minimize the loss L_h in (7);
    7: end for
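    Since the score function (5) and the loss L_h (7) referenced in Algorithm 1 are not reproduced on this page, the following PyTorch sketch only mirrors the shape of the loop: a DistMult-style score and a margin loss stand in as placeholders, and the feature aggregating network Ω is simplified to plain embedding tables.

```python
# Hedged sketch of one optimization step of the loop in Algorithm 1.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distmult_score(h, r, t):
    return (h * r * t).sum(dim=-1)                 # placeholder for Eq. (5)

def hyper_relation_step(ent_emb, rel_emb, pos, neg, optimizer, margin=1.0):
    # pos, neg: LongTensors of shape (batch, 3) holding (e_i, r_k, e_j) indices
    s_pos = distmult_score(ent_emb(pos[:, 0]), rel_emb(pos[:, 1]), ent_emb(pos[:, 2]))
    s_neg = distmult_score(ent_emb(neg[:, 0]), rel_emb(neg[:, 1]), ent_emb(neg[:, 2]))
    loss = F.relu(margin - s_pos + s_neg).mean()   # placeholder for L_h in Eq. (7)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage
ent_emb, rel_emb = nn.Embedding(10, 16), nn.Embedding(4, 16)
opt = torch.optim.Adam(list(ent_emb.parameters()) + list(rel_emb.parameters()), lr=1e-3)
pos = torch.randint(0, 10, (8, 3)); pos[:, 1] = torch.randint(0, 4, (8,))
neg = pos.clone(); neg[:, 2] = torch.randint(0, 10, (8,))  # corrupt the tail entity
print(hyper_relation_step(ent_emb, rel_emb, pos, neg, opt))
```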
    Algorithm 2. Meta Learning
    Input:
      meta task set T
      initialized memory cell M
      model MMGCN Φ
    Output: Φ
    1: Initialize Φ randomly;
    2: for each (S_i, Q_i) ∈ T do
    3:  Input the support set S_i into MMGCN to embed the unseen entities;
    4:  Sample the negative query set for Q_i;
    5:  Calculate scores for the triplets using (8);
    6:  Minimize the loss L_m in (9);
    7: end for
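    The episodic loop of Algorithm 2 can be sketched as follows. The real MMGCN embeds unseen entities from the support set with its memory graph convolutional network; here a tiny stand-in model averages support-neighbor embeddings instead, and a DistMult-style score with a log-sigmoid loss substitutes for Eq. (8) and the loss L_m in Eq. (9). Everything model-specific is therefore an assumption.

```python
# Hedged sketch of the episodic meta-training loop in Algorithm 2.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyInductiveModel(nn.Module):
    def __init__(self, n_seen, n_rel, dim=16):
        super().__init__()
        self.ent = nn.Embedding(n_seen, dim)
        self.rel = nn.Embedding(n_rel, dim)

    def embed_unseen(self, support):
        # support: (k, 2) LongTensor of (seen neighbor, relation) index pairs
        return (self.ent(support[:, 0]) + self.rel(support[:, 1])).mean(dim=0)

    def score(self, unseen_vec, r_idx, t_idx):
        return (unseen_vec * self.rel(r_idx) * self.ent(t_idx)).sum(dim=-1)

def meta_step(model, optimizer, tasks):
    for support, query_pos, query_neg in tasks:          # (S_i, Q_i) in T
        u = model.embed_unseen(support)                  # embed the unseen entity
        s_pos = model.score(u, query_pos[:, 0], query_pos[:, 1])
        s_neg = model.score(u, query_neg[:, 0], query_neg[:, 1])
        loss = -(F.logsigmoid(s_pos).mean() + F.logsigmoid(-s_neg).mean())
        optimizer.zero_grad(); loss.backward(); optimizer.step()

# Toy usage: one task with a 3-shot support set and 4 query triplets
model = TinyInductiveModel(n_seen=20, n_rel=5)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
support = torch.randint(0, 5, (3, 2))
q_pos = torch.stack([torch.randint(0, 5, (4,)), torch.randint(0, 20, (4,))], dim=1)
q_neg = q_pos.clone(); q_neg[:, 1] = torch.randint(0, 20, (4,))  # corrupt tails
meta_step(model, opt, [(support, q_pos, q_neg)])
print("one meta step done")
```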

    Table  1   Statistics of Datasets

                               FB15k-237                          NELL-995
                     |E'|    |E|     |R|    |G|          |E'|    |E|      |R|    |G|
    Emerging KG
      Meta-train     2500    7078    236    72065        1500    8211     199    22345
      Meta-valid     1000    2617    232    6246         600     1830     187    3676
      Meta-test      1500    3377    232    9867         899     2907     193    5852
    Original KG
      Train          /       9366    237    125324       /       57859    200    85373
      Valid          /       6827    236    15666        /       11874    200    10672
      Test           /       6876    237    15666        /       11923    198    10672
    Note: |E'|, |E|, |R|, and |G| denote the numbers of unseen entities, seen entities, relations, and triplets, respectively.

    Table  2   Main Results of 3-Shot Link Prediction for Emerging KGs on NELL-995 and FB15k-237

    Model                    FB15k-237                            NELL-995
                    MRR    Hit@1  Hit@3  Hit@10        MRR    Hit@1  Hit@3  Hit@10
    TransE[11]      0.129  0.061  0.140  0.246         0.115  0.059  0.136  0.230
    DistMult[12]    0.090  0.052  0.098  0.175         0.137  0.088  0.141  0.238
    RotatE[20]      0.110  0.062  0.129  0.197         0.115  0.066  0.130  0.209
    MEAN[14]        0.218  0.131  0.256  0.360         0.212  0.134  0.233  0.355
    GEN[37]         0.373  0.286  0.413  0.543         0.261  0.196  0.293  0.393
    GraIL[44]       0.155  0.106  0.170  0.207         0.218  0.172  0.235  0.264
    TACT[45]        0.181  0.132  0.195  0.218         0.207  0.169  0.209  0.234
    MorsE[47]       0.259  0.168  0.289  0.447         0.188  0.104  0.217  0.361
    HRFN[18]        0.376  0.293  0.414  0.550         0.273  0.190  0.305  0.407
    MMGCN           0.397  0.309  0.440  0.567         0.291  0.221  0.330  0.425
    Note: Bold numbers denote the best results, and underlined numbers denote the second-best results.

    Table  3   Real Emerging Case in NELL-995 in Meta-Testing Set

    Dataset       Entity       Associated Triplet
    Original KG   babe_ruth    /
    Emerging KG   yankees      (bernie_williams, athlete_plays_for_team, yankees)
                               (carl_pavano, athlete_plays_for_team, yankees)
                               (yankees, organization_hired_person, rodriguez)

    Table  4   Results of Unseen-to-Unseen Triplets on FB15k-237 and NELL-995

    Model           FB15k-237                            NELL-995
             MRR    Hit@1  Hit@3  Hit@10        MRR    Hit@1  Hit@3  Hit@10
    GEN      0.130  0.099  0.130  0.173         0.036  0.008  0.024  0.057
    MMGCN    0.202  0.166  0.197  0.253         0.125  0.101  0.133  0.152
  • [1]

    Bollacker K, Evans C, Paritosh P, Sturge T, Taylor J. Freebase: A collaboratively created graph database for structuring human knowledge. In Proc. the 2008 ACM SIGMOD International Conference on Management of Data, Jun. 2008, pp.1247–1250. DOI: 10.1145/1376616.1376746.

    [2]

    Carlson A, Betteridge J, Kisiel B, Settles B, Hruschka E, Mitchell T. Toward an architecture for never-ending language learning. In Proc. the 24th AAAI Conference on Artificial Intelligence, Jul. 2010, pp.1306–1313. DOI: 10.1609/AAAI.V24I1.7519.

    [3]

    Auer S, Bizer C, Kobilarov G, Lehmann J, Cyganiak R, Ives Z. DBpedia: A nucleus for a Web of open data. In Proc. the 6th International Semantic Web Conference, 2nd Asian Semantic Web Conference, Nov. 2007, pp.722–735. DOI: 10.1007/978-3-540-76298-0_52.

    [4]

    Wang Y X, Khan A, Wu T X, Jin J H, Yan H J. Semantic guided and response times bounded top-k similarity search over knowledge graphs. In Proc. the 36th IEEE International Conference on Data Engineering, Apr. 2020, pp.445–456. DOI: 10.1109/ICDE48307.2020.00045.

    [5]

    Yang Z X. Biomedical information retrieval incorporating knowledge graph for explainable precision medicine. In Proc. the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, Jul. 2020, p.2486. DOI: 10.1145/3397271.3401458.

    [6]

    Wong C M, Feng F, Zhang W, Vong C M, Chen H, Zhang Y C, He P, Chen H, Zhao K, Chen H J. Improving conversational recommender system by pretraining billion-scale knowledge graph. In Proc. the 37th IEEE International Conference on Data Engineering, Apr. 2021, pp.2607–2612. DOI: 10.1109/ICDE51399.2021.00291.

    [7]

    Deng Z Y, Li C Y, Liu S J, Ali W, Shao J. Knowledge-aware group representation learning for group recommendation. In Proc. the 37th IEEE International Conference on Data Engineering, Apr. 2021, pp.1571–1582. DOI: 10.1109/ICDE51399.2021.00139.

    [8]

    Hu S, Zou L, Yu J X, Wang H X, Zhao D Y. Answering natural language questions by subgraph matching over knowledge graphs (extended abstract). In Proc. the 34th IEEE International Conference on Data Engineering, Apr. 2018, pp.1815–1816. DOI: 10.1109/ICDE.2018.00265.

    [9]

    Kaiser M, Roy R S, Weikum G. Reinforcement learning from reformulations in conversational question answering over knowledge graphs. In Proc. the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, Jul. 2021, pp.459–469. DOI: 10.1145/3404835.3462859.

    [10]

    Nickel M, Murphy K, Tresp V, Gabrilovich E. A review of relational machine learning for knowledge graphs. Proceedings of the IEEE, 2016, 104(1): 11–33. DOI: 10.1109/JPROC.2015.2483592.

    [11]

    Bordes A, Usunier N, García-Durán A, Weston J, Yakhnenko O. Translating embeddings for modeling multi-relational data. In Proc. the 26th Annual Conference on Neural Information Processing Systems, Dec. 2013, pp.2787–2795.

    [12]

    Yang B S, Yih W T, He X D, Gao J F, Deng L. Embedding entities and relations for learning and inference in knowledge bases. In Proc. the 3rd International Conference on Learning Representations, May 2015.

    [13]

    Shi B X, Weninger T. Open-world knowledge graph completion. In Proc. the 32nd AAAI Conference on Artificial Intelligence, Feb. 2018, pp.1957–1964. DOI: 10.1609/aaai.v32i1.11535.

    [14]

    Hamaguchi T, Oiwa H, Shimbo M, Matsumoto Y. Knowledge transfer for out-of-knowledge-base entities: A graph neural network approach. In Proc. the 26th International Joint Conference on Artificial Intelligence, Aug. 2017, pp.1802–1808. DOI: 10.24963/ijcai.2017/250.

    [15]

    Wang P F, Han J L, Li C L, Pan R. Logic attention based neighborhood aggregation for inductive knowledge graph embedding. In Proc. the 33rd AAAI Conference on Artificial Intelligence, Jan. 27–Feb. 1, 2019, pp.7152–7159. DOI: 10.1609/aaai.v33i01.33017152.

    [16]

    Chen M Y, Zhang W, Zhang W, Chen Q, Chen H J. Meta relational learning for few-shot link prediction in knowledge graphs. In Proc. the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, Nov. 2019, pp.4217–4226. DOI: 10.18653/V1/D19-1431.

    [17]

    He Z L, Chen P F, Li X Y, Wang Y F, Yu G B, Chen C L, Li X R, Zheng Z B. A spatiotemporal deep learning approach for unsupervised anomaly detection in cloud systems. IEEE Trans. Neural Networks and Learning Systems, 2023, 34(4): 1705–1719. DOI: 10.1109/TNNLS.2020.3027736.

    [18]

    Zhang Y F, Wang W Q, Chen W, Xu J J, Liu A, Zhao L. Meta-learning based hyper-relation feature modeling for out-of-knowledge-base embedding. In Proc. the 30th ACM International Conference on Information and Knowledge Management, Oct. 2021, pp.2637–2646. DOI: 10.1145/3459637.3482367.

    [19]

    Wang Z, Zhang J W, Feng J L, Chen Z. Knowledge graph embedding by translating on hyperplanes. In Proc. the 28th AAAI Conference on Artificial Intelligence, Jul. 2014, pp.1112–1119. DOI: 10.1609/AAAI.V28I1.8870.

    [20]

    Sun Z Q, Deng Z H, Nie J Y, Tang J. RotatE: Knowledge graph embedding by relational rotation in complex space. In Proc. the 7th International Conference on Learning Representations, May 2019.

    [21]

    Zhang Z Q, Cai J Y, Zhang Y D, Wang J. Learning hierarchy-aware knowledge graph embeddings for link prediction. In Proc. the 34th AAAI Conference on Artificial Intelligence, Feb. 2020, pp.3065–3072. DOI: 10.1609/aaai.v34i03.5701.

    [22]

    Nickel M, Tresp V, Kriegel H P. A three-way model for collective learning on multi-relational data. In Proc. the 28th International Conference on Machine Learning, Jun. 28–Jul. 1, 2011, pp.809–816.

    [23]

    Dettmers T, Minervini P, Stenetorp P, Riedel S. Convolutional 2D knowledge graph embeddings. In Proc. the 32nd AAAI Conference on Artificial Intelligence, Feb. 2018, pp.1811–1818. DOI: 10.1609/AAAI.V32I1.11573.

    [24]

    Nguyen D Q, Nguyen T D, Nguyen D Q, Phung D. A novel embedding model for knowledge base completion based on convolutional neural network. In Proc. the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Jun. 2018, pp.327–333. DOI: 10.18653/v1/n18-2053.

    [25]

    Schlichtkrull M, Kipf T N, Bloem P, van den Berg R, Titov I, Welling M. Modeling relational data with graph convolutional networks. In Proc. the 15th International Conference on Semantic Web, Jun. 2018, pp.593–607. DOI: 10.1007/978-3-319-93417-4_38.

    [26]

    Shang C, Tang Y, Huang J, Bi J B, He X D, Zhou B W. End-to-end structure-aware convolutional networks for knowledge base completion. In Proc. the 33rd AAAI Conference on Artificial Intelligence, Jan. 27–Feb. 1, 2019, pp.3060–3067. DOI: 10.1609/AAAI.V33I01.33013060.

    [27]

    Xiong W H, Yu M, Chang S Y, Guo X X, Wang W Y. One-shot relational learning for knowledge graphs. In Proc. the 2018 Conference on Empirical Methods in Natural Language Processing, Oct. 31–Nov. 7, 2018, pp.1980–1990. DOI: 10.18653/V1/D18-1223.

    [28]

    Zhang C X, Yao H X, Huang C, Jiang M, Li Z H, Chawla N V. Few-shot knowledge graph completion. In Proc. the 34th AAAI Conference on Artificial Intelligence, Feb. 2020, pp.3041–3048. DOI: 10.1609/AAAI.V34I03.5698.

    [29]

    Muggleton S. Inductive logic programming. New Generation Computing, 1991, 8(4): 295–318. DOI: 10.1007/BF03037089.

    [30]

    Yang F, Yang Z L, Cohen W W. Differentiable learning of logical rules for knowledge base reasoning. In Proc. the 31st Annual Conference on Neural Information Processing Systems, Dec. 2017, pp.2316–2325.

    [31]

    Cohen W W. TensorLog: A differentiable deductive database. arXiv: 1605.06523, 2016. https://arxiv.org/abs/1605.06523, Sept. 2024.

    [32]

    Sadeghian A, Armandpour M, Ding P, Wang D Z. DRUM: End-to-end differentiable rule mining on knowledge graphs. In Proc. the 33rd Annual Conference on Neural Information Processing Systems, Dec. 2019, Article No. 1375.

    [33]

    Qu M, Chen J K, Xhonneux L P A C, Bengio Y, Tang J. RNNLogic: Learning logic rules for reasoning on knowledge graphs. In Proc. the 9th International Conference on Learning Representations, May 2021.

    [34]

    Cheng K W, Liu J H, Wang W, Sun Y Z. RLogic: Recursive logical rule learning from knowledge graphs. In Proc. the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Aug. 2022, pp.179–189. DOI: 10.1145/3534678.3539421.

    [35]

    Chen S Y, Fang H, Cai Y F, Huang X, Sun M M. Differentiable neuro-symbolic reasoning on large-scale knowledge graphs. In Proc. the 37th Annual Conference on Neural Information Processing Systems, Dec. 2023, Article No. 1222.

    [36]

    He Y Q, Wang Z H, Zhang P, Tu Z P, Ren Z C. VN network: Embedding newly emerging entities with virtual neighbors. In Proc. the 29th ACM International Conference on Information and Knowledge Management, Oct. 2020, pp.505–514. DOI: 10.1145/3340531.3411865.

    [37]

    Baek J, Lee D B, Hwang S J. Learning to extrapolate knowledge: Transductive few-shot out-of-graph link prediction. In Proc. the 34th Annual Conference on Neural Information Processing Systems, Dec. 2020, Article No. 47.

    [38]

    Wang H W, Ren H Y, Leskovec J. Relational message passing for knowledge graph completion. In Proc. the 27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Aug. 2021, pp.1697–1707. DOI: 10.1145/3447548.3467247.

    [39]

    Zhu Z C, Zhang Z B, Xhonneux L P, Tang J. Neural Bellman-Ford networks: A general graph neural network framework for link prediction. In Proc. the 35th Annual Conference on Neural Information Processing Systems, Dec. 2021, Article No. 2256.

    [40]

    Zhang Y Q, Yao Q M. Knowledge graph reasoning with relational digraph. In Proc. the 2022 ACM Web Conference, Apr. 2022, pp.912–924. DOI: 10.1145/3485447.3512008.

    [41]

    Wang C J, Zhou X F, Pan S R, Dong L H, Song Z L, Sha Y. Exploring relational semantics for inductive knowledge graph completion. In Proc. the 36th AAAI Conference on Artificial Intelligence, Feb. 22–Mar. 1, 2022, pp.4184–4192. DOI: 10.1609/AAAI.V36I4.20337.

    [42]

    Zhang Y Q, Zhou Z K, Yao Q M, Chu X W, Han B. AdaProp: Learning adaptive propagation for graph neural network based knowledge graph reasoning. In Proc. the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Aug. 2023, pp.3446–3457. DOI: 10.1145/3580305.3599404.

    [43]

    Lee J, Chung C, Whang J J. InGram: Inductive knowledge graph embedding via relation graphs. In Proc. the 40th International Conference on Machine Learning, Jul. 2023, pp.18796–18809.

    [44]

    Teru K K, Denis E G, Hamilton W L. Inductive relation prediction by subgraph reasoning. In Proc. the 37th International Conference on Machine Learning, Jul. 2020, pp.9448–9457.

    [45]

    Chen J J, He H R, Wu F, Wang J. Topology-aware correlations between relations for inductive link prediction in knowledge graphs. In Proc. the 35th AAAI Conference on Artificial Intelligence, Feb. 2021, pp.6271–6278. DOI: 10.1609/AAAI.V35I7.16779.

    [46]

    Liu S W, Grau B C, Horrocks I, Kostylev E V. INDIGO: GNN-based inductive knowledge graph completion using pair-wise encoding. In Proc. the 35th Annual Conference on Neural Information Processing Systems, Dec. 2021, Article No. 156.

    [47]

    Chen M Y, Zhang W, Zhu Y S, Zhou H T, Yuan Z G, Xu C L, Chen H J. Meta-knowledge transfer for inductive knowledge graph embedding. In Proc. the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval, Jul. 2022, pp.927–937. DOI: 10.1145/3477495.3531757.

    [48]

    Xu X H, Zhang P, He Y Q, Chao C P, Yan C Y. Subgraph neighboring relations infomax for inductive link prediction on knowledge graphs. In Proc. the 31st International Joint Conference on Artificial Intelligence, Jul. 2022, pp.2341–2347. DOI: 10.24963/IJCAI.2022/325.

    [49]

    Zhang Y F, Wang W Q, Yin H Z, Zhao P P, Chen W, Zhao L. Disconnected emerging knowledge graph oriented inductive link prediction. In Proc. the 39th IEEE International Conference on Data Engineering, Apr. 2023, pp.381–393. DOI: 10.1109/ICDE55515.2023.00036.

    [50]

    Geng Y X, Chen J Y, Pan J Z, Chen M Y, Jiang S, Zhang W, Chen H J. Relational message passing for fully inductive knowledge graph completion. In Proc. the 39th IEEE International Conference on Data Engineering, Apr. 2023, pp.1221–1233. DOI: 10.1109/ICDE55515.2023.00098.

    [51]

    Toutanova K, Chen D Q. Observed versus latent features for knowledge base and text inference. In Proc. the 3rd Workshop on Continuous Vector Space Models and their Compositionality, Jul. 2015, pp.57–66. DOI: 10.18653/V1/W15-4007.

    [52]

    Xiong W H, Hoang T, Wang W Y. DeepPath: A reinforcement learning method for knowledge graph reasoning. In Proc. the 2017 Conference on Empirical Methods in Natural Language Processing, Sept. 2017, pp.564–573. DOI: 10.18653/v1/d17-1060.

    [53]

    Bordes A, Weston J, Collobert R, Bengio Y. Learning structured embeddings of knowledge bases. In Proc. the 25th AAAI Conference on Artificial Intelligence, Aug. 2011, pp.301–306. DOI: 10.1609/AAAI.V25I1.7917.

Publication History
  • Received: 2022-11-05
  • Accepted: 2024-04-23
  • Published online: 2024-07-21
  • Issue date: 2024-10-30
