Journal of Computer Science and Technology
Journal of Computer Science and Technology 2018, Vol. 33 Issue (2) :323-334    DOI: 10.1007/s11390-018-1821-8
Artificial Intelligence and Pattern Recognition
Modeling the Correlations of Relations for Knowledge Graph Embedding
Ji-Zhao Zhu1,2, Yan-Tao Jia2, Member, CCF, ACM, Jun Xu2*, Member, CCF, ACM, IEEE, Jian-Zhong Qiao1*, Senior Member, CCF, Xue-Qi Cheng2, Fellow, CCF, Member, ACM, IEEE
1 College of Computer Science and Engineering, Northeastern University, Shenyang 110169, China;
2 Key Laboratory of Network Data Science and Technology, Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, China

Abstract
Knowledge graph embedding, which maps entities and relations into low-dimensional vector spaces, has demonstrated its effectiveness in many tasks such as link prediction and relation extraction. Typical methods include TransE, TransH, and TransR. All of these methods map different relations into the vector space separately, ignoring the intrinsic correlations among relations. Such correlations clearly exist, because different relations may connect to a common entity. For example, the triples (Steve Jobs, PlaceOfBirth, California) and (Apple Inc., Location, California) share the same tail entity, California. We analyze the embedded relation matrices learned by TransE/TransH/TransR and find that these correlations do exist and manifest as a low-rank structure in the embedded relation matrix. It is natural to ask whether we can leverage these correlations to learn better embeddings for the entities and relations in a knowledge graph. In this paper, we propose to learn the embedded relation matrix by decomposing it into a product of two low-dimensional matrices, thereby characterizing its low-rank structure. The proposed method, called TransCoRe (Translation-Based Method via Modeling the Correlations of Relations), learns the embeddings of entities and relations within a translation-based framework. Experimental results on the benchmark datasets WordNet and Freebase demonstrate that our method outperforms typical baselines on link prediction and triple classification tasks.
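The low-rank idea described above can be sketched in a few lines of NumPy. This is only an illustrative toy, not the paper's exact formulation: the dimensions, the random initialization, and the TransE-style scoring function are assumptions for demonstration. The key point is that every relation vector is built from a small number of shared basis rows, so the relation matrix has rank at most k and the relations are coupled through the shared factor.

```python
import numpy as np

rng = np.random.default_rng(0)

n_entities, n_relations = 50, 10   # toy sizes (hypothetical)
d, k = 8, 3                        # embedding dimension and low rank, k < d

# Entity embeddings: one d-dimensional vector per entity.
E = rng.normal(size=(n_entities, d))

# Low-rank factorization of the embedded relation matrix: R = U @ V.
# Every relation vector (row of R) is a combination of the same k basis
# rows in V, which is what ties the relations together.
U = rng.normal(size=(n_relations, k))
V = rng.normal(size=(k, d))
R = U @ V

def score(h, r, t):
    """Translation-style energy ||h + r - t||: lower = more plausible triple."""
    return np.linalg.norm(E[h] + R[r] - E[t])

# By construction the relation matrix has rank at most k.
assert np.linalg.matrix_rank(R) <= k
print(score(0, 1, 2))
```

In a full implementation the factors U and V would be trained jointly with the entity embeddings under a margin-based ranking loss, as in the translation-based family; here they are random only to show the structure of the parameterization.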
Keywords: knowledge graph embedding; low-rank; matrix decomposition
Received: 2017-01-20
Fund: This work was supported by the National Basic Research 973 Program of China under Grant No. 2014CB340405, the National Key Research and Development Program of China under Grant No. 2016YFB1000902, and the National Natural Science Foundation of China under Grant Nos. 61402442, 61272177, 61173008, 61232010, 61303244, 61572469, 91646120, and 61572473.

Corresponding Authors: Jun Xu, Jian-Zhong Qiao     Email: junxu@ict.ac.cn; qiaojianzhong@mail.neu.edu.cn
About the author: Ji-Zhao Zhu is a Ph.D. candidate in the College of Computer Science and Engineering, Northeastern University, Shenyang. His research interests include knowledge graphs, representation learning, and parallel computation.
Cite this article:   
Ji-Zhao Zhu, Yan-Tao Jia, Jun Xu, Jian-Zhong Qiao, Xue-Qi Cheng. Modeling the Correlations of Relations for Knowledge Graph Embedding[J]. Journal of Computer Science and Technology, 2018, 33(2): 323-334.
URL:  
http://jcst.ict.ac.cn:8080/jcst/EN/10.1007/s11390-018-1821-8
Copyright 2010 by Journal of Computer Science and Technology