Journal of Computer Science and Technology, 2018, Vol. 33, Issue 2: 323-334. DOI: 10.1007/s11390-018-1821-8

Special Issue: Artificial Intelligence and Pattern Recognition; Data Management and Data Mining

• Artificial Intelligence and Pattern Recognition •

Modeling the Correlations of Relations for Knowledge Graph Embedding

Ji-Zhao Zhu1,2, Yan-Tao Jia2, Member, CCF, ACM, Jun Xu2*, Member, CCF, ACM, IEEE, Jian-Zhong Qiao1*, Senior Member, CCF, Xue-Qi Cheng2, Fellow, CCF, Member, ACM, IEEE   

    1 College of Computer Science and Engineering, Northeastern University, Shenyang 110169, China;
    2 Key Laboratory of Network Data Science and Technology, Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, China
  • Received: 2017-01-20; Revised: 2017-08-24; Online: 2018-03-05; Published: 2018-03-05
  • Contact: Jun Xu, Jian-Zhong Qiao. E-mail: junxu@ict.ac.cn; qiaojianzhong@mail.neu.edu.cn
  • About author: Ji-Zhao Zhu is a Ph.D. candidate in the College of Computer Science and Engineering, Northeastern University, Shenyang. His research interests include knowledge graphs, representation learning, and parallel computation.
  • Supported by: This work was supported by the National Basic Research 973 Program of China under Grant No. 2014CB340405, the National Key Research and Development Program of China under Grant No. 2016YFB1000902, and the National Natural Science Foundation of China under Grant Nos. 61402442, 61272177, 61173008, 61232010, 61303244, 61572469, 91646120 and 61572473.

Knowledge graph embedding, which maps entities and relations into low-dimensional vector spaces, has demonstrated its effectiveness in many tasks such as link prediction and relation extraction. Typical methods include TransE, TransH, and TransR. All these methods embed each relation into the vector space separately, ignoring the intrinsic correlations among relations. Such correlations clearly exist, because different relations may connect to a common entity. For example, the triples (Steve Jobs, PlaceOfBirth, California) and (Apple Inc., Location, California) share the same tail entity, California. We analyze the embedded relation matrices learned by TransE/TransH/TransR and find that the correlations of relations do exist, manifesting as a low-rank structure of the embedded relation matrix. It is natural to ask whether we can leverage these correlations to learn better embeddings for the entities and relations in a knowledge graph. In this paper, we propose to characterize this low-rank structure by learning the embedded relation matrix as a product of two low-dimensional matrices. The proposed method, called TransCoRe (Translation-Based Method via Modeling the Correlations of Relations), learns the embeddings of entities and relations within a translation-based framework. Experimental results on the benchmark WordNet and Freebase datasets demonstrate that our method outperforms the typical baselines on link prediction and triple classification tasks.
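As a minimal illustrative sketch of the idea (not the authors' implementation; the toy sizes, variable names, and random initialization below are all assumptions), a TransE-style translation energy can be combined with a relation matrix factorized into two low-dimensional factors, which bounds the rank of the relation matrix by the shared latent dimension:

```python
import numpy as np

rng = np.random.default_rng(0)

n_ent, n_rel = 5, 3   # toy numbers of entities and relations
d, k = 8, 2           # embedding dimension d, latent rank k < n_rel

# Entity embeddings, as in translation-based models.
E = rng.normal(scale=0.1, size=(n_ent, d))

# Instead of learning each relation vector independently, the embedded
# relation matrix R (n_rel x d) is expressed as a product of two
# low-dimensional matrices A (n_rel x k) and B (k x d), so all relations
# share the k latent components in B, modeling their correlations.
A = rng.normal(scale=0.1, size=(n_rel, k))
B = rng.normal(scale=0.1, size=(k, d))
R = A @ B             # rank(R) <= k by construction

def score(h, r, t):
    """Translation-based energy ||h + r - t||_1; lower means more plausible."""
    return np.abs(E[h] + R[r] - E[t]).sum()
```

In training, A, B, and E would be optimized jointly under a margin-based ranking loss over corrupted triples, as is standard in the translation-based framework; the factorization simply constrains the relation matrix to the low-rank structure observed in learned embeddings.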

[1] Miller G A. WordNet: A lexical database for English. Communications of the ACM, 1995, 38(11):39-41.

[2] Bollacker K, Cook R, Tufts P. Freebase: A shared database of structured general human knowledge. In Proc. the 22nd National Conf. Artificial Intelligence, July 2007, pp.1962-1963.

[3] Bollacker K, Evans C, Paritosh P, Sturge T, Taylor J. Freebase: A collaboratively created graph database for structuring human knowledge. In Proc. ACM SIGMOD Int. Conf. Management of Data, June 2008, pp.1247-1250.

[4] Suchanek F M, Kasneci G, Weikum G. YAGO: A core of semantic knowledge unifying WordNet and Wikipedia. In Proc. the 16th Int. World Wide Web Conf., May 2007, pp.697-706.

[5] Tang J, Lou T C, Kleinberg J, Wu S. Transfer learning to infer social ties across heterogeneous networks. ACM Trans. Information Systems, 2016, 34(2):Article No. 7.

[6] Jia Y T, Wang Y Z, Lin H L, Jin X L, Cheng X Q. Locally adaptive translation for knowledge graph embedding. In Proc. the 30th AAAI Conf. Artificial Intelligence, February 2016, pp.992-998.

[7] Wu W T, Li H S, Wang H X, Zhu K Q. Probase: A probabilistic taxonomy for text understanding. In Proc. the ACM Int. Conf. Management of Data, May 2012, pp.481-492.

[8] Jayaram N, Khan A, Li C K, Yan X F, Elmasri R. Querying knowledge graphs by example entity tuples. IEEE Trans. Knowledge and Data Engineering, 2015, 27(10):2797-2811.

[9] Bordes A, Usunier N, Garcia-Durán A, Weston J, Yakhnenko O. Translating embeddings for modeling multi-relational data. In Proc. the 26th Int. Conf. Neural Information Processing Systems, December 2013, pp.2787-2795.

[10] Wang Z, Zhang J W, Feng J L, Chen Z. Knowledge graph embedding by translating on hyperplanes. In Proc. the 28th AAAI Conf. Artificial Intelligence, July 2014, pp.1112-1119.

[11] Lin Y K, Liu Z Y, Sun M S, Liu Y, Zhu X. Learning entity and relation embeddings for knowledge graph completion. In Proc. the 29th AAAI Conf. Artificial Intelligence, January 2015.

[12] Alter O, Brown P O, Botstein D. Singular value decomposition for genome-wide expression data processing and modeling. Proceedings of the National Academy of Sciences of the United States of America, 2000, 97(18):10101-10106.

[13] de Lathauwer L, de Moor B, Vandewalle J. A multilinear singular value decomposition. SIAM Journal on Matrix Analysis and Applications, 2000, 21(4):1253-1278.

[14] Mikolov T, Sutskever I, Chen K, Corrado G, Dean J. Distributed representations of words and phrases and their compositionality. In Proc. the 26th Int. Conf. Neural Information Processing Systems, December 2013, pp.3111-3119.

[15] Nickel M, Tresp V, Kriegel H P. Factorizing YAGO: Scalable machine learning for linked data. In Proc. the 21st Int. Conf. World Wide Web, April 2012, pp.271-280.

[16] Franz T, Schultz A, Sizov S, Staab S. TripleRank: Ranking semantic Web data by tensor decomposition. In Proc. the 8th Int. Semantic Web Conf., October 2009, pp.213-228.

[17] Chang K W, Yih W T, Yang B S, Meek C. Typed tensor decomposition of knowledge bases for relation extraction. In Proc. Conf. Empirical Methods in Natural Language Processing, October 2014, pp.1568-1579.

[18] Chang K W, Yih W T, Meek C. Multi-relational latent semantic analysis. In Proc. Conf. Empirical Methods in Natural Language Processing, October 2013, pp.1602-1612.

[19] Kiers H A L. Towards a standardized notation and terminology in multiway analysis. Journal of Chemometrics, 2000, 14(3):105-122.

[20] Bordes A, Glorot X, Weston J, Bengio Y. Joint learning of words and meaning representations for open-text semantic parsing. In Proc. the 15th Int. Conf. Artificial Intelligence and Statistics, April 2012, pp.127-135.

[21] Bordes A, Glorot X, Weston J, Bengio Y. A semantic matching energy function for learning with multi-relational data. Machine Learning, 2014, 94(2):233-259.

[22] Bordes A, Weston J, Collobert R, Bengio Y. Learning structured embeddings of knowledge bases. In Proc. the 25th Int. Conf. Artificial Intelligence, August 2011, pp.301-306.

[23] Jenatton R, Le Roux N, Bordes A, Obozinski G. A latent factor model for highly multi-relational data. In Proc. the 25th Int. Conf. Neural Information Processing Systems, December 2012, pp.3167-3175.

[24] Socher R, Chen D Q, Manning C D, Ng A. Reasoning with neural tensor networks for knowledge base completion. In Proc. the 26th Int. Conf. Neural Information Processing Systems, December 2013, pp.926-934.

[25] Pearson K. Note on regression and inheritance in the case of two parents. Proceedings of the Royal Society of London, 1895, 58(347/348/349/350/351/352):240-242.

[26] Ji G L, He S Z, Xu L H, Liu K, Zhao J. Knowledge graph embedding via dynamic mapping matrix. In Proc. the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th Int. Joint Conf. Natural Language Processing, July 2015, pp.687-696.

[27] Lin Y K, Liu Z Y, Luan H B, Sun M S, Rao S W, Liu S. Modeling relation paths for representation learning of knowledge bases. In Proc. Conf. Empirical Methods in Natural Language Processing, September 2015, pp.705-714.

ISSN 1000-9000 (Print), 1860-4749 (Online)
CN 11-2296/TP

Journal of Computer Science and Technology
Institute of Computing Technology, Chinese Academy of Sciences
P.O. Box 2704, Beijing 100190 P.R. China
Tel.: 86-10-62610746
E-mail: jcst@ict.ac.cn
 
Copyright © 2015 JCST, All Rights Reserved