
Modeling the Correlations of Relations for Knowledge Graph Embedding


    Abstract: Knowledge graph embedding, which maps entities and relations into low-dimensional vector spaces, has demonstrated its effectiveness in many tasks such as link prediction and relation extraction. Typical methods include TransE, TransH, and TransR. All of these methods map different relations into the vector space independently, ignoring the intrinsic correlations among relations. Such correlations clearly exist, because different relations may connect to a common entity. For example, the triples (Steve Jobs, PlaceOfBirth, California) and (Apple Inc., Location, California) share the same tail entity, California. We analyze the relation embedding matrices learned by TransE, TransH, and TransR on FB15K, and find that the correlations of relations do exist and manifest as a low-rank structure in the embedded relation matrix. It is natural to ask whether we can leverage these correlations to learn better embeddings for the entities and relations in a knowledge graph. In this paper, we propose to learn the embedded relation matrix by decomposing it into a product of two low-dimensional matrices, thereby explicitly characterizing the low-rank structure. The proposed method, called TransCoRe (Translation-Based Method via Modeling the Correlations of Relations), learns the embeddings of entities and relations within a translation-based framework. Experimental results on the benchmark datasets WordNet and Freebase demonstrate that our method outperforms typical baselines on link prediction and triple classification tasks.
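The core idea, factorizing the relation embedding matrix as a product of two low-dimensional matrices so that all relation vectors share a low-rank subspace, can be illustrated with a minimal numpy sketch. This is not the paper's implementation; the dimensions, random initialization, and TransE-style L2 score are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

num_entities, num_relations = 50, 20
d, k = 16, 4  # embedding dimension d; latent rank k < d (illustrative choice)

# Entity embeddings, one d-dimensional vector per entity (as in TransE).
E = rng.normal(size=(num_entities, d))

# Instead of learning a free relation matrix R of shape (num_relations, d),
# a TransCoRe-style factorization models R = U @ V, so every relation
# embedding lies in a shared k-dimensional subspace (rank of R is at most k).
U = rng.normal(size=(num_relations, k))
V = rng.normal(size=(k, d))
R = U @ V  # reconstructed relation embedding matrix

def score(h: int, r: int, t: int) -> float:
    """Translation-based plausibility ||h + r - t||_2; lower is better."""
    return float(np.linalg.norm(E[h] + R[r] - E[t]))

# The factorized relation matrix is low-rank by construction:
print(np.linalg.matrix_rank(R))  # -> 4
```

In training, U, V, and E would be optimized jointly with a margin-based ranking loss over corrupted triples, as in the translation-based framework the abstract describes; only the parameterization of R changes.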
