Fei Tian, Bin Gao, En-Hong Chen, Tie-Yan Liu. Learning Better Word Embedding by Asymmetric Low-Rank Projection of Knowledge Graph[J]. Journal of Computer Science and Technology, 2016, 31(3): 624-634. DOI: 10.1007/s11390-016-1651-5

Learning Better Word Embedding by Asymmetric Low-Rank Projection of Knowledge Graph

Word embedding, which refers to low-dimensional dense vector representations of natural words, has demonstrated its power in many natural language processing tasks. However, it may suffer from the inaccurate and incomplete information contained in the free-text corpus used as training data. To tackle this challenge, quite a few studies have leveraged knowledge graphs as an additional information source to improve the quality of word embedding. Although these studies have achieved some success, they neglect two important facts about knowledge graphs: 1) many relationships in knowledge graphs are many-to-one, one-to-many, or even many-to-many, rather than simply one-to-one; 2) most head entities and tail entities in knowledge graphs come from very different semantic spaces. To address these issues, in this paper we propose a new algorithm named ProjectNet. ProjectNet models the relationships between head and tail entities after transforming them with different low-rank projection matrices. The low-rank projection allows non-one-to-one relationships between entities, while the use of different projection matrices for head and tail entities allows them to originate in different semantic spaces. Experimental results demonstrate that ProjectNet yields more accurate word embedding than previous approaches, and thus leads to clear improvements in various natural language processing tasks.
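The sketch below illustrates the core idea stated in the abstract: head and tail entities are transformed with different low-rank projection matrices before a relationship score is computed. It is only a minimal illustration, not the authors' implementation; the translation-style scoring form, the variable names, the dimensions, and the factored parameterization of the low-rank matrices are all assumptions made for this example.

    import numpy as np

    d, k = 100, 20          # embedding dimension and projection rank (assumed values)
    rng = np.random.default_rng(0)

    # Each projection is factored as (d x k)(k x d), so its rank is at most k.
    A_h, B_h = rng.normal(size=(d, k)), rng.normal(size=(k, d))   # head-side projection
    A_t, B_t = rng.normal(size=(d, k)), rng.normal(size=(k, d))   # different tail-side projection

    def score(head_vec, rel_vec, tail_vec):
        """Translation-style score after asymmetric low-rank projections.

        Lower is better: the projected head plus the relation vector should
        land close to the projected tail.
        """
        proj_head = A_h @ (B_h @ head_vec)   # rank <= k projection of the head entity
        proj_tail = A_t @ (B_t @ tail_vec)   # a different rank <= k projection of the tail
        return np.linalg.norm(proj_head + rel_vec - proj_tail)

    # Toy usage: random vectors standing in for learned word and relation embeddings.
    h, r, t = rng.normal(size=d), rng.normal(size=d), rng.normal(size=d)
    print(score(h, r, t))

Using two separate projection pairs (A_h, B_h) and (A_t, B_t) is what makes the mapping asymmetric, while the rank-k factorization keeps the projections from being full-rank one-to-one maps.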
