2017, Vol. 32, Issue (4): 805-813. DOI: 10.1007/s11390-017-1761-8

Special Issue: Artificial Intelligence and Pattern Recognition

• Special Issue on Deep Learning •

Type-Aware Question Answering over Knowledge Base with Attention-Based Tree-Structured Neural Networks

Jun Yin1, Wayne Xin Zhao2,3, Member, CCF, Xiao-Ming Li1,*, Fellow, CCF   

  1 School of Electronic Engineering and Computer Science, Peking University, Beijing 100871, China;
    2 School of Information, Renmin University of China, Beijing 100872, China;
    3 Guangdong Key Laboratory of Big Data Analysis and Processing, Guangzhou 510006, China
  • Received: 2016-12-20; Revised: 2017-05-09; Online: 2017-07-05; Published: 2017-07-05
  • Contact: Xiao-Ming Li, E-mail: lxm@pku.edu.cn
  • Supported by:

    This work has been partially supported by the National Basic Research 973 Program of China under Grant No. 2014CB340405 and the National Natural Science Foundation of China under Grant No. U1536201. Wayne Xin Zhao was partially supported by Beijing Natural Science Foundation under Grant No. 4162032, and the Opening Project of Guangdong Province Key Laboratory of Big Data Analysis and Processing under Grant No. 2017001.

Question answering (QA) over a knowledge base (KB) aims to answer a natural language question with a structured answer drawn from the knowledge base. A key step in this task is representing and understanding the natural language query. In this paper, we propose to model natural language queries with tree-structured neural networks built on the constituency tree. We identify an interesting property of the constituency tree: different constituents have their own semantic characteristics and may be suited to different subtasks in a QA system. Based on this observation, we incorporate type information as an auxiliary supervision signal to improve QA performance; we call our approach type-aware QA. We jointly characterize both the answer and its answer type in a unified neural network model with an attention mechanism. Instead of simply using the root representation, we represent the query by combining the representations of different constituents using task-specific attention weights. Extensive experiments on public datasets demonstrate the effectiveness of the proposed model. More specifically, the learned attention weights are quite useful for understanding the query, and the representations produced for intermediate nodes can be used to analyze the effectiveness of components in a QA system.
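The core idea of the abstract — recursively composing constituent representations bottom-up over a constituency tree, then combining all constituents with task-specific attention weights rather than using only the root — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the composition function, the dot-product attention scoring, the toy parse of "who wrote Hamlet", and all parameter names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative)

def compose(left, right, W, b):
    # Composition at an internal constituency-tree node:
    # parent = tanh(W [left; right] + b), applied recursively bottom-up.
    return np.tanh(W @ np.concatenate([left, right]) + b)

def attend(nodes, q):
    # Task-specific attention over all constituent representations:
    # softmax of dot-product scores against a task query vector q,
    # then an attention-weighted sum instead of the root alone.
    scores = np.array([q @ h for h in nodes])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights, sum(w * h for w, h in zip(weights, nodes))

# Toy tree for "who wrote Hamlet": (who (wrote Hamlet))
W = rng.normal(scale=0.1, size=(d, 2 * d))
b = np.zeros(d)
who, wrote, hamlet = (rng.normal(size=d) for _ in range(3))
vp = compose(wrote, hamlet, W, b)   # verb-phrase constituent
root = compose(who, vp, W, b)       # sentence-level constituent

nodes = [who, wrote, hamlet, vp, root]
# The two jointly trained subtasks share the tree but use
# separate attention queries, yielding different query representations:
q_answer, q_type = rng.normal(size=d), rng.normal(size=d)
w_ans, rep_ans = attend(nodes, q_answer)  # for answer matching
w_typ, rep_typ = attend(nodes, q_type)    # for answer-type prediction
```

Inspecting `w_ans` and `w_typ` corresponds to the abstract's point that the learned attention weights reveal which constituents each subtask relies on.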

[1] Yao X C, van Durme B. Information extraction over structured data: Question answering with Freebase. In Proc. the 52nd Annual Meeting of the Association for Computational Linguistics, June 2014, pp.956-966.

[2] Yao X C. Lean question answering over Freebase from scratch. In Proc. North American Chapter of the Association for Computational Linguistics, May 31-June 5, 2015, pp.66-70.

[3] Berant J, Chou A, Frostig R, Liang P. Semantic parsing on Freebase from question-answer pairs. In Proc. the Conf. Empirical Methods in Natural Language Processing, Volume 2, Oct. 2013.

[4] Fader A, Zettlemoyer L, Etzioni O. Paraphrase-driven learning for open question answering. In Proc. the 51st Annual Meeting of the Association for Computational Linguistics, Aug. 2013, pp.1608-1618.

[5] Berant J, Liang P. Semantic parsing via paraphrasing. In Proc. the 52nd Annual Meeting of the Association for Computational Linguistics, June 2014, pp.1415-1425.

[6] Yih W T, He X D, Meek C. Semantic parsing for single-relation question answering. In Proc. the 52nd Annual Meeting of the Association for Computational Linguistics, June 2014, pp.643-648.

[7] Yih W T, Chang M W, He X D, Gao J F. Semantic parsing via staged query graph generation: Question answering with knowledge base. In Proc. the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th Int. Joint Conf. Natural Language Processing, July 2015, pp.1321-1331.

[8] Bordes A, Chopra S, Weston J. Question answering with subgraph embeddings. arXiv: 1406.3676, 2014. https://arxiv.org/abs/1406.3676, May 2017.

[9] Bordes A, Weston J, Usunier N. Open question answering with weakly supervised embedding models. In Joint European Conf. Machine Learning and Knowledge Discovery in Databases, Calders T, Esposito F, Hüllermeier E, Meo R (eds.), Springer, 2014, pp.165-180.

[10] Bordes A, Usunier N, Chopra S, Weston J. Large-scale simple question answering with memory networks. arXiv: 1506.02075, 2015. https://arxiv.org/abs/1506.02075, May 2017.

[11] Zhang Y Z, Liu K, He S Z, Ji G L, Liu Z Y, Wu H, Zhao J. Question answering over knowledge base with neural attention combining global knowledge information. arXiv: 1606.00979, 2016. http://arxiv.org/abs/1606.00979, May 2017.

[12] Golub D, He X D. Character-level question answering with attention. arXiv: 1604.00727, 2016. https://arxiv.org/abs/1604.00727, May 2017.

[13] Iyyer M, Boyd-Graber J, Claudino L, Socher R, Daumé III H. A neural network for factoid question answering over paragraphs. In Proc. the 2014 Conf. Empirical Methods in Natural Language Processing, Oct. 2014, pp.633-644.

[14] Mou L L, Peng H, Li G, Xu Y, Zhang L, Jin Z. Discriminative neural sentence modeling by tree-based convolution. arXiv: 1504.01106, 2015. https://arxiv.org/abs/1504.01106, May 2017.

[15] Tai K S, Socher R, Manning C D. Improved semantic representations from tree-structured long short-term memory networks. arXiv: 1503.00075, 2015. https://arxiv.org/abs/1503.00075, May 2017.

[16] Bast H, Haussmann E. More accurate question answering on Freebase. In Proc. the 24th ACM Int. Conf. Information and Knowledge Management, Oct. 2015, pp.1431-1440.

[17] Weston J, Chopra S, Bordes A. Memory networks. In Proc. Int. Conf. Learning Representations (ICLR), May 2015.

[18] Sukhbaatar S, Szlam A, Weston J, Fergus R. End-to-end memory networks. In Proc. Advances in Neural Information Processing Systems, Nov. 2015, pp.2431-2439.

[19] Hu B T, Lu Z D, Li H, Chen Q C. Convolutional neural network architectures for matching natural language sentences. In Proc. the 27th Int. Conf. Neural Information Processing Systems, Dec. 2014, pp.2042-2050.

[20] Dong L, Wei F R, Zhou M, Xu K. Question answering over Freebase with multi-column convolutional neural networks. In Proc. the 53rd Annual Meeting of the Association for Computational Linguistics, July 2015, pp.260-269.

[21] Yin W P, Yu M, Xiang B, Zhou B W, Schütze H. Simple question answering by attentive convolutional neural network. arXiv: 1606.03391, 2016. https://arxiv.org/abs/1606.03391, May 2017.

[22] Dai Z H, Li L, Xu W. CFO: Conditional focused neural question answering with large-scale knowledge bases. arXiv: 1606.01994, 2016. https://arxiv.org/abs/1606.01994, May 2017.

[23] Socher R, Huang E H, Pennin J, Manning C D, Ng A Y. Dynamic pooling and unfolding recursive autoencoders for paraphrase detection. In Proc. Advances in Neural Information Processing Systems, Dec. 2011, pp.801-809.

[24] Socher R, Perelygin A, Wu J Y, Chuang J, Manning C D, Ng A Y, Potts C. Recursive deep models for semantic compositionality over a sentiment treebank. In Proc. the Conf. Empirical Methods in Natural Language Processing, Oct. 2013, pp.1631-1642.

[25] Bollacker K, Evans C, Paritosh P, Sturge T, Taylor J. Freebase: A collaboratively created graph database for structuring human knowledge. In Proc. ACM SIGMOD Int. Conf. Management of Data, June 2008, pp.1247-1250.

[26] Bordes A, Usunier N, Garcia-Durán A, Weston J, Yakhnenko O. Translating embeddings for modeling multi-relational data. In Proc. Advances in Neural Information Processing Systems, Dec. 2013, pp.2787-2795.
ISSN 1000-9000 (Print), 1860-4749 (Online)
CN 11-2296/TP

Journal of Computer Science and Technology
Institute of Computing Technology, Chinese Academy of Sciences
P.O. Box 2704, Beijing 100190 P.R. China
Tel.:86-10-62610746
E-mail: jcst@ict.ac.cn