Type-Aware Question Answering over Knowledge Base with Attention-Based Tree-Structured Neural Networks
Journal of Computer Science and Technology
Indexed by   SCIE, EI ...
Bimonthly    Since 1986
Journal of Computer Science and Technology 2017, Vol. 32 Issue (4) :805-813    DOI: 10.1007/s11390-017-1761-8
Special Issue on Deep Learning
Type-Aware Question Answering over Knowledge Base with Attention-Based Tree-Structured Neural Networks
Jun Yin1, Wayne Xin Zhao2,3, Member, CCF, Xiao-Ming Li1,*, Fellow, CCF
1 School of Electronic Engineering and Computer Science, Peking University, Beijing 100871, China;
2 School of Information, Renmin University of China, Beijing 100872, China;
3 Guangdong Key Laboratory of Big Data Analysis and Processing, Guangzhou 510006, China

Abstract
Question answering (QA) over knowledge base (KB) aims to provide a structured answer from a knowledge base to a natural language question. A key step in this task is to represent and understand the natural language query. In this paper, we propose to model natural language queries with tree-structured neural networks constructed from the constituency tree. We identify an interesting observation about the constituency tree: different constituents have their own semantic characteristics and may be suited to different subtasks in a QA system. Building on this observation, we incorporate type information as an auxiliary supervision signal to improve QA performance. We call our approach type-aware QA. We jointly characterize both the answer and its answer type in a unified neural network model with an attention mechanism. Instead of simply using the root representation, we represent the query by combining the representations of different constituents using task-specific attention weights. Extensive experiments on public datasets have demonstrated the effectiveness of our proposed model. More specifically, the learned attention weights are quite useful for understanding the query, and the representations produced for intermediate nodes can be used to analyze the effectiveness of components in a QA system.
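The task-specific attention over constituents described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name `attention_pool`, the vectors `w_answer` and `w_type`, and the use of NumPy are all assumptions made for the example; in the actual model the node representations would come from a tree-structured network over the constituency tree, and the attention vectors would be learned jointly with the answer and answer-type objectives.

```python
import numpy as np

def attention_pool(node_states, task_vector):
    """Combine constituent representations with task-specific attention.

    node_states: (n_nodes, d) matrix of tree-node representations
        (e.g., one row per constituent in the constituency tree).
    task_vector: (d,) vector for one subtask (answer prediction vs.
        answer-type prediction), used to score each constituent.
    Returns the softmax attention weights and the pooled query vector.
    """
    scores = node_states @ task_vector               # (n_nodes,) raw scores
    scores = scores - scores.max()                   # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over nodes
    return weights, weights @ node_states            # (d,) query representation

# Hypothetical example: 4 constituents with 3-dimensional representations.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
w_answer = rng.normal(size=3)  # attention vector for the answer subtask
w_type = rng.normal(size=3)    # attention vector for the type subtask

a_weights, q_answer = attention_pool(H, w_answer)
t_weights, q_type = attention_pool(H, w_type)
```

Because each subtask has its own attention vector, the answer-prediction and type-prediction components can focus on different constituents of the same query, which is the intuition behind combining constituent representations rather than using only the root.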
Keywords: question answering; deep neural network; knowledge base
Received: 2016-12-20
Fund:

This work has been partially supported by the National Basic Research 973 Program of China under Grant No. 2014CB340405 and the National Natural Science Foundation of China under Grant No. U1536201. Wayne Xin Zhao was partially supported by Beijing Natural Science Foundation under Grant No. 4162032, and the Opening Project of Guangdong Province Key Laboratory of Big Data Analysis and Processing under Grant No. 2017001.

Corresponding Author: Xiao-Ming Li     E-mail: lxm@pku.edu.cn
Cite this article:   
Jun Yin, Wayne Xin Zhao, Xiao-Ming Li. Type-Aware Question Answering over Knowledge Base with Attention-Based Tree-Structured Neural Networks[J]. Journal of Computer Science and Technology, 2017, 32(4): 805-813
URL:  
http://jcst.ict.ac.cn:8080/jcst/EN/10.1007/s11390-017-1761-8
Copyright 2010 by Journal of Computer Science and Technology