

Type-Aware Question Answering over Knowledge Base with Attention-Based Tree-Structured Neural Networks


     

Abstract: Question answering (QA) over knowledge base (KB) aims to provide a structured answer from a knowledge base to a natural language question. A key step in this task is representing and understanding the natural language query. In this paper, we propose to model natural language queries with tree-structured neural networks built on the constituency tree. We identify an interesting observation about the constituency tree: different constituents have their own semantic characteristics and may be suited to different subtasks in a QA system. Based on this observation, we incorporate answer type information as an auxiliary supervision signal to improve QA performance; we call our approach type-aware QA. We jointly characterize both the answer and its answer type in a unified neural network model with an attention mechanism. Instead of simply using the root representation, we represent the query by combining the representations of different constituents with task-specific attention weights. Extensive experiments on public datasets demonstrate the effectiveness of the proposed model. More specifically, the learned attention weights are quite useful for understanding the query, and the representations produced for intermediate nodes can be used to analyze the effectiveness of components in a QA system.
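
To make the attention-based combination of constituent representations concrete, below is a minimal sketch of task-specific attention pooling over constituency-tree node vectors. This is not the authors' implementation: the module name, dimensions, and the use of PyTorch are illustrative assumptions; only the idea of per-subtask attention weights over constituent representations (rather than using the root vector alone) comes from the abstract.

```python
# Minimal sketch (not the paper's code): task-specific attention pooling over
# constituent-node representations produced by some tree-structured encoder.
# All names, dimensions, and the choice of PyTorch are illustrative assumptions.
import torch
import torch.nn as nn


class TaskSpecificAttentionPooling(nn.Module):
    """Combine constituent representations with per-task attention weights."""

    def __init__(self, hidden_dim: int, num_tasks: int = 2):
        super().__init__()
        # One learnable query vector per subtask (e.g. answer vs. answer type).
        self.task_queries = nn.Parameter(torch.randn(num_tasks, hidden_dim))
        self.proj = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, node_reprs: torch.Tensor) -> torch.Tensor:
        # node_reprs: (num_nodes, hidden_dim), one vector per constituency-tree node.
        keys = torch.tanh(self.proj(node_reprs))      # (num_nodes, hidden_dim)
        scores = self.task_queries @ keys.t()         # (num_tasks, num_nodes)
        weights = torch.softmax(scores, dim=-1)       # attention weights per task
        # Each subtask gets its own query representation instead of the root vector.
        return weights @ node_reprs                   # (num_tasks, hidden_dim)


# Usage: suppose a tree encoder produced 7 constituent vectors of size 128.
pool = TaskSpecificAttentionPooling(hidden_dim=128, num_tasks=2)
query_reprs = pool(torch.randn(7, 128))  # row 0: answer subtask, row 1: type subtask
```

In this sketch, row 0 of the output would correspond to the answer-prediction subtask and row 1 to the answer-type subtask; the softmax weights play the role of the learned attention scores that the abstract describes as useful for understanding the query.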

     
