

A Tensor Neural Network with Layerwise Pretraining: Towards Effective Answer Retrieval


Abstract: In this paper we address the answer retrieval problem in community-based question answering. To fully capture the interactions between question-answer pairs, we propose an original tensor neural network to model the relevance between them. The question and candidate answers are separately embedded into different latent semantic spaces, and a 3-way tensor is then utilized to model the interactions between the latent semantics. To initialize the network layers properly, we propose a novel algorithm called the denoising tensor autoencoder (DTAE), and then implement a layerwise pretraining strategy that uses denoising autoencoders (DAE) on the word embedding layers and DTAE on the tensor layer. The experimental results show that our tensor neural network outperforms competitive baselines, including other neural network methods, and that the DTAE pretraining strategy improves the system's performance and robustness.
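To make the tensor scoring idea in the abstract concrete, the sketch below shows one common way a 3-way tensor can relate a question embedding and an answer embedding: every pair of latent dimensions interacts through a tensor slice, alongside a standard feed-forward term. This is a minimal, assumption-laden sketch rather than the paper's implementation; the dimensions, the tanh nonlinearity, the extra linear term, and names such as T, V, u, and relevance are all illustrative choices, and random vectors stand in for the learned embeddings.

```python
import numpy as np

# Minimal sketch of a 3-way tensor relevance layer in the spirit of the
# abstract: a question embedding q and an answer embedding a, each from its
# own latent space, interact through a tensor T so that every pair of latent
# dimensions contributes to the relevance score. All sizes, names, and the
# extra linear/tanh terms below are illustrative assumptions, not the
# authors' implementation.

rng = np.random.default_rng(0)

d_q, d_a, k = 50, 50, 4                          # assumed latent sizes / tensor slices
T = rng.normal(scale=0.1, size=(k, d_q, d_a))    # 3-way interaction tensor
V = rng.normal(scale=0.1, size=(k, d_q + d_a))   # ordinary feed-forward weights
b = np.zeros(k)
u = rng.normal(scale=0.1, size=k)                # final scoring vector


def relevance(q: np.ndarray, a: np.ndarray) -> float:
    """Score a question/answer pair via bilinear tensor interactions."""
    bilinear = np.einsum("i,kij,j->k", q, T, a)   # q^T T_k a for each slice k
    linear = V @ np.concatenate([q, a]) + b       # standard layer on [q; a]
    hidden = np.tanh(bilinear + linear)
    return float(u @ hidden)


# Stand-ins for the learned question/answer embeddings.
q_vec = rng.normal(size=d_q)
a_vec = rng.normal(size=d_a)
print(relevance(q_vec, a_vec))
```

In the pretraining scheme the abstract describes, the word embedding layers would first be initialized with denoising autoencoders and the tensor layer with the proposed DTAE before supervised training; the sketch above covers only the forward scoring pass.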
