Recurrent Neural Network Encoder with Attention for Community Question Answering
Authors
Abstract
We apply a general recurrent neural network (RNN) encoder framework to community question answering (cQA) tasks. Our approach does not rely on any linguistic processing and can be applied to different languages or domains. Further improvements are observed when we extend the RNN encoders with a neural attention mechanism that encourages reasoning over entire sequences. To deal with practical issues such as data sparsity and imbalanced labels, we apply techniques including transfer learning and multitask learning. Our experiments on the SemEval-2016 cQA task show a 10% improvement in MAP score over an information retrieval-based approach and achieve performance comparable to a strong method based on handcrafted features.
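To make the encoder-with-attention idea in the abstract concrete, below is a minimal PyTorch sketch of an RNN encoder that scores question-comment pairs with an attention mechanism over the question. It is illustrative only, not the authors' released model: the module layout, hyper-parameters, and the binary relevant / not-relevant output are all assumptions made here for brevity.

```python
# Minimal sketch (not the paper's code) of an RNN encoder with attention for
# scoring cQA question-comment pairs. Sizes and the 2-class output are assumed.
import torch
import torch.nn as nn


class AttentiveRNNMatcher(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # A shared GRU encoder applied to both the question and the comment.
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)        # additive attention score
        self.classifier = nn.Linear(2 * hidden_dim, 2)  # relevant / not relevant

    def forward(self, question_ids, comment_ids):
        q_states, _ = self.encoder(self.embed(question_ids))    # (B, Tq, H)
        _, c_last = self.encoder(self.embed(comment_ids))
        c_vec = c_last[-1]                                       # (B, H)

        # Attend over every question position, conditioned on the comment,
        # so the score can depend on the entire question sequence.
        c_exp = c_vec.unsqueeze(1).expand_as(q_states)           # (B, Tq, H)
        scores = self.attn(torch.cat([q_states, c_exp], dim=-1)) # (B, Tq, 1)
        weights = torch.softmax(scores, dim=1)
        q_context = (weights * q_states).sum(dim=1)              # (B, H)

        return self.classifier(torch.cat([q_context, c_vec], dim=-1))


# Toy usage with random token ids.
model = AttentiveRNNMatcher(vocab_size=1000)
q = torch.randint(1, 1000, (4, 12))   # batch of 4 questions, 12 tokens each
c = torch.randint(1, 1000, (4, 30))   # 4 candidate comments, 30 tokens each
logits = model(q, c)                  # (4, 2) relevance logits
```

Ranking candidates by the "relevant" logit would then yield the ordering evaluated by MAP, under the simplifying assumptions stated above.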
Similar resources
Decoding Coattention Encodings for Question Answering
An encoder-decoder architecture with recurrent neural networks in both the encoder and decoder is a standard approach to the question-answering problem (finding answers to a given question in a piece of text). The Dynamic Coattention [1] encoder is a highly effective encoder for the problem; we evaluated the effectiveness of different decoders when paired with the Dynamic Coattention encoder. We ...
Fast and Accurate Reading Comprehension by Combining Self-Attention and Convolution
Current end-to-end machine reading and question answering (Q&A) models are primarily based on recurrent neural networks (RNNs) with attention. Despite their success, these models are often slow for both training and inference due to the sequential nature of RNNs. We propose a new Q&A architecture that does not require recurrent networks: Its encoder consists exclusively of convolution and self-...
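As a rough illustration of the recurrence-free encoder idea summarized in this snippet, the sketch below stacks a depthwise-separable convolution with multi-head self-attention. The block layout, layer sizes, and kernel width are assumptions for illustration, not that paper's specification.

```python
# Illustrative encoder block combining convolution and self-attention,
# with no recurrence. Dimensions and layout are assumed, not the paper's.
import torch
import torch.nn as nn


class ConvSelfAttentionBlock(nn.Module):
    def __init__(self, dim=128, kernel_size=7, heads=8):
        super().__init__()
        # Depthwise-separable 1-D convolution over the sequence dimension.
        self.depthwise = nn.Conv1d(dim, dim, kernel_size,
                                   padding=kernel_size // 2, groups=dim)
        self.pointwise = nn.Conv1d(dim, dim, kernel_size=1)
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, x):                      # x: (batch, seq_len, dim)
        # Convolution sub-layer with a residual connection.
        conv = self.pointwise(self.depthwise(x.transpose(1, 2))).transpose(1, 2)
        x = self.norm1(x + conv)
        # Self-attention sub-layer with a residual connection.
        attended, _ = self.self_attn(x, x, x)
        return self.norm2(x + attended)


block = ConvSelfAttentionBlock()
tokens = torch.randn(2, 50, 128)               # two sequences of 50 positions
encoded = block(tokens)                        # same shape, no recurrence used
```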
Question Answering System using Dynamic Coattention Networks
We tackle the difficult problem of building a question answering system by building an end-to-end recurrent neural network sequence-to-sequence model. We use the coattention encoder and explore three different decoders: linear, single-layer maxout, and highway maxout network. We train and evaluate our model using the recently published Stanford Question Answering Dataset (SQuAD). Our ...
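For readers unfamiliar with the maxout units mentioned in the decoders above, here is a small sketch of the core idea: each output unit takes the maximum over k linear projections of the input. The pool size and dimensions are illustrative assumptions.

```python
# Sketch of a maxout layer: max over k linear projections per output unit.
import torch
import torch.nn as nn


class Maxout(nn.Module):
    def __init__(self, in_dim, out_dim, pool_size=4):
        super().__init__()
        self.out_dim, self.pool_size = out_dim, pool_size
        self.linear = nn.Linear(in_dim, out_dim * pool_size)

    def forward(self, x):
        projected = self.linear(x)                                # (..., out*k)
        projected = projected.view(*x.shape[:-1], self.out_dim, self.pool_size)
        return projected.max(dim=-1).values                       # max over pool


layer = Maxout(in_dim=200, out_dim=100)
scores = layer(torch.randn(8, 200))                               # (8, 100)
```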
Learning to Rank Question-Answer Pairs using Hierarchical Recurrent Encoder with Latent Topic Clustering
In this paper, we propose a novel end-to-end neural architecture for ranking candidate answers that adapts a hierarchical recurrent neural network and a latent topic clustering module. With our proposed model, a text is encoded to a vector representation from a word-level to a chunk-level to effectively capture the entire meaning. In particular, by adapting the hierarchical structure, our mode...
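A minimal sketch of the hierarchical encoding idea in this snippet is given below: a word-level GRU summarizes each chunk, and a chunk-level GRU summarizes the sequence of chunk vectors. The chunking scheme and sizes are assumed, and the latent topic clustering module is omitted for brevity.

```python
# Sketch of a word-level -> chunk-level hierarchical recurrent encoder.
# Chunking and dimensions are assumptions; topic clustering is omitted.
import torch
import torch.nn as nn


class HierarchicalEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.word_rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.chunk_rnn = nn.GRU(hidden_dim, hidden_dim, batch_first=True)

    def forward(self, token_ids):              # (batch, n_chunks, chunk_len)
        batch, n_chunks, chunk_len = token_ids.shape
        words = self.embed(token_ids.view(batch * n_chunks, chunk_len))
        _, word_h = self.word_rnn(words)       # (1, batch*n_chunks, hidden)
        chunk_vecs = word_h[-1].view(batch, n_chunks, -1)
        _, chunk_h = self.chunk_rnn(chunk_vecs)
        return chunk_h[-1]                     # one vector per input text


encoder = HierarchicalEncoder(vocab_size=1000)
text = torch.randint(1, 1000, (4, 5, 10))      # 4 texts, 5 chunks of 10 tokens
vectors = encoder(text)                        # (4, 128) text representations
```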
Journal: CoRR
Volume: abs/1603.07044
Pages: -
Publication year: 2016