Search results for: natural network

Number of results: 1,128,616

2016
Jian Fu, Xipeng Qiu, Xuanjing Huang

Document-based question answering aims to compute the similarity or relevance between two texts: a question and an answer. It is a typical, core task, often considered a touchstone of natural language understanding. In this article, we present a convolutional neural network-based architecture that learns feature representations of each question-answer pair and computes its match score. By taking the i...
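A minimal sketch of this kind of CNN-based matcher (illustrative only; the vocabulary size, filter settings, and cosine scorer are assumptions, not the authors' exact architecture): a shared 1-D convolutional encoder turns each text into a fixed-length vector, and cosine similarity serves as the match score.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CNNMatcher(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=100, n_filters=128, kernel=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel, padding=kernel // 2)

    def encode(self, token_ids):
        # token_ids: (batch, seq_len) -> fixed-length vector (batch, n_filters)
        x = self.embed(token_ids).transpose(1, 2)   # (batch, emb_dim, seq_len)
        h = torch.relu(self.conv(x))                # (batch, n_filters, seq_len)
        return h.max(dim=2).values                  # max-pool over time

    def forward(self, question_ids, answer_ids):
        # one similarity score per question-answer pair
        return F.cosine_similarity(self.encode(question_ids),
                                    self.encode(answer_ids), dim=1)

model = CNNMatcher()
q = torch.randint(0, 10000, (2, 12))   # toy question token ids
a = torch.randint(0, 10000, (2, 40))   # toy answer token ids
print(model(q, a))                     # two match scores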

2016
Hao Zhou, Minlie Huang, Xiaoyan Zhu

Natural language generation (NLG) is an important component of question answering (QA) systems and has a significant impact on system quality. Most traditional QA systems based on templates or rules tend to generate rigid and stylised responses without the natural variation of human language. Furthermore, such methods require a substantial amount of work to produce the templates or rules. To address this ...

Journal: CoRR, 2016
Eunsol Choi, Daniel Hewlett, Alexandre Lacoste, Illia Polosukhin, Jakob Uszkoreit, Jonathan Berant

Reading an article and answering questions about its content is a fundamental task for natural language understanding. While most successful neural approaches to this problem rely on recurrent neural networks (RNNs), training RNNs over long documents can be prohibitively slow. We present a novel framework for question answering that can efficiently scale to longer documents while maintaining or...
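The abstract's point is to avoid running an RNN over the whole document. A minimal sketch of one way to do that, assumed here rather than taken from the paper, is a coarse-to-fine pipeline: a cheap lexical-overlap scorer selects a few candidate sentences, and only those are handed to an expensive reader, so cost no longer grows with full document length.

def cheap_score(question, sentence):
    # crude relevance signal: fraction of question words appearing in the sentence
    q, s = set(question.lower().split()), set(sentence.lower().split())
    return len(q & s) / (len(q) or 1)

def coarse_to_fine_answer(question, document_sentences, expensive_reader, k=3):
    ranked = sorted(document_sentences,
                    key=lambda s: cheap_score(question, s), reverse=True)
    return expensive_reader(question, ranked[:k])   # reader sees only the top-k sentences

# toy usage with a stand-in "reader" that just returns its best candidate
doc = ["The cat sat on the mat.",
       "RNNs are slow on long documents.",
       "Paris is in France."]
print(coarse_to_fine_answer("Where is Paris?", doc, lambda q, sents: sents[0]))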

2012
Tsung-Ting Kuo, San-Chuan Hung, Wei-Shih Lin, Nanyun Peng, Shou-De Lin, Wei-Fen Lin

This paper brings together two seemingly unrelated topics: natural language processing (NLP) and social network analysis (SNA). We propose a new task in SNA, predicting the diffusion of a new topic, and design a learning-based framework to solve this problem. We exploit the latent semantic information among users, topics, and social connections as features for prediction. Our framewor...
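A minimal sketch of a feature-based diffusion predictor in the spirit of this abstract (the paper's actual features and model may differ): each (user, topic) pair gets simple features, such as how many of the user's neighbours already adopted the topic and the user's past activity, and a logistic regression predicts adoption.

import numpy as np
from sklearn.linear_model import LogisticRegression

def pair_features(user, topic, adopted_by, friends, past_adoptions):
    # hypothetical features: neighbour adopters of the topic, user's past adoption count
    neighbour_adopters = sum(1 for f in friends[user] if f in adopted_by[topic])
    return [neighbour_adopters, past_adoptions[user]]

# toy data: two users, one topic
friends = {"u1": ["u2"], "u2": ["u1"]}
adopted_by = {"t1": {"u2"}}
past_adoptions = {"u1": 5, "u2": 1}

X = np.array([pair_features(u, "t1", adopted_by, friends, past_adoptions)
              for u in ["u1", "u2"]])
y = np.array([1, 0])  # hypothetical labels: did the user adopt the topic later?
clf = LogisticRegression().fit(X, y)
print(clf.predict_proba(X)[:, 1])  # predicted adoption probabilities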

2017
Qian Chen, Xiao-Dan Zhu, Zhen-Hua Ling, Si Wei, Hui Jiang, Diana Inkpen

The RepEval 2017 Shared Task aims to evaluate natural language understanding models for sentence representation, in which a sentence is represented as a fixed-length vector with neural networks and the quality of the representation is tested with a natural language inference task. This paper describes our system (alpha), which ranked among the top systems in the Shared Task, on both the in-domain test ...
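A minimal sketch of the shared-task setup, with a simple mean-pooling encoder standing in for the submitted system's sentence encoder (an assumption, not the alpha model): each sentence becomes a fixed-length vector, the two vectors are combined, and a small classifier predicts the three NLI labels.

import torch
import torch.nn as nn

class SentenceEncoder(nn.Module):
    def __init__(self, vocab_size=10000, dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)

    def forward(self, token_ids):                    # (batch, seq_len)
        return self.embed(token_ids).mean(dim=1)     # fixed-length (batch, dim)

class NLIClassifier(nn.Module):
    def __init__(self, dim=100, n_labels=3):
        super().__init__()
        self.encoder = SentenceEncoder(dim=dim)
        self.out = nn.Linear(4 * dim, n_labels)

    def forward(self, premise_ids, hypothesis_ids):
        u, v = self.encoder(premise_ids), self.encoder(hypothesis_ids)
        # standard sentence-pair features: the vectors, their difference, their product
        features = torch.cat([u, v, torch.abs(u - v), u * v], dim=1)
        return self.out(features)                    # logits: entail / neutral / contradict

model = NLIClassifier()
p = torch.randint(0, 10000, (2, 8))   # toy premise token ids
h = torch.randint(0, 10000, (2, 6))   # toy hypothesis token ids
print(model(p, h).shape)              # torch.Size([2, 3])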

Journal: CoRR, 2016
Wang Ling, Phil Blunsom, Edward Grefenstette, Karl Moritz Hermann, Tomás Kociský, Fumin Wang, Andrew Senior

Many language generation tasks require the production of text conditioned on both structured and unstructured inputs. We present a novel neural network architecture which generates an output sequence conditioned on an arbitrary number of input functions. Crucially, our approach allows both the choice of conditioning context and the granularity of generation, for example characters or tokens, to...
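A minimal sketch of the mixture-of-predictors idea the abstract describes, simplified and assumed rather than copied from the paper: at each decoding step one predictor generates from a token vocabulary, another copies from the structured input, and a learned gate mixes their distributions. The dimensions and the gate are illustrative.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MixtureDecoderStep(nn.Module):
    def __init__(self, hidden=64, vocab=500):
        super().__init__()
        self.generate = nn.Linear(hidden, vocab)   # predictor 1: generate a vocabulary token
        self.gate = nn.Linear(hidden, 2)           # learned weights over the two predictors

    def forward(self, state, copy_scores):
        # state: (batch, hidden) decoder state
        # copy_scores: (batch, vocab) pre-computed scores for copying tokens from the input
        mix = F.softmax(self.gate(state), dim=1)          # (batch, 2)
        p_gen = F.softmax(self.generate(state), dim=1)    # (batch, vocab)
        p_copy = F.softmax(copy_scores, dim=1)            # (batch, vocab)
        return mix[:, :1] * p_gen + mix[:, 1:] * p_copy   # mixed next-token distribution

step = MixtureDecoderStep()
probs = step(torch.randn(2, 64), torch.randn(2, 500))
print(probs.sum(dim=1))   # each row sums to ~1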

Journal: CoRR, 2017
Juan Andrés Laura, Gabriel Masi, Luis Argerich

In recent studies [1][13][12], recurrent neural networks were used for generative processes, and their surprising performance can be explained by their ability to make good predictions. Data compression is likewise based on prediction. The question, then, is whether a data compressor could perform as well as recurrent neural networks in natural language processin...
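A minimal sketch of the "compression as prediction" idea behind this abstract, with zlib standing in for whatever compressor the paper evaluates (an assumption): the candidate continuation that adds the fewest compressed bytes to the context is treated as the most likely one, analogous to an RNN assigning it high probability.

import zlib

def compressed_len(text: str) -> int:
    return len(zlib.compress(text.encode("utf-8"), 9))

def rank_continuations(context: str, candidates: list[str]) -> list[str]:
    # smaller increase in compressed size = better predicted continuation
    base = compressed_len(context)
    return sorted(candidates, key=lambda c: compressed_len(context + c) - base)

context = "the quick brown fox jumps over the lazy dog. the quick brown fox "
# "jumps" repeats earlier text, so it should add the fewest bytes
print(rank_continuations(context, ["jumps", "sleeps", "quantum"]))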

Journal: Journal of Environmental Accounting and Management, 2018

Journal: Geomechanics and Geophysics for Geo-Energy and Geo-Resources, 2016

Chart: number of search results per year
