Search results for: natural network

Number of results: 1,128,616

Journal: Decision Support Systems, 2017
Joerg Evermann, Jana-Rebecca Rehse, Peter Fettke

Predicting business process behaviour is an important aspect of business process management. Motivated by research in natural language processing, this paper describes an application of deep learning with recurrent neural networks to the problem of predicting the next event in a business process. This is both a novel method in process prediction, which has largely relied on explicit process mod...
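The next-event prediction this abstract describes can be gestured at with a toy recurrent network over a trace of event labels. Everything below (event names, dimensions, untrained random weights) is an illustrative assumption, not the paper's model:

```python
import numpy as np

# Toy sketch: predict the next event in a business-process trace with a plain RNN.
# Event vocabulary, sizes, and weights are hypothetical and untrained.
rng = np.random.default_rng(1)
events = ["receive_order", "check_stock", "ship", "invoice"]  # example event types
V, H = len(events), 8                     # vocabulary and hidden sizes (assumptions)
E = rng.standard_normal((V, H)) * 0.1     # event embeddings
Wh = rng.standard_normal((H, H)) * 0.1    # recurrent weights
Wo = rng.standard_normal((H, V)) * 0.1    # output projection

def predict_next(trace):
    """Run the trace through the RNN and return a distribution over next events."""
    h = np.zeros(H)
    for e in trace:
        h = np.tanh(E[events.index(e)] + Wh @ h)  # fold each event into the state
    logits = h @ Wo
    return np.exp(logits) / np.exp(logits).sum()  # softmax over event types

p = predict_next(["receive_order", "check_stock"])
print(events[int(np.argmax(p))])  # most likely next event under this untrained model
```

A real system would train these weights on an event log; the sketch only shows the sequence-to-distribution mechanics.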

2015
Qiao Qian, Bo Tian, Minlie Huang, Yang Liu, Xuan Zhu, Xiaoyan Zhu

The recursive neural network is one of the most successful deep learning models for natural language processing due to the compositional nature of text. The model recursively composes the vector of a parent phrase from those of child words or phrases, with a key component named the composition function. Although a variety of composition functions have been proposed, the syntactic information has not be...
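The composition step this abstract refers to can be sketched as p = tanh(W[l; r] + b), a parent vector built from two child vectors. The dimension and random weights below are illustrative assumptions, not any particular paper's composition function:

```python
import numpy as np

# Minimal sketch of one recursive-neural-network composition step.
rng = np.random.default_rng(0)
d = 4                                  # embedding dimension (assumption)
W = rng.standard_normal((d, 2 * d))    # composition weight matrix
b = np.zeros(d)                        # bias

def compose(left, right):
    """Compose a parent vector from two children: p = tanh(W [l; r] + b)."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

# Compose "very good" from its word vectors, then the result with "movie".
very, good, movie = (rng.standard_normal(d) for _ in range(3))
very_good = compose(very, good)
sentence = compose(very_good, movie)
print(sentence.shape)  # the parent vector keeps the children's dimension
```

Because parent and child vectors share a dimension, the same function applies recursively up the parse tree.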

2017
Van-Khanh Tran, Le-Minh Nguyen, Satoshi Tojo

Natural language generation (NLG) is an important component in spoken dialogue systems. This paper presents a model called Encoder-Aggregator-Decoder, which is an extension of a Recurrent Neural Network-based Encoder-Decoder architecture. The proposed Semantic Aggregator consists of two components: an Aligner and a Refiner. The Aligner is a conventional attention calculated over the encoded inp...
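"Conventional attention calculated over the encoded input" can be sketched as dot-product attention: score each encoded position against a query, softmax the scores, and take a weighted sum. The function name and dimensions here are illustrative assumptions, not the paper's Aligner:

```python
import numpy as np

# Hedged sketch of conventional dot-product attention over encoded inputs.
def attend(query, keys, values):
    """Weight each value by softmax(key . query) and return the weighted sum."""
    scores = keys @ query                    # one score per encoded position
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ values                  # context vector

rng = np.random.default_rng(2)
keys = rng.standard_normal((5, 3))    # 5 encoded positions, dim 3 (assumptions)
values = rng.standard_normal((5, 3))
query = rng.standard_normal(3)
context = attend(query, keys, values)
print(context.shape)
```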

2017
Jan A. Botha, Emily Pitler, Ji Ma, Anton Bakalov, Alex Salcianu, David Weiss, Ryan T. McDonald, Slav Petrov

We show that small and shallow feedforward neural networks can achieve near state-of-the-art results on a range of unstructured and structured language processing tasks while being considerably cheaper in memory and computational requirements than deep recurrent models. Motivated by resource-constrained environments like mobile phones, we showcase simple techniques for obtaining such small neur...

2016
Xiaocheng Feng, Lifu Huang, Duyu Tang, Heng Ji, Bing Qin, Ting Liu

Event detection remains a challenge due to the difficulty of encoding word semantics in various contexts. Previous approaches heavily depend on language-specific knowledge and pre-existing natural language processing (NLP) tools. However, compared to English, not all languages have such resources and tools available. A more promising approach is to automatically learn effective features from...

2014
Siyu Qiu, Qing Cui, Jiang Bian, Bin Gao, Tie-Yan Liu

The techniques of using neural networks to learn distributed word representations (i.e., word embeddings) have been used to solve a variety of natural language processing tasks. The recently proposed methods, such as CBOW and Skip-gram, have demonstrated their effectiveness in learning word embeddings based on context information such that the obtained word embeddings can capture both semantic ...
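The "context information" CBOW and Skip-gram learn from comes from sliding a window over text. A minimal sketch of how Skip-gram forms (target, context) training pairs, with a hypothetical helper name:

```python
# Hypothetical helper showing how Skip-gram derives training pairs
# from a context window (window size is a tunable assumption).
def skipgram_pairs(tokens, window=2):
    """Return (target, context) pairs for every token and its neighbors."""
    pairs = []
    for i, target in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

print(skipgram_pairs(["the", "cat", "sat"], window=1))
# → [('the', 'cat'), ('cat', 'the'), ('cat', 'sat'), ('sat', 'cat')]
```

Each pair becomes one training example: predict the context word from the target's embedding. CBOW inverts this, predicting the target from the averaged context.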

2017
Van-Khanh Tran, Le-Minh Nguyen

Natural language generation (NLG) plays a critical role in spoken dialogue systems. This paper presents a new approach to NLG by using recurrent neural networks (RNN), in which a gating mechanism is applied before RNN computation. This allows the proposed model to generate appropriate sentences. The RNN-based generator can be learned from unaligned data by jointly training sentence planning and...

2016
Zhe Wang, Wei He, Hua Wu, Haiyang Wu, Wei Li, Haifeng Wang, Enhong Chen

Chinese poetry generation is a very challenging task in natural language processing. In this paper, we propose a novel two-stage poetry generating method which first plans the sub-topics of the poem according to the user’s writing intent, and then generates each line of the poem sequentially, using a modified recurrent neural network encoder-decoder framework. The proposed planning-based method ...

2017
Yinchong Yang, Denis Krompass, Volker Tresp

Recurrent Neural Networks and their variants have shown promising performance in sequence modeling tasks such as Natural Language Processing. These models, however, turn out to be impractical and difficult to train when exposed to very high-dimensional inputs due to the large input-to-hidden weight matrix. This may have prevented RNNs’ large-scale application in tasks that involve very hig...
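The scale of the input-to-hidden matrix problem is easy to quantify; one generic remedy (not necessarily the one this abstract proposes) is to factorize that matrix into two low-rank factors. The sizes below are illustrative assumptions:

```python
# Back-of-the-envelope parameter counts for a dense vs. low-rank factored
# input-to-hidden matrix (all sizes are illustrative assumptions).
input_dim, hidden, rank = 100_000, 256, 16

full_params = input_dim * hidden                    # dense W: 25.6M parameters
factored_params = input_dim * rank + rank * hidden  # W ≈ A @ B: ~1.6M parameters

print(full_params, factored_params)  # the factored form is roughly 16x smaller
```

The same arithmetic explains why high-dimensional inputs dominate an RNN's memory footprint: the recurrent matrix is only hidden × hidden, while the input matrix scales with the input dimension.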

1997
Douglas L. T. Rohde, David C. Plaut

Prediction is believed to be an important component of cognition, particularly in the processing of natural language. It has long been accepted that recurrent neural networks are best able to learn prediction tasks when trained on simple examples before incrementally proceeding to more complex sentences. Furthermore, the counter-intuitive suggestion has been made that networks and, by implicati...
