Search results for: lstm

Number of results: 6907

2015
Søren Kaae Sønderby, Casper Kaae Sønderby, Henrik Nielsen, Ole Winther

Machine learning is widely used to analyze biological sequence data. Non-sequential models such as SVMs or feed-forward neural networks are often used, although they have no natural way of handling sequences of varying length. Recurrent neural networks such as the long short-term memory (LSTM) model, on the other hand, are designed to handle sequences. In this study we demonstrate that LSTM networ...
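
As an illustration of the point about varying sequence length, here is a minimal sketch of an LSTM consuming sequences of different lengths and producing one fixed-size vector per sequence. It assumes PyTorch with integer-encoded toy sequences and placeholder dimensions; it is not the authors' code.

import torch
import torch.nn as nn
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence

# Three toy sequences of different lengths, each token an integer id.
seqs = [torch.tensor([1, 4, 2, 7]), torch.tensor([3, 3]), torch.tensor([5, 1, 2])]
lengths = torch.tensor([len(s) for s in seqs])

embed = nn.Embedding(num_embeddings=25, embedding_dim=16)
lstm = nn.LSTM(input_size=16, hidden_size=32, batch_first=True)

padded = pad_sequence(seqs, batch_first=True)           # (batch, max_len)
packed = pack_padded_sequence(embed(padded), lengths,
                              batch_first=True, enforce_sorted=False)
_, (h_n, _) = lstm(packed)                              # h_n: (1, batch, 32)
features = h_n[-1]                                      # one fixed-size vector per sequence
print(features.shape)                                   # torch.Size([3, 32])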

2016
Duyu Tang, Bing Qin, Xiaocheng Feng, Ting Liu

Target-dependent sentiment classification remains a challenge: modeling the semantic relatedness of a target with its context words in a sentence. Different context words have different influences on the sentiment polarity of a sentence towards the target. Therefore, it is desirable to integrate the connections between the target word and its context words when building a learning system. I...
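
A minimal sketch of the general target-dependent idea: model the context to the left and to the right of the target with two LSTMs and classify from their final states. The class name, dimensions, and input conventions are assumptions for illustration, not the paper's implementation.

import torch
import torch.nn as nn

class TargetDependentLSTM(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=100, hidden=64, classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.left_lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.right_lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.out = nn.Linear(2 * hidden, classes)

    def forward(self, left_ids, right_ids):
        # left_ids: words from sentence start through the target (left to right)
        # right_ids: words from sentence end through the target (right to left)
        _, (h_left, _) = self.left_lstm(self.embed(left_ids))
        _, (h_right, _) = self.right_lstm(self.embed(right_ids))
        return self.out(torch.cat([h_left[-1], h_right[-1]], dim=-1))

model = TargetDependentLSTM()
logits = model(torch.randint(0, 10000, (2, 6)), torch.randint(0, 10000, (2, 5)))
print(logits.shape)  # torch.Size([2, 3])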

2016
Yao Tian, Liang He, Yi Liu, Jia Liu

Recently, the integration of deep neural networks (DNNs) trained to predict senone posteriors with conventional language modeling methods has proven effective for spoken language recognition. This work extends some of the senone-based DNN frameworks by replacing the DNN with an LSTM RNN. Two of these approaches use the LSTM RNN to generate features. The features are extracted from the rec...
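
A minimal sketch of using an LSTM as a feature extractor over acoustic frames, with a pooled hidden state fed to a language classifier. This is a generic stand-in under assumed dimensions, not the senone-based recipe the abstract describes.

import torch
import torch.nn as nn

frames = torch.randn(4, 200, 40)                 # (batch, frames, filterbank dims)
lstm = nn.LSTM(input_size=40, hidden_size=128, batch_first=True)
classifier = nn.Linear(128, 10)                  # e.g. 10 target languages

outputs, _ = lstm(frames)                        # (4, 200, 128) frame-level features
utt_vec = outputs.mean(dim=1)                    # average pooling over frames
print(classifier(utt_vec).shape)                 # torch.Size([4, 10])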

Journal: CoRR 2015
Duyu Tang, Bing Qin, Xiaocheng Feng, Ting Liu

Target-dependent sentiment classification remains a challenge: modeling the semantic relatedness of a target with its context words in a sentence. Different context words have different influences on the sentiment polarity of a sentence towards the target. Therefore, it is desirable to integrate the connections between the target word and its context words when building a learning system. I...

2015
Ebru Arisoy, Murat Saraclar

Long Short-Term Memory (LSTM) neural networks are recurrent neural networks that contain memory units that can store contextual information from past inputs for arbitrary amounts of time. A typical LSTM neural network language model is trained by feeding an input sequence, i.e., a stream of words, to the input layer of the network, while the output layer predicts the probability of the next word g...
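
A minimal sketch of that training setup, assuming PyTorch and placeholder dimensions: the word stream is shifted by one position so that each hidden state is trained to predict the next word.

import torch
import torch.nn as nn

vocab, emb, hidden = 5000, 128, 256
embed = nn.Embedding(vocab, emb)
lstm = nn.LSTM(emb, hidden, batch_first=True)
proj = nn.Linear(hidden, vocab)
loss_fn = nn.CrossEntropyLoss()

words = torch.randint(0, vocab, (8, 35))         # (batch, sequence length)
inputs, targets = words[:, :-1], words[:, 1:]    # predict word t+1 from words <= t
hidden_states, _ = lstm(embed(inputs))
logits = proj(hidden_states)                     # (8, 34, vocab)
loss = loss_fn(logits.reshape(-1, vocab), targets.reshape(-1))
print(loss.item())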

2016
Zhenlong Yu, Caixia Yuan, Xiaojie Wang, Guohua Yang

This paper presents a dialogue response generator based on long short-term memory (LSTM) neural networks for the SLG (Spoken Language Generation) pilot task of DSTC5 [1]. We first encode the input, which contains a varying number of semantic units, into a fixed-length semantic vector with an LSTM encoder. Then we decode the semantic vector with an LSTM variant and generate the corresponding text. In order t...
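
A minimal sketch of the encode-then-decode pattern the abstract outlines, assuming PyTorch: an LSTM encoder compresses a variable number of semantic-unit embeddings into a fixed-length state, and an LSTM decoder generates tokens from it. This is a generic seq2seq skeleton with placeholder dimensions and an assumed <bos> token id, not the DSTC5 system itself.

import torch
import torch.nn as nn

emb, hidden, vocab = 64, 128, 2000
encoder = nn.LSTM(emb, hidden, batch_first=True)
decoder = nn.LSTM(emb, hidden, batch_first=True)
word_embed = nn.Embedding(vocab, emb)
proj = nn.Linear(hidden, vocab)

semantic_units = torch.randn(1, 5, emb)          # 5 semantic-unit embeddings
_, enc_state = encoder(semantic_units)           # fixed-length semantic state (h, c)

token = torch.tensor([[1]])                      # assumed <bos> id
state, generated = enc_state, []
for _ in range(10):                              # greedy decoding, 10 steps max
    out, state = decoder(word_embed(token), state)
    token = proj(out[:, -1]).argmax(dim=-1, keepdim=True)
    generated.append(token.item())
print(generated)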

2018
Chase Roberts, Manish Nair

We propose a simple mathematical definition and a new neural architecture for finding anomalies within discrete sequence datasets. Our model comprises a modified LSTM autoencoder and an array of One-Class SVMs. The LSTM takes in elements from a sequence and creates context vectors that are used to predict the probability distribution of the following element. These context vectors are then use...
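
A minimal sketch of the two-stage idea, assuming PyTorch and scikit-learn: LSTM context vectors from normal sequences are fit with a One-Class SVM, which then scores the context vectors of a new sequence. Dimensions and the nu parameter are placeholders, not the paper's configuration.

import torch
import torch.nn as nn
from sklearn.svm import OneClassSVM

lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

normal_batch = torch.randn(32, 20, 8)            # stand-in for encoded normal sequences
with torch.no_grad():
    contexts, _ = lstm(normal_batch)             # (32, 20, 16) context vectors
ocsvm = OneClassSVM(nu=0.05).fit(contexts.reshape(-1, 16).numpy())

test_seq = torch.randn(1, 20, 8)
with torch.no_grad():
    test_ctx, _ = lstm(test_seq)
scores = ocsvm.predict(test_ctx.reshape(-1, 16).numpy())   # +1 normal, -1 anomalous
print((scores == -1).sum(), "anomalous steps")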

2002
Juan Antonio Pérez-Ortiz, Jürgen Schmidhuber, Felix A. Gers, Douglas Eck

Long Short-Term Memory (LSTM) recurrent neural networks (RNNs) outperform traditional RNNs when dealing with sequences involving not only short-term but also long-term dependencies. The decoupled extended Kalman filter learning algorithm (DEKF) works well in online environments and significantly reduces the number of training steps compared to standard gradient-descent algorithms. Prev...
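
For reference, the standard extended Kalman filter weight update that DEKF applies block-wise per weight group looks roughly as follows; the notation is assumed (not copied from the paper), with $H_t$ the matrix of derivatives of the network outputs with respect to the weights, $d_t$ the target, and $y_t$ the network output:

$K_t = P_t H_t \left[ R_t + H_t^\top P_t H_t \right]^{-1}$
$w_{t+1} = w_t + K_t \, (d_t - y_t)$
$P_{t+1} = P_t - K_t H_t^\top P_t + Q_t$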

Journal: CoRR 2016
Ben Krause, Liang Lu, Iain Murray, Steve Renals

We introduce multiplicative LSTM (mLSTM), a novel recurrent neural network architecture for sequence modelling that combines the long short-term memory (LSTM) and multiplicative recurrent neural network architectures. mLSTM is characterised by its ability to have different recurrent transition functions for each possible input, which we argue makes it more expressive for autoregressive density ...
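
A minimal sketch of the core multiplicative idea as described in the abstract: an input-dependent intermediate state replaces the previous hidden state inside the gates, so the recurrent transition differs per input. The gate arrangement and nonlinearities follow a common LSTM formulation and may differ in detail from the paper.

import torch
import torch.nn as nn

class MultiplicativeLSTMCell(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.wmx = nn.Linear(input_size, hidden_size, bias=False)
        self.wmh = nn.Linear(hidden_size, hidden_size, bias=False)
        self.gates_x = nn.Linear(input_size, 4 * hidden_size)
        self.gates_m = nn.Linear(hidden_size, 4 * hidden_size)

    def forward(self, x, h, c):
        m = self.wmx(x) * self.wmh(h)                       # input-dependent transition
        i, f, o, g = (self.gates_x(x) + self.gates_m(m)).chunk(4, dim=-1)
        i, f, o, g = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o), torch.tanh(g)
        c = f * c + i * g
        return o * torch.tanh(c), c

cell = MultiplicativeLSTMCell(10, 20)
h, c = torch.zeros(3, 20), torch.zeros(3, 20)
h, c = cell(torch.randn(3, 10), h, c)
print(h.shape)  # torch.Size([3, 20])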

2002
Felix A. Gers, Juan Antonio Pérez-Ortiz, Douglas Eck, Jürgen Schmidhuber

Unlike traditional recurrent neural networks, the Long Short-Term Memory (LSTM) model generalizes well when presented with training sequences derived from regular and also simple non-regular languages. Our novel combination of LSTM and the decoupled extended Kalman filter, however, learns even faster and generalizes even better, requiring only the 10 shortest exemplars (n ≤ 10) of the context sen...

Chart: number of search results per year
