Search results for: lstm

Number of results: 6907

2017
Jinlei Zhang Junxiu Liu Yuling Luo Qiang Fu Jinjie Bi Senhui Qiu Yi Cao Xuemei Ding

This paper proposes a chemical substance detection method using the Long Short-Term Memory recurrent neural network (LSTM-RNN). The chemical substance data were collected with a mass spectrometer and form a time series. The classification accuracy of the LSTM-RNN classifier is 96.84%, higher than the 75.07% achieved by an ordinary feedforward neural network. The experimental results ...
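
The setup described above, an LSTM reading a time series and emitting a class label, can be illustrated with the minimal PyTorch sketch below. The feature dimension, hidden size, and number of classes are assumptions for illustration, not values from the paper.

```python
# Minimal sketch of an LSTM time-series classifier (PyTorch).
# Feature dimension, hidden size, and class count are illustrative
# assumptions, not values taken from the paper above.
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    def __init__(self, n_features=64, hidden_size=128, n_classes=10):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, time_steps, n_features) mass-spectrometer readings
        _, (h_n, _) = self.lstm(x)           # h_n: (1, batch, hidden_size)
        return self.fc(h_n[-1])              # class logits

model = LSTMClassifier()
logits = model(torch.randn(8, 100, 64))      # 8 spectra, 100 time steps each
```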

2017
Gakuto Kurata Abhinav Sethy Bhuvana Ramabhadran George Saon

While recurrent neural network language models based on Long Short-Term Memory (LSTM) have shown good gains in many automatic speech recognition tasks, Convolutional Neural Network (CNN) language models are relatively new and have not been studied in depth. In this paper we present an empirical comparison of LSTM and CNN language models on English broadcast news and various conversational telep...

2015
Xingjian Shi Zhourong Chen Hao Wang Dit-Yan Yeung Wai-Kin Wong Wang-chun Woo

The goal of precipitation nowcasting is to predict the future rainfall intensity in a local region over a relatively short period of time. Very few previous studies have examined this crucial and challenging weather forecasting problem from the machine learning perspective. In this paper, we formulate precipitation nowcasting as a spatiotemporal sequence forecasting problem in which both the in...
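
This is the work that introduced the convolutional LSTM (ConvLSTM), which replaces the fully connected transitions of a standard LSTM with convolutions so that inputs and states keep their spatial layout. The cell below is a simplified sketch of that idea; the channel counts and kernel size are assumptions, and peephole connections are omitted for brevity.

```python
# Simplified ConvLSTM cell sketch (PyTorch): the LSTM gates are computed
# with 2-D convolutions so hidden/cell states keep a (C, H, W) layout.
# Channel counts and kernel size are illustrative assumptions.
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    def __init__(self, in_ch=1, hid_ch=16, k=3):
        super().__init__()
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):
        h, c = state                               # each (B, hid_ch, H, W)
        i, f, o, g = self.gates(torch.cat([x, h], dim=1)).chunk(4, dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c = f * c + i * torch.tanh(g)              # convolutional cell update
        h = o * torch.tanh(c)
        return h, c

cell = ConvLSTMCell()
B, H, W = 2, 32, 32
h = c = torch.zeros(B, 16, H, W)
for t in range(5):                                 # e.g. radar frames over time
    h, c = cell(torch.randn(B, 1, H, W), (h, c))
```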

2016
Duyu Tang Bing Qin Ting Liu

We introduce a deep memory network for aspect-level sentiment classification. Unlike feature-based SVM and sequential neural models such as LSTM, this approach explicitly captures the importance of each context word when inferring the sentiment polarity of an aspect. These importance weights and the text representation are computed with multiple computational layers, each of which is a neural atten...
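
One attention "hop" in the spirit of such a memory network might look like the sketch below: each context word is scored against the aspect, and the weighted summary feeds the next layer. The additive scoring function and the dimensions are simplified assumptions, not the paper's exact formulation.

```python
# One attention hop over a context-word memory (illustrative sketch;
# the scoring function and sizes are assumptions, not the paper's design).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionHop(nn.Module):
    def __init__(self, dim=100):
        super().__init__()
        self.score = nn.Linear(2 * dim, 1)         # scores [word ; aspect]
        self.transform = nn.Linear(dim, dim)

    def forward(self, memory, aspect):
        # memory: (batch, n_words, dim) context-word embeddings
        # aspect: (batch, dim) current aspect representation
        expanded = aspect.unsqueeze(1).expand_as(memory)
        weights = F.softmax(self.score(torch.cat([memory, expanded], dim=-1)), dim=1)
        attended = (weights * memory).sum(dim=1)   # weighted context summary
        return attended + self.transform(aspect)   # input to the next hop
```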

2018
Boyu Wang

Human action recognition and body movement prediction are important tasks. They are different and have traditionally been addressed separately; these tasks can, however, benefit each other, and existing methods fail to capture this mutual benefit. In this paper, we propose a method for jointly recognizing the action and predicting the movement of a person. Our method is based on two L...
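
The visible excerpt cuts off before the architecture details, so the sketch below only illustrates the general idea of joint recognition and prediction: a shared LSTM encoder over observed poses with one classification head and one next-pose regression head. The sharing scheme and all sizes are assumptions, not the paper's design.

```python
# Hedged sketch of joint action recognition + movement prediction:
# shared LSTM encoder, one classification head, one regression head.
# The sharing scheme and all sizes are assumptions, not the paper's design.
import torch
import torch.nn as nn

class JointActionModel(nn.Module):
    def __init__(self, pose_dim=34, hidden=256, n_actions=20):
        super().__init__()
        self.encoder = nn.LSTM(pose_dim, hidden, batch_first=True)
        self.action_head = nn.Linear(hidden, n_actions)   # action recognition
        self.motion_head = nn.Linear(hidden, pose_dim)    # next-pose prediction

    def forward(self, poses):
        # poses: (batch, frames, pose_dim) observed body keypoints
        _, (h_n, _) = self.encoder(poses)
        h = h_n[-1]
        return self.action_head(h), self.motion_head(h)
```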

2016
Wentao Zhu Cuiling Lan Junliang Xing Wenjun Zeng Yanghao Li Li Shen Xiaohui Xie

Skeleton-based action recognition distinguishes human actions using the trajectories of skeleton joints, which provide a very good representation for describing actions. Considering that recurrent neural networks (RNNs) with Long Short-Term Memory (LSTM) can learn feature representations and model long-term temporal dependencies automatically, we propose an end-to-end fully connected deep LSTM n...
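
As a rough illustration of feeding joint trajectories to a deep LSTM, the sketch below stacks several LSTM layers over flattened per-frame joint coordinates. The joint count, depth, and plain classifier head are assumptions; the regularization and structural details of the full paper are omitted.

```python
# Rough sketch: deep (stacked) LSTM over flattened per-frame skeleton joints.
# Joint count, depth, and the plain classifier head are assumptions.
import torch
import torch.nn as nn

n_joints, coords, n_actions = 25, 3, 60       # e.g. 25 joints in 3-D
deep_lstm = nn.LSTM(n_joints * coords, 128, num_layers=3,
                    batch_first=True, dropout=0.5)
classifier = nn.Linear(128, n_actions)

frames = torch.randn(4, 100, n_joints * coords)   # (batch, time, joints*xyz)
outputs, (h_n, _) = deep_lstm(frames)
action_logits = classifier(h_n[-1])               # logits of the last layer's state
```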

Journal: CoRR 2014
Hasim Sak Andrew W. Senior Françoise Beaufays

Long Short-Term Memory (LSTM) is a recurrent neural network (RNN) architecture that has been designed to address the vanishing and exploding gradient problems of conventional RNNs. Unlike feedforward neural networks, RNNs have cyclic connections making them powerful for modeling sequences. They have been successfully used for sequence labeling and sequence prediction tasks, such as handwriting ...
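
For reference, the standard LSTM cell this abstract refers to is usually written with the gate equations below (the common formulation without peephole connections); the additive cell-state update is what mitigates the vanishing-gradient problem relative to a plain RNN.

```latex
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
```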

2017
You Zhang Hang Yuan Jin Wang Xuejie Zhang

The sentiment analysis in this task aims to indicate the sentiment intensity of four emotions (anger, fear, joy, and sadness) expressed in tweets. Compared to polarity classification, such intensity prediction provides more fine-grained sentiment analysis. In this paper, we present a system that uses a convolutional neural network with long short-term memory (CNN-LSTM) model to c...
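
A CNN-LSTM pipeline of the kind mentioned here typically runs a 1-D convolution over word embeddings, feeds the resulting features to an LSTM, and regresses a scalar intensity score. The sketch below is an illustrative layout with assumed dimensions, not the system's actual configuration.

```python
# Illustrative CNN-LSTM layout for emotion-intensity regression.
# All dimensions are assumptions, not the system's actual configuration.
import torch
import torch.nn as nn

class CNNLSTMRegressor(nn.Module):
    def __init__(self, vocab=20000, emb=300, channels=128, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.conv = nn.Conv1d(emb, channels, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(channels, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, tokens):
        # tokens: (batch, seq_len) word ids of a tweet
        x = self.embed(tokens).transpose(1, 2)       # (batch, emb, seq_len)
        x = torch.relu(self.conv(x)).transpose(1, 2) # local n-gram features
        _, (h_n, _) = self.lstm(x)
        return torch.sigmoid(self.out(h_n[-1]))      # intensity score in [0, 1]
```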

Journal: CoRR 2017
Atra Akandeh Fathi M. Salem

This is part III of a three-part work. In parts I and II, we presented eight variants of simplified Long Short-Term Memory (LSTM) recurrent neural networks (RNNs). Fast computation, especially on constrained computing resources, is an important factor in processing large time-sequence data. In this part III paper, we present and evaluate two new LSTM model variants which drama...
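
The specific variants are not described in the visible excerpt, so the sketch below only shows one well-known way to slim down an LSTM cell: coupling the input and forget gates so one gate block fewer has to be computed. It should not be read as any of the paper's actual models.

```python
# Illustrative slimmed-down LSTM cell with coupled input/forget gates
# (a known simplification; NOT necessarily one of the paper's variants).
import torch
import torch.nn as nn

class CoupledGateLSTMCell(nn.Module):
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        # three gate blocks instead of four: forget, output, candidate
        self.linear = nn.Linear(in_dim + hid_dim, 3 * hid_dim)

    def forward(self, x, state):
        h, c = state
        f, o, g = self.linear(torch.cat([x, h], dim=-1)).chunk(3, dim=-1)
        f, o = torch.sigmoid(f), torch.sigmoid(o)
        c = f * c + (1.0 - f) * torch.tanh(g)    # input gate tied to 1 - f
        h = o * torch.tanh(c)
        return h, c
```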

2015
Pengfei Liu Xipeng Qiu Xinchi Chen Shiyu Wu Xuanjing Huang

Neural network based methods have made great progress on a variety of natural language processing tasks. However, modeling long texts, such as sentences and documents, is still a challenging task. In this paper, we propose a multi-timescale long short-term memory (MT-LSTM) neural network to model long texts. MT-LSTM partitions the hidden states of the standard LSTM into several groups. Each g...
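
One hedged way to express the multi-timescale idea described here is a masked update in which each group of hidden units is refreshed only when the time step is a multiple of its period, while slower groups keep their previous state. The grouping, the periods, and the masking scheme below are assumptions, not the paper's exact formulation.

```python
# Hedged sketch of a multi-timescale update: the hidden/cell state is split
# into groups, and a group is refreshed only when t is a multiple of its
# period. Group sizes and periods are assumptions, not the paper's scheme.
import torch
import torch.nn as nn

hidden_size = 120
periods = [1, 2, 4]                                   # one period per group
group = hidden_size // len(periods)
cell = nn.LSTMCell(50, hidden_size)

x = torch.randn(8, 30, 50)                            # (batch, time, features)
h = c = torch.zeros(8, hidden_size)
for t in range(x.size(1)):
    new_h, new_c = cell(x[:, t], (h, c))
    mask = torch.cat([torch.full((group,), float(t % p == 0)) for p in periods])
    h = mask * new_h + (1 - mask) * h                 # slow groups keep old state
    c = mask * new_c + (1 - mask) * c
```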
