Search results for: lstm

Number of results: 6907

2015
Yajie Miao Florian Metze

Long Short-Term Memory (LSTM) is a recurrent neural network (RNN) architecture specializing in modeling long-range temporal dynamics. On acoustic modeling tasks, LSTM-RNNs have shown better performance than DNNs and conventional RNNs. In this paper, we conduct an extensive study on speaker adaptation of LSTM-RNNs. Speaker adaptation helps to reduce the mismatch between acoustic models and testi...
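As a concrete reference for the gating mechanism that lets LSTMs model long-range temporal dynamics, here is a minimal single-step LSTM cell in NumPy; the dimensions, initialization, and variable names are illustrative assumptions, not taken from the paper above:

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step; gates i, f, o and candidate g are computed jointly."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b          # pre-activations for all four gates, shape (4H,)
    i = 1.0 / (1.0 + np.exp(-z[:H]))    # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2*H])) # forget gate
    o = 1.0 / (1.0 + np.exp(-z[2*H:3*H]))  # output gate
    g = np.tanh(z[3*H:])                # candidate cell state
    c = f * c_prev + i * g              # cell state: additive path carries long-range info
    h = o * np.tanh(c)                  # hidden state / output
    return h, c

# tiny usage example: input dim D=3, hidden dim H=2, 5 time steps
rng = np.random.default_rng(0)
D, H = 3, 2
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, D)):
    h, c = lstm_step(x, h, c, W, U, b)
```

The forget gate `f` multiplying `c_prev` is what distinguishes this from a conventional RNN update: gradients can flow through the cell state without repeatedly passing through a squashing nonlinearity.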

2016
Xianghua Fu Wangwang Liu Yingying Xu Chong Yu Ting Wang

Sentence-level sentiment analysis with deep learning models remains a challenging task. Long short-term memory (LSTM) networks solve the vanishing-gradient problem of recurrent neural networks (RNNs), but the LSTM's linear chain structure cannot capture text structure information. Tree-LSTM was subsequently proposed, which uses the LSTM forget gate to skip sub-trees that h...

Journal: :CoRR 2015
Chunting Zhou Chonglin Sun Zhiyuan Liu Francis C. M. Lau

Neural network models have been demonstrated to be capable of achieving remarkable performance in sentence and document modeling. Convolutional neural network (CNN) and recurrent neural network (RNN) are two mainstream architectures for such modeling tasks, which adopt totally different ways of understanding natural languages. In this work, we combine the strengths of both architectures and pro...

Journal: :CoRR 2017
AbdElRahman ElSaid Travis Desell Fatima El Jamiy James Higgins Brandon Wild

This article expands on research that has been done to develop a recurrent neural network (RNN) capable of predicting aircraft engine vibrations using long short-term memory (LSTM) neurons. LSTM RNNs can provide a more generalizable and robust method for prediction over analytical calculations of engine vibration, as analytical calculations must be solved iteratively based on specific empirical...

2016
Tri Dao

We investigate the effect of ensembling on two simple models: LSTM and bidirectional LSTM. These models are used for fine-grained sentiment classification on the Stanford Sentiment Treebank dataset. We observe that ensembling improves the classification accuracy by about 3% over single models. Moreover, the more complex model, bidirectional LSTM, benefits more from ensembling.
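The ensembling described above is commonly implemented by averaging the class-probability distributions of several trained models and taking the argmax; a minimal sketch with stand-in softmax outputs (the model count, example count, and 5-class setup mirror fine-grained SST only loosely, and all data here are random placeholders):

```python
import numpy as np

rng = np.random.default_rng(42)
n_models, n_examples, n_classes = 3, 4, 5

# stand-in for each model's softmax outputs: shape (models, examples, classes)
logits = rng.normal(size=(n_models, n_examples, n_classes))
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)

single_preds = probs.argmax(axis=-1)        # each model's own predictions
ensemble_probs = probs.mean(axis=0)         # average the probability distributions
ensemble_preds = ensemble_probs.argmax(-1)  # ensemble prediction per example
```

Averaging probabilities (rather than majority-voting hard labels) lets confident models outweigh uncertain ones, which is one plausible reason ensembles of higher-variance models such as bidirectional LSTMs benefit more.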

2016
Shyam Sundar Rajagopalan Louis-Philippe Morency Tadas Baltrusaitis Roland Göcke

Long Short-Term Memory (LSTM) networks have been successfully applied to a number of sequence learning problems but they lack the design flexibility to model multiple view interactions, limiting their ability to exploit multi-view relationships. In this paper, we propose a Multi-View LSTM (MV-LSTM), which explicitly models the view-specific and cross-view interactions over time or structured ou...

Journal: :CoRR 2016
Dingkun Long Richong Zhang Yongyi Mao

The difficulty in analyzing LSTM-like recurrent neural networks lies in the complex structure of the recurrent unit, which induces highly complex nonlinear dynamics. In this paper, we design a new simple recurrent unit, which we call Prototypical Recurrent Unit (PRU). We verify experimentally that PRU performs comparably to LSTM and GRU. This potentially enables PRU to be a prototypical example...

Journal: :CoRR 2017
Oleksii Kuchaiev Boris Ginsburg

We present two simple ways of reducing the number of parameters and accelerating the training of large Long Short-Term Memory (LSTM) networks: the first is a "matrix factorization by design" of the LSTM matrix into the product of two smaller matrices, and the second is partitioning of the LSTM matrix, its inputs, and states into independent groups. Both approaches allow us to train large LSTM ...
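As a rough illustration of the first idea, "factorization by design" replaces the LSTM's combined (4H, D+H) weight matrix with a product of two low-rank factors; the dimensions and rank below are illustrative assumptions, not values from the paper:

```python
import numpy as np

D, H, r = 512, 512, 64  # input dim, hidden dim, factorization rank (illustrative)

# parameter counts: full matrix (4H, D+H) vs. factors (4H, r) @ (r, D+H)
full_params = 4 * H * (D + H)
factored_params = 4 * H * r + r * (D + H)  # roughly 10x fewer here

# applying the factored matrix to the concatenated [input; previous hidden]
rng = np.random.default_rng(0)
xh = rng.normal(size=(D + H,))
W1 = rng.normal(size=(4 * H, r))
W2 = rng.normal(size=(r, D + H))
z = W1 @ (W2 @ xh)  # pre-activations for all four gates, shape (4H,)
```

Because the factorization is fixed before training ("by design"), both the parameter count and the per-step matrix-multiply cost shrink, rather than requiring a post-hoc compression pass.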

2016
Heiga Zen Yannis Agiomyrgiannakis Niels Egberts Fergus Henderson Przemyslaw Szczepaniak

Acoustic models based on long short-term memory recurrent neural networks (LSTM-RNNs) were applied to statistical parametric speech synthesis (SPSS) and showed significant improvements in naturalness and latency over those based on hidden Markov models (HMMs). This paper describes further optimizations of LSTM-RNN-based SPSS for deployment on mobile devices; weight quantization, multi-frame inf...

Journal: :CoRR 2017
Xiaochen Chen Lai Wei Jiaxin Xu

In this paper, we use house price data ranging from January 2004 to October 2016 to predict the average house price for November and December 2016 for each district in Beijing, Shanghai, Guangzhou, and Shenzhen. We apply the Autoregressive Integrated Moving Average (ARIMA) model to generate the baseline, while LSTM networks are used to build the prediction model. These algorithms are compared in terms of Mean Squar...

[Chart: number of search results per year]
