Search results for: lstm

Number of results: 6907  

Journal: Bioinformatics, 2007
Sepp Hochreiter, Martin Heusel, Klaus Obermayer

MOTIVATION: As more genomes are sequenced, the demand for fast gene classification techniques is increasing. To analyze a newly sequenced genome, first the genes are identified and translated into amino acid sequences, which are then classified into structural or functional classes. The best-performing protein classification methods are based on protein homology detection using sequence alignment...

2016
Xiaodan Liang, Xiaohui Shen, Jiashi Feng, Liang Lin, Shuicheng Yan

By taking the semantic object parsing task as an exemplar application scenario, we propose the Graph Long Short-Term Memory (Graph LSTM) network, which generalizes the LSTM from sequential or multi-dimensional data to general graph-structured data. In particular, instead of evenly and rigidly dividing an image into pixels or patches, as in existing multi-dimensional LSTM structures (e.g.,...

Journal: CoRR, 2015
Nal Kalchbrenner, Ivo Danihelka, Alex Graves

This paper introduces Grid Long Short-Term Memory, a network of LSTM cells arranged in a multidimensional grid that can be applied to vectors, sequences or higher dimensional data such as images. The network differs from existing deep LSTM architectures in that the cells are connected between network layers as well as along the spatiotemporal dimensions of the data. The network provides a unifi...

Journal: Bioscience Trends, 2017
Jie Zhang, Kazumitsu Nawata

Worldwide, influenza is estimated to result in approximately 3 to 5 million annual cases of severe illness and approximately 250,000 to 500,000 deaths. We need an accurate time-series model to predict the number of influenza patients. Although time-series models with different time lags as feature spaces could lead to varied accuracy, past studies simply adopted a time lag in their models witho...
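The abstract's point that the choice of time lag changes the feature space can be made concrete. The following sketch is illustrative only (the function and data are hypothetical, not from the paper): it builds lagged feature/target pairs from a univariate case-count series so models with different lags can be compared.

```python
# Illustrative sketch, not the paper's model: construct a lagged feature
# matrix from a univariate time series. Each target value is paired with
# the `lag` observations that precede it.

def make_lag_features(series, lag):
    """Return (features, target) pairs where features are the `lag`
    previous values of `series` for each target value."""
    return [(series[i - lag:i], series[i]) for i in range(lag, len(series))]

if __name__ == "__main__":
    weekly_cases = [10, 12, 15, 14, 18, 20]  # hypothetical counts
    print(make_lag_features(weekly_cases, 2))
    # → [([10, 12], 15), ([12, 15], 14), ([15, 14], 18), ([14, 18], 20)]
```

Sweeping `lag` over a range and scoring each resulting dataset is one simple way to select the lag empirically rather than fixing it in advance.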

2015
Xiangang Li, Xihong Wu

Long short-term memory (LSTM) recurrent neural networks (RNNs) have been shown to give state-of-the-art performance on many speech recognition tasks, as they can learn a dynamically changing contextual window over the whole sequence history. On the other hand, convolutional neural networks (CNNs) have brought significant improvements to deep feed-forward neural networks (FFNNs),...

2017
Mohamed Morchid

Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNN) require 4 gates to learn short- and long-term dependencies for a given sequence of basic elements. Recently, the “Gated Recurrent Unit” (GRU) has been introduced; it requires fewer gates than the LSTM (reset and update gates) to code short- and long-term dependencies, and reaches performance equivalent to the LSTM with less processing time during ...
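The gate-count difference the abstract describes translates directly into parameter counts. A minimal sketch, under the standard formulation (not code from the paper): an LSTM computes 4 weight blocks per step (input, forget, and output gates plus the cell candidate), while a GRU computes 3 (reset and update gates plus the hidden candidate), which is where its lower processing cost comes from.

```python
# Sketch of per-cell parameter counts for standard LSTM vs. GRU cells.
# Each block has input weights (input_size x hidden_size), recurrent
# weights (hidden_size x hidden_size), and a bias vector.

def lstm_params(input_size: int, hidden_size: int) -> int:
    # 4 blocks: input gate, forget gate, output gate, cell candidate.
    return 4 * (input_size * hidden_size + hidden_size * hidden_size + hidden_size)

def gru_params(input_size: int, hidden_size: int) -> int:
    # 3 blocks: reset gate, update gate, hidden candidate.
    return 3 * (input_size * hidden_size + hidden_size * hidden_size + hidden_size)

if __name__ == "__main__":
    print(lstm_params(128, 256))  # → 394240
    print(gru_params(128, 256))   # → 295680
```

The GRU thus carries 3/4 of the LSTM's parameters for the same input and hidden sizes, independent of those sizes.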

2016
Yangyang Shi, Kaisheng Yao, Le Tian, Daxin Jiang

Traditional convolutional neural network (CNN) based query classification uses linear feature mapping in its convolution operation. The recurrent neural network (RNN) differs from a CNN in representing a word sequence with its ordering information kept explicitly. We propose using a deep long-short-term-memory (DLSTM) based feature mapping to learn feature representation for CNN. The DLSTM, wh...

2018
Shuo Wang, Zhe Li, Caiwen Ding, Bo Yuan, Qinru Qiu, Yanzhi Wang, Yun Liang

Recently, significant accuracy improvement has been achieved for acoustic recognition systems by increasing the model size of Long Short-Term Memory (LSTM) networks. Unfortunately, the ever-increasing size of the LSTM model leads to inefficient designs on FPGAs due to the limited on-chip resources. The previous work proposes to use a pruning-based compression technique to reduce the model size and thu...
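The pruning-based compression the abstract mentions is commonly done by zeroing the smallest-magnitude weights. The sketch below shows this general idea in its simplest form; it is an assumption-laden illustration, not the compression scheme of the paper, and the function name and data are hypothetical.

```python
# Minimal sketch of magnitude-based pruning: remove a fraction of a
# weight vector by zeroing its smallest-magnitude entries, reducing the
# number of nonzero values that must be stored (e.g. in sparse form on
# an FPGA with limited on-chip memory).

def prune_by_magnitude(weights, sparsity):
    """Return a copy of `weights` with the smallest |w| set to 0.0.

    `sparsity` is the fraction of weights to remove, in [0.0, 1.0].
    """
    flat = sorted(abs(w) for w in weights)
    k = int(len(flat) * sparsity)          # how many weights to drop
    threshold = flat[k - 1] if k > 0 else -1.0
    return [0.0 if abs(w) <= threshold else w for w in weights]

if __name__ == "__main__":
    w = [0.9, -0.05, 0.4, -0.8, 0.01, 0.3]
    print(prune_by_magnitude(w, 0.5))
    # → [0.9, 0.0, 0.4, -0.8, 0.0, 0.0]
```

After pruning, only the surviving nonzero weights (plus their indices) need to be kept, which is what shrinks the model's memory footprint.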

Journal: CoRR, 2017
Stephen Merity, Nitish Shirish Keskar, Richard Socher

In this paper, we consider the specific problem of word-level language modeling and investigate strategies for regularizing and optimizing LSTM-based models. We propose the weight-dropped LSTM, which uses DropConnect on hidden-to-hidden weights as a form of recurrent regularization. Further, we introduce NT-AvSGD, a non-monotonically triggered (NT) variant of the averaged stochastic gradient met...
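The DropConnect idea behind the weight-dropped LSTM can be sketched briefly: unlike standard dropout, which zeroes activations, DropConnect zeroes individual entries of a weight matrix (here, the hidden-to-hidden matrix). This is an illustrative toy version under that assumption, not the authors' implementation.

```python
# Sketch of DropConnect on a weight matrix: each weight is zeroed
# independently with probability `drop_prob` before the matrix is used
# in a forward pass. Applying this to the recurrent (hidden-to-hidden)
# matrix regularizes the recurrent connections themselves.

import random

def dropconnect(weight_matrix, drop_prob, rng=None):
    """Return a copy of `weight_matrix` with entries randomly zeroed."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    return [[0.0 if rng.random() < drop_prob else w for w in row]
            for row in weight_matrix]
```

With `drop_prob = 0.0` the matrix is unchanged, and with `drop_prob = 1.0` it is fully zeroed; in practice the same mask would be reused across the time steps of a sequence.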

2017
Ning Chen, Shijun Wang

Although Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) have yielded impressive performance in a variety of Music Information Retrieval (MIR) tasks, the complementarity among CNNs of different architectures, and that between CNNs and LSTM, is seldom considered. In this paper, multichannel CNNs with different architectures and an LSTM are combined into one unified archit...

[Chart: number of search results per year]
