Search results for: LSTM

Number of results: 6907

Thesis: Ministry of Science, Research and Technology - Shiraz University - Faculty of Sciences, 1391

In many applications, time is a determining factor, in that the output of a system at each instant is a function of the system's current input as well as the system's outputs at previous times. In some cases, the output may even depend on the system's inputs at previous times. To model such systems with neural networks, representing time in the operation of these networks is unavoidable. Time appears in the operation of neural networks in two ways...
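The truncated passage describes systems whose output at each step depends on the current input and on earlier outputs. A minimal NumPy sketch of such a recurrent update (weights and sizes are illustrative, not from the thesis):

```python
import numpy as np

def recurrent_step(W_in, W_rec, x_t, y_prev):
    """One step of a simple recurrent system:
    y_t depends on the current input x_t and the previous output y_prev."""
    return np.tanh(W_in @ x_t + W_rec @ y_prev)

rng = np.random.default_rng(0)
W_in = rng.standard_normal((3, 2)) * 0.1   # input weights (illustrative)
W_rec = rng.standard_normal((3, 3)) * 0.1  # recurrent weights (illustrative)
y = np.zeros(3)                            # initial output
for t in range(5):                         # unroll over 5 time steps
    x_t = rng.standard_normal(2)
    y = recurrent_step(W_in, W_rec, x_t, y)
```

Because each `y` feeds into the next step, the final output carries information from the whole input history.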

Long-term forecasting of time series is an important and challenging problem. Today, deep networks, and in particular long short-term memory (LSTM) networks, have been applied successfully to time-series forecasting. LSTM networks preserve long-term dependencies, but their ability to assign different degrees of attention to sub-window features over several time steps is insufficient. Moreover, the performance of these networks depends heavily on the values of the hyperparameters...
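For reference alongside these forecasting abstracts, the standard LSTM cell update — the gating mechanism that preserves long-term dependencies — can be sketched in NumPy. Bias terms are omitted and all weights are random placeholders, so this is the textbook cell, not any specific paper's variant:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(params, x_t, h_prev, c_prev):
    """One step of a standard LSTM cell: gates decide what the memory
    cell c keeps, forgets, and exposes as the hidden state h."""
    Wf, Wi, Wo, Wg = params      # each maps [x_t; h_prev] to hidden size
    z = np.concatenate([x_t, h_prev])
    f = sigmoid(Wf @ z)          # forget gate
    i = sigmoid(Wi @ z)          # input gate
    o = sigmoid(Wo @ z)          # output gate
    g = np.tanh(Wg @ z)          # candidate cell update
    c = f * c_prev + i * g       # new cell state (long-term memory)
    h = o * np.tanh(c)           # new hidden state (short-term output)
    return h, c

rng = np.random.default_rng(1)
n_in, n_hid = 4, 8
params = [rng.standard_normal((n_hid, n_in + n_hid)) * 0.1 for _ in range(4)]
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(10):              # run over a short input window
    h, c = lstm_step(params, rng.standard_normal(n_in), h, c)
```

The additive `f * c_prev + i * g` update is what lets gradients flow across many steps without vanishing as quickly as in a plain RNN.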

Journal: CoRR 2017
Fei Tao Gang Liu

Long short-term memory (LSTM) is normally used as the basic recurrent unit in recurrent neural networks (RNNs). However, conventional LSTM assumes that the state at the current time step depends on the previous time step. This assumption constrains the time-dependency modeling capability. In this study, we propose a new variation of LSTM, advanced LSTM (A-LSTM), for better temporal context modeling. We empl...

Journal: CoRR 2015
Zhiheng Huang Wei Xu Kai Yu

In this paper, we propose a variety of Long Short-Term Memory (LSTM) based models for sequence tagging. These models include LSTM networks, bidirectional LSTM (BI-LSTM) networks, LSTM with a Conditional Random Field (CRF) layer (LSTM-CRF) and bidirectional LSTM with a CRF layer (BI-LSTM-CRF). Our work is the first to apply a bidirectional LSTM CRF (denoted as BI-LSTM-CRF) model to NLP benchmark...
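In a (BI-)LSTM-CRF tagger like the one above, the LSTM produces per-token emission scores and the CRF layer adds transition scores between adjacent tags; the best tag sequence is then found with Viterbi decoding. A minimal sketch of that decoding step (the emission and transition scores here are toy values, not from a trained model):

```python
import numpy as np

def viterbi(emissions, transitions):
    """Best tag path under emission scores (T x K, e.g. from a BiLSTM)
    plus transition scores (K x K) between adjacent tags."""
    T, K = emissions.shape
    score = emissions[0].copy()           # best score ending in each tag
    back = np.zeros((T, K), dtype=int)    # backpointers
    for t in range(1, T):
        cand = score[:, None] + transitions      # prev tag -> next tag
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + emissions[t]
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):         # follow backpointers
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy example: 3 tokens, 2 tags; transitions strongly discourage tag 1 -> 1.
emis = np.array([[2.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
trans = np.array([[0.0, 0.0], [0.0, -5.0]])
best = viterbi(emis, trans)
```

The transition matrix is what the CRF layer contributes: it scores tag-to-tag patterns that per-token emissions alone cannot capture.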

2017
Jaeyoung Kim Mostafa El-Khamy Jungwon Lee

In this paper, a novel architecture for a deep recurrent neural network, residual LSTM, is introduced. A plain LSTM has an internal memory cell that can learn long-term dependencies of sequential data. It also provides a temporal shortcut path to avoid vanishing or exploding gradients in the temporal domain. The residual LSTM provides an additional spatial shortcut path from lower layers for eff...
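The spatial shortcut described above can be illustrated with a toy residual stack. Here `layer_step` is a hypothetical stand-in for one recurrent layer's per-step transform, not the paper's architecture:

```python
import numpy as np

def layer_step(W, x):
    """Stand-in for one recurrent layer's per-step transform."""
    return np.tanh(W @ x)

def residual_stack(Ws, x):
    """Each layer adds an identity (spatial) shortcut from its input,
    so signal can bypass the nonlinearity when flowing up the stack."""
    h = x
    for W in Ws:
        h = layer_step(W, h) + h   # residual shortcut across the layer
    return h

rng = np.random.default_rng(2)
d = 6
Ws = [rng.standard_normal((d, d)) * 0.1 for _ in range(4)]
x = rng.standard_normal(d)
out = residual_stack(Ws, x)
```

With all weights zero the stack reduces to the identity map, which is exactly the property that keeps gradients from vanishing across depth.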

2017
Tiehang Duan Sargur N. Srihari

A deep network structure is formed in which LSTM layers and convolutional layers interweave with each other. The Layerwise Interweaving Convolutional LSTM (LIC-LSTM) enhances the feature-extraction ability of an LSTM stack and is capable of versatile sequential data modeling. Its unique network structure allows it to extract higher-level features with sequential information involved. Experiment results...
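The interweaving idea — alternating convolutional and recurrent layers — can be sketched on a toy scalar sequence. The smoothing kernel and recurrent weights below are illustrative placeholders, not the LIC-LSTM parameters:

```python
import numpy as np

def conv1d(seq, kernel):
    """Same-length 1-D convolution over a scalar sequence (local features)."""
    pad = len(kernel) // 2
    padded = np.pad(seq, pad)
    return np.array([padded[t:t + len(kernel)] @ kernel
                     for t in range(len(seq))])

def recurrent_pass(seq, w_in=0.5, w_rec=0.5):
    """Simple recurrent layer: h_t = tanh(w_in*x_t + w_rec*h_{t-1})."""
    h, out = 0.0, []
    for x in seq:
        h = np.tanh(w_in * x + w_rec * h)
        out.append(h)
    return np.array(out)

# Interleave: conv -> recurrent -> conv -> recurrent (toy scalar version
# of the layerwise-interweaving idea).
seq = np.sin(np.linspace(0, 3, 12))
k = np.array([0.25, 0.5, 0.25])   # smoothing kernel (illustrative)
feat = recurrent_pass(conv1d(recurrent_pass(conv1d(seq, k)), k))
```

The convolutions extract local patterns at each depth while the recurrent passes carry sequential context, which is the division of labor the abstract describes.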

Journal: CoRR 2018
Pantelis R. Vlachas Wonmin Byeon Zhong Y. Wan Themistoklis P. Sapsis Petros Koumoutsakos

We introduce a data-driven forecasting method for high-dimensional, chaotic systems using Long Short-Term Memory (LSTM) recurrent neural networks. The proposed LSTM neural networks perform inference of high-dimensional dynamical systems in their reduced-order space and are shown to be an effective set of non-linear approximators of their attractor. We demonstrate the forecasting performance of ...
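Forecasting a dynamical system with a one-step model is typically done by iterating: predict one step, feed the prediction back as input, and repeat. A minimal sketch of that rollout loop, with the logistic map standing in for a trained LSTM (an assumption for illustration only):

```python
import numpy as np

def rollout(step_model, history, n_steps):
    """Iterative forecasting: predict one step, append the prediction,
    and feed it back as input for the next step."""
    window = list(history)
    preds = []
    for _ in range(n_steps):
        y = step_model(np.array(window))
        preds.append(y)
        window = window[1:] + [y]   # slide the window over predictions
    return np.array(preds)

# Stand-in one-step model: the chaotic logistic map applied to the last
# value (a real setup would call a trained LSTM here).
step = lambda w: 3.7 * w[-1] * (1.0 - w[-1])
hist = [0.4, 0.5, 0.6]
forecast = rollout(step, hist, 10)
```

Because each prediction becomes the next input, small one-step errors compound — which is exactly why forecasting chaotic systems is hard and why the quality of the one-step approximator matters.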

2017
Feng Qian Lei Sha Baobao Chang Lu-chen Liu Ming Zhang

In the Semantic Role Labeling (SRL) task, the tree-structured dependency relation is rich in syntactic information, but it is not well handled by existing models. In this paper, we propose Syntax-Aware Long Short-Term Memory (SA-LSTM). The structure of SA-LSTM changes according to the dependency structure of each sentence, so that SA-LSTM can model the whole tree structure of the dependency relation in an arc...

2017
Yujun Lin Song Han William J. Dally

Long Short-Term Memory (LSTM) is widely used to solve sequence modeling problems, for example image captioning. We found that LSTM cells are heavily redundant. We adopt network pruning to reduce the redundancy of LSTM and introduce sparsity as a new regularization to reduce overfitting. We can achieve better performance than the dense baseline while reducing the total number of parameters in LSTM...
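Network pruning as described above usually means magnitude pruning: zeroing the smallest-magnitude weights of each matrix up to a target sparsity. A minimal NumPy sketch of that operation (the global-threshold scheme here is an assumption; the paper's exact pruning schedule may differ):

```python
import numpy as np

def magnitude_prune(W, sparsity):
    """Zero out the smallest-magnitude fraction of weights — the kind
    of pruning used to remove redundancy from LSTM weight matrices."""
    k = int(W.size * sparsity)
    if k == 0:
        return W.copy()
    thresh = np.sort(np.abs(W), axis=None)[k - 1]  # k-th smallest magnitude
    return np.where(np.abs(W) <= thresh, 0.0, W)

rng = np.random.default_rng(3)
W = rng.standard_normal((8, 8))         # stand-in for an LSTM gate matrix
W_pruned = magnitude_prune(W, 0.75)     # keep only the largest 25%
achieved = float((W_pruned == 0).mean())
```

The zeroed entries can then be held at zero during fine-tuning, which is where the regularization effect the abstract mentions comes from.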

Journal: Neural Networks : the official journal of the International Neural Network Society 2012
Derek Monner James A. Reggia

Long short-term memory (LSTM) is a second-order recurrent neural network architecture that excels at storing sequential short-term memories and retrieving them many time steps later. LSTM's original training algorithm provides the important properties of spatial and temporal locality, which are missing from other training approaches, at the cost of limiting its applicability to a small set ...

Chart: number of search results per year