Search results for: recurrent neural network rnn

Number of results: 942872

2012
Wu Yilei, Yang Xulei

In the past decades, the Recurrent Neural Network (RNN) has attracted extensive research interest in various disciplines. One important motivation for these investigations is the RNN's promising ability to model the time behavior of nonlinear dynamic systems. It has been theoretically proved that an RNN is able to map arbitrary input sequences to output sequences with infinite accuracy regardless under...
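
The recurrence being described is not shown in this snippet; as background, a minimal vanilla (Elman) RNN forward pass over an input sequence might look like the following sketch, where all names and dimensions are illustrative:

```python
import numpy as np

def rnn_forward(xs, W_x, W_h, b, h0):
    """Run a vanilla (Elman) RNN over an input sequence.

    xs  : (T, input_dim) array, one row per time step
    W_x : (hidden_dim, input_dim) input-to-hidden weights
    W_h : (hidden_dim, hidden_dim) hidden-to-hidden weights
    b   : (hidden_dim,) bias
    h0  : (hidden_dim,) initial hidden state
    Returns the sequence of hidden states, shape (T, hidden_dim).
    """
    h = h0
    states = []
    for x_t in xs:
        # The hidden state depends on the current input and the previous state,
        # which is what lets the network model time-dependent behavior.
        h = np.tanh(W_x @ x_t + W_h @ h + b)
        states.append(h)
    return np.stack(states)

# Illustrative usage with random weights.
rng = np.random.default_rng(0)
T, input_dim, hidden_dim = 10, 3, 8
out = rnn_forward(rng.normal(size=(T, input_dim)),
                  0.1 * rng.normal(size=(hidden_dim, input_dim)),
                  0.1 * rng.normal(size=(hidden_dim, hidden_dim)),
                  np.zeros(hidden_dim),
                  np.zeros(hidden_dim))
print(out.shape)  # (10, 8)
```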

2006
Le Yang, Yanbo Xue

In this report, we developed a new recurrent neural network toolbox, including the recurrent multilayer perceptron structure and its accompanying extended Kalman filter-based training algorithms, BPTT-GEKF and BPTT-DEKF. In addition, we constructed programs for designing an echo state network with a single reservoir, together with the offline linear regression-based training algorithm. We name this t...
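
The toolbox code itself is not included here. As a rough illustration of the echo state network part (a single reservoir trained with an offline linear-regression readout), a minimal sketch could look like this; the class name, hyperparameters, and the ridge term are assumptions, not the toolbox's API:

```python
import numpy as np

class EchoStateNetwork:
    """Minimal single-reservoir ESN with an offline least-squares readout."""

    def __init__(self, n_inputs, n_reservoir, n_outputs,
                 spectral_radius=0.9, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))
        W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
        # Rescale so the reservoir (approximately) has the echo state property.
        W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
        self.W = W
        self.W_out = np.zeros((n_outputs, n_reservoir))

    def _collect_states(self, inputs):
        states = np.zeros((len(inputs), self.W.shape[0]))
        x = np.zeros(self.W.shape[0])
        for t, u in enumerate(inputs):
            x = np.tanh(self.W_in @ u + self.W @ x)
            states[t] = x
        return states

    def fit(self, inputs, targets, ridge=1e-6):
        # Offline linear regression: only the readout weights are trained.
        X = self._collect_states(inputs)
        self.W_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]),
                                     X.T @ targets).T
        return self

    def predict(self, inputs):
        return self._collect_states(inputs) @ self.W_out.T

# Illustrative usage: map a noisy sine wave back to its clean version.
t = np.linspace(0, 8 * np.pi, 400)
u = np.sin(t)[:, None] + 0.1 * np.random.default_rng(1).normal(size=(400, 1))
esn = EchoStateNetwork(1, 100, 1).fit(u, np.sin(t)[:, None])
print(esn.predict(u).shape)  # (400, 1)
```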

Journal: Human Brain Mapping, 2018
Junxing Shi, Haiguang Wen, Yizhen Zhang, Kuan Han, Zhongming Liu

The human visual cortex extracts both spatial and temporal visual features to support perception and guide behavior. Deep convolutional neural networks (CNNs) provide a computational framework for modeling cortical representation and organization in spatial visual processing, but they are unable to explain how the brain processes temporal information. To overcome this limitation, we extended a CNN by addin...
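
The paper's architecture is only hinted at in this snippet. A generic sketch of a CNN extended with a recurrent layer over video frames, assuming a frame-wise feature extractor followed by an LSTM (both illustrative, not the authors' model):

```python
import torch
import torch.nn as nn

class RecurrentCNN(nn.Module):
    """Frame-wise CNN features aggregated over time by an LSTM (illustrative)."""

    def __init__(self, n_classes, feat_dim=64, hidden_dim=128):
        super().__init__()
        self.cnn = nn.Sequential(                 # spatial feature extractor
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),
            nn.Linear(16 * 4 * 4, feat_dim), nn.ReLU(),
        )
        self.rnn = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, n_classes)

    def forward(self, video):                     # video: (batch, time, 3, H, W)
        b, t = video.shape[:2]
        feats = self.cnn(video.flatten(0, 1))     # (batch*time, feat_dim)
        feats = feats.view(b, t, -1)
        out, _ = self.rnn(feats)                  # temporal integration
        return self.head(out[:, -1])              # predict from the last state

# Illustrative usage on a random 8-frame clip.
model = RecurrentCNN(n_classes=10)
print(model(torch.randn(2, 8, 3, 32, 32)).shape)  # torch.Size([2, 10])
```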

2007
Mohammad Nurul Huda, Muhammad Ghulam, Junsei Horikawa, Tsuneo Nitta

Segmentation of speech into its corresponding phones has become a very important issue in many speech processing areas such as speech recognition, speech analysis, speech synthesis, and speech databases. In this paper, for accurate segmentation in speech recognition applications, we introduce Distinctive Phonetic Feature (DPF) based feature extraction using a two-stage NN (Neural Network) system c...
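
The description of the two-stage system is cut off here, so the following sketch only shows the general shape of a two-stage NN pipeline (acoustic features to DPF scores, DPF scores to phone posteriors for segmentation); every layer size, dimension, and name is hypothetical:

```python
import torch
import torch.nn as nn

class TwoStageDPFNet(nn.Module):
    """Generic two-stage pipeline: the first network maps acoustic features to
    distinctive-phonetic-feature (DPF) scores, the second maps DPF scores to
    phone logits used for segmentation. All sizes are hypothetical."""

    def __init__(self, acoustic_dim=39, n_dpf=15, n_phones=45, hidden=256):
        super().__init__()
        self.stage1 = nn.Sequential(
            nn.Linear(acoustic_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, n_dpf), nn.Sigmoid(),   # per-feature DPF scores
        )
        self.stage2 = nn.Sequential(
            nn.Linear(n_dpf, hidden), nn.ReLU(),
            nn.Linear(hidden, n_phones),              # phone logits per frame
        )

    def forward(self, frames):                        # frames: (batch, acoustic_dim)
        return self.stage2(self.stage1(frames))

# Illustrative usage on a batch of 16 acoustic frames.
print(TwoStageDPFNet()(torch.randn(16, 39)).shape)    # torch.Size([16, 45])
```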

Journal: CoRR, 2015
Chunting Zhou, Chonglin Sun, Zhiyuan Liu, Francis C. M. Lau

Neural network models have been demonstrated to achieve remarkable performance in sentence and document modeling. The convolutional neural network (CNN) and the recurrent neural network (RNN) are two mainstream architectures for such modeling tasks, and they adopt very different approaches to understanding natural language. In this work, we combine the strengths of both architectures and pro...
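
The combined architecture itself is truncated in this snippet. One common way to chain the two (convolutions for local n-gram features, an LSTM to compose them over the sentence) is sketched below; it illustrates the general idea, not the authors' exact model:

```python
import torch
import torch.nn as nn

class ConvRecurrentSentenceModel(nn.Module):
    """Sketch of a CNN-then-RNN sentence model: convolutions extract local
    n-gram features, an LSTM composes them over the sentence (illustrative)."""

    def __init__(self, vocab_size, n_classes, emb_dim=100,
                 n_filters=64, hidden_dim=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size=3, padding=1)
        self.rnn = nn.LSTM(n_filters, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, n_classes)

    def forward(self, tokens):                    # tokens: (batch, seq_len)
        x = self.emb(tokens)                      # (batch, seq_len, emb_dim)
        x = self.conv(x.transpose(1, 2))          # n-gram features per position
        x = torch.relu(x).transpose(1, 2)         # (batch, seq_len, n_filters)
        _, (h_n, _) = self.rnn(x)                 # sentence-level composition
        return self.head(h_n[-1])                 # class logits

# Illustrative usage on a batch of 4 sentences of 20 token ids.
model = ConvRecurrentSentenceModel(vocab_size=5000, n_classes=2)
print(model(torch.randint(0, 5000, (4, 20))).shape)  # torch.Size([4, 2])
```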

2005
Jürgen Schmidhuber, Daan Wierstra, Faustino J. Gomez

Current neural network learning algorithms are limited in their ability to model non-linear dynamical systems. Most supervised gradient-based recurrent neural networks (RNNs) suffer from a vanishing error signal that prevents learning from inputs far in the past. Those that do not still have problems when there are numerous local minima. We introduce a general framework for sequence learning, ...
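
As standard background for the vanishing-error problem mentioned here (not taken from the paper): in a vanilla RNN the error signal reaching a distant time step is scaled by a product of per-step Jacobians, whose norm decays exponentially when the recurrent weight matrix has spectral norm below one. A small numerical demonstration:

```python
import numpy as np

# For h_k = tanh(W_h h_{k-1} + W_x x_k + b), the Jacobian d h_T / d h_t is the
# product of the per-step Jacobians diag(tanh'(a_k)) @ W_h, so its norm shrinks
# roughly like ||W_h||^(T - t) when the spectral norm of W_h is below 1.
rng = np.random.default_rng(0)
hidden_dim, T = 32, 50
W_h = rng.normal(size=(hidden_dim, hidden_dim))
W_h *= 0.8 / np.linalg.norm(W_h, 2)         # force spectral norm 0.8 < 1
h = rng.normal(size=hidden_dim)
jacobian = np.eye(hidden_dim)
for _ in range(T):
    a = W_h @ h                              # inputs omitted for simplicity
    h = np.tanh(a)
    jacobian = np.diag(1.0 - h ** 2) @ W_h @ jacobian
print(np.linalg.norm(jacobian))              # ~1e-5 or smaller: the error signal has vanished
```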

2017
Mickael Rouvier

This paper describes the system developed at LIA for the SemEval-2017 evaluation campaign. The goal of Task 4.A was to identify sentiment polarity in tweets. The system is an ensemble of Deep Neural Network (DNN) models: a Convolutional Neural Network (CNN) and a Recurrent Neural Network with Long Short-Term Memory (RNN-LSTM). We initialize the input representation of the DNNs with different sets of embeddin...
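
The paper's exact fusion strategy is not given in this snippet; one common ensembling scheme for such DNN models is simple averaging of class probabilities, sketched below with hypothetical model objects:

```python
import torch

def ensemble_predict(models, tokens):
    """Average class probabilities from several sentence models (illustrative).

    `models` is any iterable of modules mapping token ids to class logits,
    e.g. a CNN and an LSTM model trained with different embedding sets.
    """
    with torch.no_grad():
        probs = [torch.softmax(m(tokens), dim=-1) for m in models]
    # Mean of probabilities, then the most likely class per example.
    return torch.stack(probs).mean(dim=0).argmax(dim=-1)

# Hypothetical usage: ensemble_predict([cnn_model, lstm_model], batch_of_token_ids)
```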

Journal: Neural Computation, 2017
Alexander Ororbia, Tomas Mikolov, David Reitter

Learning useful information across long time lags is a critical and difficult problem for temporal neural models in tasks such as language modeling. Existing architectures that address the issue are often complex and costly to train. The differential state framework (DSF) is a simple and high-performing design that unifies previously introduced gated neural models. DSF models maintain longer-te...
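
The DSF equations are not reproduced in this snippet. In their spirit, a heavily simplified state-mixing cell that interpolates between the previous state and a freshly proposed state is sketched below; it is an illustration of the gated-memory idea, not the paper's formulation:

```python
import torch
import torch.nn as nn

class StateMixingCell(nn.Module):
    """Simplified recurrent cell that interpolates between the previous state
    and a proposed new state. Only a sketch of the general gated/differential-
    state idea; the exact DSF equations differ."""

    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.propose = nn.Linear(input_dim + hidden_dim, hidden_dim)
        self.gate = nn.Linear(input_dim + hidden_dim, hidden_dim)

    def forward(self, x_t, h_prev):
        xh = torch.cat([x_t, h_prev], dim=-1)
        candidate = torch.tanh(self.propose(xh))   # proposed new state
        z = torch.sigmoid(self.gate(xh))           # how much to update
        # Retaining part of h_prev lets information persist over long time lags.
        return (1 - z) * h_prev + z * candidate

# Illustrative single step on a batch of 4 inputs.
cell = StateMixingCell(input_dim=16, hidden_dim=32)
print(cell(torch.randn(4, 16), torch.zeros(4, 32)).shape)  # torch.Size([4, 32])
```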

Journal: CoRR, 2017
Pankaj Gupta, Subburam Rajaram, Hinrich Schütze, Bernt Andrassy

Dynamic topic modeling facilitates the identification of topical trends over time in temporal collections of unstructured documents. We introduce a novel unsupervised neural dynamic topic model known as the Recurrent Neural Network-Replicated Softmax Model (RNN-RSM), where the topics discovered at each time step influence topic discovery in subsequent time steps. We account for the temporal order...
