Search results for: recurrent neural network rnn

Number of results: 942,872

Journal: CoRR 2014
Junhua Mao, Wei Xu, Yi Yang, Jiang Wang, Alan L. Yuille

In this paper, we present a multimodal Recurrent Neural Network (m-RNN) model for generating novel sentence descriptions to explain the content of images. It directly models the probability distribution of generating a word given previous words and the image. Image descriptions are generated by sampling from this distribution. The model consists of two sub-networks: a deep recurrent neural netw...
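
As a minimal sketch of the sampling step this abstract describes, the snippet below draws each word from the conditional distribution P(w_t | w_1..w_{t-1}, image); the model interface (a callable returning next-word logits) is a hypothetical stand-in, not the authors' m-RNN code.

import torch

def sample_description(model, image_feat, start_id, end_id, max_len=20):
    # Draw words one at a time from P(w_t | previous words, image).
    words = [start_id]
    for _ in range(max_len):
        logits = model(torch.tensor([words]), image_feat)  # (1, vocab_size)
        probs = torch.softmax(logits[0], dim=-1)
        next_id = torch.multinomial(probs, 1).item()       # sample, don't argmax
        if next_id == end_id:
            break
        words.append(next_id)
    return words[1:]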

Journal: CoRR 2017
Abdelhadi Azzouni, Guy Pujolle

This paper presents NeuTM, a framework for network Traffic Matrix (TM) prediction based on Long Short-Term Memory Recurrent Neural Networks (LSTM RNNs). TM prediction is defined as the problem of estimating the future network traffic matrix from previously observed network traffic data. It is widely used in network planning, resource management, and network security. Long Short-Term Memory (LS...
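
A minimal PyTorch sketch of the prediction setup described here, assuming each traffic matrix is flattened to an n*n vector and an LSTM maps a window of past matrices to the next one; it illustrates the idea, not the NeuTM implementation.

import torch
import torch.nn as nn

class TMPredictor(nn.Module):
    def __init__(self, n_nodes, hidden=128):
        super().__init__()
        dim = n_nodes * n_nodes            # flattened traffic matrix
        self.lstm = nn.LSTM(dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, dim)

    def forward(self, past_tms):           # (batch, window, n_nodes * n_nodes)
        h, _ = self.lstm(past_tms)
        return self.out(h[:, -1])          # estimate of the next traffic matrix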

Journal: CoRR 2014
Junhua Mao, Wei Xu, Yi Yang, Jiang Wang, Alan L. Yuille

In this paper, we present a multimodal Recurrent Neural Network (m-RNN) model for generating novel image captions. It directly models the probability distribution of generating a word given previous words and an image. Image captions are generated according to this distribution. The model consists of two sub-networks: a deep recurrent neural network for sentences and a deep convolutional networ...
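
A minimal sketch of the two-sub-network design, assuming a ResNet-18 backbone stands in for the deep convolutional network and that the two streams are fused additively in a multimodal layer; the sizes and fusion choice are illustrative, not the published m-RNN configuration.

import torch
import torch.nn as nn
from torchvision.models import resnet18

class MultimodalRNN(nn.Module):
    def __init__(self, vocab_size, embed=256, hidden=256, mm=512):
        super().__init__()
        cnn = resnet18(weights=None)
        self.cnn = nn.Sequential(*list(cnn.children())[:-1])  # drop the classifier head
        self.img_proj = nn.Linear(512, mm)
        self.embed = nn.Embedding(vocab_size, embed)
        self.rnn = nn.RNN(embed, hidden, batch_first=True)     # sentence sub-network
        self.txt_proj = nn.Linear(hidden, mm)
        self.out = nn.Linear(mm, vocab_size)

    def forward(self, words, images):
        img = self.img_proj(self.cnn(images).flatten(1))       # (B, mm)
        txt, _ = self.rnn(self.embed(words))                   # (B, T, hidden)
        fused = torch.tanh(self.txt_proj(txt) + img.unsqueeze(1))  # multimodal layer
        return self.out(fused)                                 # next-word logits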

2016
Zhizheng Wu, Oliver Watts, Simon King

We introduce the Merlin speech synthesis toolkit for neural network-based speech synthesis. The system takes linguistic features as input, and employs neural networks to predict acoustic features, which are then passed to a vocoder to produce the speech waveform. Various neural network architectures are implemented, including a standard feedforward neural network, mixture density neural network...
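
A minimal sketch of the acoustic-model stage described here: a feedforward network mapping per-frame linguistic features to acoustic features for a vocoder. The feature dimensions and depth are placeholder assumptions, not Merlin's shipped configuration.

import torch.nn as nn

def acoustic_model(ling_dim=420, ac_dim=187, width=1024):
    # Per-frame regression from linguistic features to vocoder parameters.
    return nn.Sequential(
        nn.Linear(ling_dim, width), nn.Tanh(),
        nn.Linear(width, width), nn.Tanh(),
        nn.Linear(width, ac_dim),
    )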

2017
Despoina Georgiadou, Vassilios Diakoloukas, Vassilios Tsiaras, Vassilios Digalakis

A prevalent and challenging task in spoken language understanding is slot filling. Currently, the best approaches in this domain are based on recurrent neural networks (RNNs). However, in their simplest form, RNNs cannot learn long-term dependencies in the data. In this paper, we propose the use of ClockWork recurrent neural network (CW-RNN) architectures in the slot-filling domain. CW-RNN is a...
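
A simplified single-layer ClockWork-RNN cell as a sketch of the idea: the hidden state is split into modules with exponentially increasing periods, and a module updates only at steps divisible by its period. The full CW-RNN additionally restricts recurrent connections so that faster modules read from slower ones; that connectivity mask is omitted here for brevity.

import torch
import torch.nn as nn

class ClockworkRNNCell(nn.Module):
    def __init__(self, in_dim, hidden, n_modules=4):
        super().__init__()
        assert hidden % n_modules == 0
        self.size = hidden // n_modules
        self.periods = [2 ** i for i in range(n_modules)]   # 1, 2, 4, 8
        self.w_in = nn.Linear(in_dim, hidden)
        self.w_h = nn.Linear(hidden, hidden, bias=False)

    def forward(self, x, h, t):
        h_new = torch.tanh(self.w_in(x) + self.w_h(h))
        out = h.clone()
        for i, period in enumerate(self.periods):
            if t % period == 0:                             # only scheduled modules update
                s = slice(i * self.size, (i + 1) * self.size)
                out[:, s] = h_new[:, s]
        return out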

2013
Jianshu Chen, Li Deng

We present an architecture of a recurrent neural network (RNN) with a fully-connected deep neural network (DNN) as its feature extractor. The RNN is equipped with both causal temporal prediction and non-causal look-ahead, via auto-regression (AR) and moving-average (MA), respectively. The focus of this paper is a primal-dual training method that formulates the learning of the RNN as a formal opt...
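
A minimal sketch of this architecture under stated assumptions: a small DNN extracts per-frame features, the RNN recurrence supplies the causal (AR) part, and concatenating a short window of future DNN features approximates the non-causal look-ahead (MA) part. The window length and layer sizes are illustrative.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DNNFeatureRNN(nn.Module):
    def __init__(self, in_dim, feat=128, hidden=128, lookahead=2):
        super().__init__()
        self.dnn = nn.Sequential(nn.Linear(in_dim, feat), nn.ReLU(),
                                 nn.Linear(feat, feat), nn.ReLU())
        self.lookahead = lookahead
        self.rnn = nn.RNN(feat * (lookahead + 1), hidden, batch_first=True)

    def forward(self, x):                                    # (B, T, in_dim)
        f = self.dnn(x)                                      # per-frame DNN features
        fp = F.pad(f, (0, 0, 0, self.lookahead))             # zero-pad future frames
        # Each step sees its own features plus `lookahead` future frames (MA part);
        # the recurrence over these windows supplies the causal AR part.
        win = torch.cat([fp[:, k:k + f.size(1)] for k in range(self.lookahead + 1)], dim=-1)
        h, _ = self.rnn(win)
        return h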

2004
Wei Sun, Yaonan Wang

A recurrent fuzzy neural network (RFNN) is constructed by using a recurrent neural network (RNN) to realize fuzzy inference. In this kind of RFNN, temporal relations are embedded in the network by adding feedback connections on the first layer of the network. An RFNN-based adaptive control (RFNNBAC) scheme is also proposed, in which two RFNNs are used to identify and control the plant, respec...
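
A speculative minimal sketch of a first-layer recurrent fuzzy unit of the kind described: each Gaussian membership unit mixes a weighted copy of its previous firing strength back into its input. The Gaussian form and tensor shapes are assumptions for illustration only, not the paper's exact RFNN.

import torch
import torch.nn as nn

class RecurrentFuzzyLayer(nn.Module):
    def __init__(self, n_inputs, n_rules):
        super().__init__()
        self.center = nn.Parameter(torch.randn(n_rules, n_inputs))   # membership centers
        self.width = nn.Parameter(torch.ones(n_rules, n_inputs))     # membership widths
        self.feedback = nn.Parameter(torch.zeros(n_rules))           # recurrent gains

    def forward(self, x, prev_mu):
        # Fold the previous firing strengths back into the layer input (feedback).
        z = x.unsqueeze(1) + self.feedback.unsqueeze(-1) * prev_mu.unsqueeze(-1)
        mu = torch.exp(-((z - self.center) ** 2) / (self.width ** 2)).prod(dim=-1)
        return mu                                                    # rule firing strengths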

Journal: CoRR 2016
Shijian Tang, Song Han

Generating natural language descriptions for images is a challenging task. The traditional way is to use a convolutional neural network (CNN) to extract image features, followed by a recurrent neural network (RNN) to generate sentences. In this paper, we present a new model that adds memory cells to gate the feeding of image features to the deep neural network. The intuition is to enable our mo...
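
A minimal sketch of the gating intuition: a learned sigmoid gate, computed from the decoder's hidden state, scales the image feature before it enters each step. This illustrates the stated idea; it is not the paper's exact memory-cell design.

import torch
import torch.nn as nn

class GatedImageStep(nn.Module):
    def __init__(self, embed, img_dim, hidden):
        super().__init__()
        self.gate = nn.Linear(hidden, img_dim)          # gate computed from hidden state
        self.cell = nn.LSTMCell(embed + img_dim, hidden)

    def forward(self, word_emb, img_feat, state):
        h, c = state
        g = torch.sigmoid(self.gate(h))                 # how much image to let through
        inp = torch.cat([word_emb, g * img_feat], dim=-1)
        return self.cell(inp, (h, c))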

2017
Marc Tanti, Albert Gatt, Kenneth P. Camilleri

In neural image captioning systems, a recurrent neural network (RNN) is typically viewed as the primary ‘generation’ component. This view suggests that the image features should be ‘injected’ into the RNN. This is in fact the dominant view in the literature. Alternatively, the RNN can instead be viewed as only encoding the previously generated words. This view suggests that the RNN should only ...
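
A minimal sketch of the alternative ("merge") view: the RNN encodes only the word prefix, and the image representation is combined with the RNN state only in the output layer. Sizes are illustrative assumptions.

import torch
import torch.nn as nn

class MergeCaptioner(nn.Module):
    def __init__(self, vocab_size, embed=256, hidden=256, img_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed)
        self.rnn = nn.LSTM(embed, hidden, batch_first=True)   # sees words only
        self.out = nn.Linear(hidden + img_dim, vocab_size)    # image merged after the RNN

    def forward(self, words, img_feat):                       # img_feat: (B, img_dim)
        h, _ = self.rnn(self.embed(words))                    # (B, T, hidden)
        img = img_feat.unsqueeze(1).expand(-1, h.size(1), -1)
        return self.out(torch.cat([h, img], dim=-1))          # next-word logits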

Journal: CoRR 2017
Danhao Zhu, Si Shen, Xin-Yu Dai, Jiajun Chen

Recurrent Neural Networks (RNNs) have been widely applied to sequence modeling. In an RNN, the hidden states at the current step are fully connected to those at the previous step, so the influence of less-related features from the previous step may decrease the model's learning ability. We propose a simple technique called parallel cells (PCs) to enhance the learning ability of the Recurrent Neural Network...
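
A minimal sketch of the parallel-cells idea as described: the hidden state is split across several small independent cells rather than one fully connected cell, so features in one cell cannot interfere with those in another. The cell count and sizes are assumptions, not the paper's settings.

import torch
import torch.nn as nn

class ParallelCells(nn.Module):
    def __init__(self, in_dim, hidden, n_cells=4):
        super().__init__()
        assert hidden % n_cells == 0
        self.cells = nn.ModuleList(
            nn.LSTM(in_dim, hidden // n_cells, batch_first=True)
            for _ in range(n_cells))

    def forward(self, x):                           # (B, T, in_dim)
        outs = [cell(x)[0] for cell in self.cells]  # each small cell runs independently
        return torch.cat(outs, dim=-1)              # concatenated hidden states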

[Chart: number of search results per year]