Search results for: recurrent neural network rnn

Number of results: 942,872

2017
Jakob N. Foerster Justin Gilmer Jascha Sohl-Dickstein Jan Chorowski David Sussillo

There exist many problem domains where the interpretability of neural network models is essential for deployment. Here we introduce a recurrent architecture composed of input-switched affine transformations – in other words an RNN without any explicit nonlinearities, but with input-dependent recurrent weights. This simple form allows the RNN to be analyzed via straightforward linear methods: we ...
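
The update rule described above can be sketched in a few lines; this is a minimal illustration of the input-switched affine idea, with toy sizes and random weights that are assumptions for the sketch, not the paper's setup.

```python
import numpy as np

# Input-switched affine RNN sketch: one recurrent matrix and bias per input
# symbol, and no explicit nonlinearity. Vocabulary and hidden sizes are toy.
rng = np.random.default_rng(0)
vocab, hidden = 4, 8
W = rng.normal(scale=0.1, size=(vocab, hidden, hidden))  # recurrent weights, switched by input
b = rng.normal(scale=0.1, size=(vocab, hidden))          # bias, switched by input

def isan_step(h, x):
    # The state update is purely affine; the input symbol x selects the transform.
    return W[x] @ h + b[x]

h = np.zeros(hidden)
for x in [1, 3, 0, 2]:  # a toy sequence of input-symbol ids
    h = isan_step(h, x)
```

Because each step is affine, a trajectory over a fixed input sequence composes into a single affine map, which is what makes linear analysis tractable.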

2017
Asier Mujika Florian Meier Angelika Steger

Processing sequential data of variable length is a major challenge in a wide range of applications, such as speech recognition, language modeling, generative image modeling and machine translation. Here, we address this challenge by proposing a novel recurrent neural network (RNN) architecture, the Fast-Slow RNN (FS-RNN). The FS-RNN incorporates the strengths of both multiscale RNNs and deep tr...

S. Samavi, V. Tahani and P. Khadivi

Routing is one of the basic parts of a message passing multiprocessor system. The routing procedure has a great impact on the efficiency of a system. Neural algorithms that are currently in use for computer networks require a large number of neurons. If a specific topology of a multiprocessor network is considered, the number of neurons can be reduced. In this paper a new recurrent neural ne...

2016
Jianhui Chen Wenqiang Dong Minchen Li

In this project, we systematically analyze a deep neural network based image caption generation method. With an image as the input, the method can output an English sentence describing the content in the image. We analyze three components of the method: convolutional neural network (CNN), recurrent neural network (RNN) and sentence generation. By replacing the CNN part with three state-of-the-...

Journal: CoRR, 2015
Shiliang Zhang Cong Liu Hui Jiang Si Wei Li-Rong Dai Yu Hu

In this paper, we propose a novel neural network structure, namely feedforward sequential memory networks (FSMN), to model long-term dependency in time series without using recurrent feedback. The proposed FSMN is a standard fully-connected feedforward neural network equipped with some learnable memory blocks in its hidden layers. The memory blocks use a tapped-delay line structure to encode th...
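
The tapped-delay memory block described above can be illustrated as a learnable weighted sum over recent hidden states; the coefficients and sizes below are illustrative assumptions, not the FSMN paper's configuration.

```python
import numpy as np

# FSMN-style memory block sketch: context is encoded by a tapped-delay line
# (a weighted sum over the last few hidden states) rather than recurrent feedback.
rng = np.random.default_rng(0)
hidden, taps = 6, 3
a = rng.normal(scale=0.1, size=(taps + 1, hidden))  # per-tap, per-unit coefficients (learnable in practice)

def memory_block(h_history):
    # h_history: list of hidden-state vectors, most recent last.
    recent = h_history[-(taps + 1):]
    out = np.zeros(hidden)
    for i, h in enumerate(reversed(recent)):  # i = 0 is the current frame
        out += a[i] * h
    return out

hs = [rng.normal(size=hidden) for _ in range(5)]
m = memory_block(hs)
```

Since the memory is a finite feedforward sum, the whole network stays feedforward and can be trained without backpropagation through time.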

2017
Shiyu Chang Yang Zhang Wei Han Mo Yu Xiaoxiao Guo Wei Tan Xiaodong Cui Michael J. Witbrock Mark A. Hasegawa-Johnson Thomas S. Huang

Notoriously, learning with recurrent neural networks (RNNs) on long sequences is a difficult task. There are three major challenges: 1) extracting complex dependencies, 2) vanishing and exploding gradients, and 3) efficient parallelization. In this paper, we introduce a simple yet effective RNN connection structure, the DILATEDRNN, which simultaneously tackles all these challenges. The proposed...
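
The dilated connection structure can be sketched as a recurrent update that reads from the state d steps back instead of the previous step, which shortens gradient paths on long sequences. The weights, sizes, and single-layer setup below are toy assumptions for illustration.

```python
import numpy as np

# Dilated recurrent skip sketch: a layer with dilation d updates from h_{t-d}
# rather than h_{t-1}. A full DilatedRNN stacks layers with growing dilations.
rng = np.random.default_rng(0)
hidden, T, dilation = 5, 12, 4
Wx = rng.normal(scale=0.3, size=(hidden, hidden))
Wh = rng.normal(scale=0.3, size=(hidden, hidden))

xs = rng.normal(size=(T, hidden))
hs = np.zeros((T, hidden))
for t in range(T):
    h_prev = hs[t - dilation] if t >= dilation else np.zeros(hidden)  # dilated skip connection
    hs[t] = np.tanh(Wx @ xs[t] + Wh @ h_prev)
```

A side effect of the skip is parallelism: states whose indices differ by less than the dilation have no dependency on each other and can be computed concurrently.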

Journal: CoRR, 2017
Shih-Chieh Su

This work studies the entity-wise topical behavior from massive network logs. Both the temporal and the spatial relationships of the behavior are explored with learning architectures combining the recurrent neural network (RNN) and the convolutional neural network (CNN). To make the behavioral data appropriate for the spatial learning in CNN, several reduction steps are taken to form the topi...

Journal: Computation (Basel), 2023

Rapid industrialization and population growth cause severe water pollution and increased water demand. The use of FeCu nanoparticles (nanoFeCu) in treating sewage has been proven to be a space-efficient method. The objective of this work is to develop a recurrent neural network (RNN) model to estimate the performance of immobilized nanoFeCu in sewage treatment, thereby easing the monitoring and forecasting of water quality. In this work, data was collec...

[Chart: number of search results per year; clicking the chart filters results by publication year]