Search results for: recurrent neural network rnn

Number of results: 942,872

Journal: CoRR 2015
Shiliang Zhang, Hui Jiang, Mingbin Xu, Junfeng Hou, Li-Rong Dai

In this paper, we propose a new fixed-size ordinally-forgetting encoding (FOFE) method, which can almost uniquely encode any variable-length sequence of words into a fixed-size representation. FOFE can model the word order in a sequence using a simple ordinally-forgetting mechanism according to the positions of words. In this work, we have applied FOFE to feedforward neural network language mo...
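
As a minimal illustration of the FOFE mechanism described above, the sketch below encodes a toy word-id sequence with the recursion z_t = α·z_{t-1} + e_t, where e_t is the one-hot vector of the t-th word; the vocabulary size and the forgetting factor α are illustrative assumptions, not values from the paper.

```python
# A minimal sketch of the FOFE idea, assuming a toy vocabulary and a
# hand-picked forgetting factor alpha; names are illustrative, not
# from the paper's code.
import numpy as np

def fofe_encode(word_ids, vocab_size, alpha=0.7):
    """Encode a variable-length sequence of word ids into one
    fixed-size vector: z_t = alpha * z_{t-1} + one_hot(w_t)."""
    z = np.zeros(vocab_size)
    for w in word_ids:
        z = alpha * z           # exponentially forget older words
        z[w] += 1.0             # add the one-hot vector of the current word
    return z

# Sequences "A B C" and "C B A" get distinct codes because each word's
# position is reflected in the powers of alpha.
print(fofe_encode([0, 1, 2], vocab_size=3))  # [0.49 0.7  1.  ]
print(fofe_encode([2, 1, 0], vocab_size=3))  # [1.   0.7  0.49]
```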

2016
Youssef Oualil, Clayton Greenberg, Mittul Singh, Dietrich Klakow

Feedforward Neural Network (FNN)-based language models estimate the probability of the next word based on the history of the last N words, whereas Recurrent Neural Networks (RNN) perform the same task based only on the last word and some context information that cycles in the network. This paper presents a novel approach, which bridges the gap between these two categories of networks. In partic...
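
To make the contrast concrete, here is a minimal numpy sketch of the two conditioning schemes: an FNN language model scoring the next word from the concatenated embeddings of the last N words, and an RNN language model scoring it from the last word plus a recurrent state. All sizes and the untrained random weights are illustrative assumptions; this is not the paper's bridging model.

```python
# A minimal sketch contrasting FNN-based and RNN-based language models,
# under assumed toy dimensions and untrained random weights.
import numpy as np

rng = np.random.default_rng(0)
V, D, H, N = 50, 8, 16, 3            # vocab, embedding, hidden, history sizes
E = rng.normal(size=(V, D))          # shared word embeddings

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# FNN LM: next-word distribution from the concatenated last N embeddings.
W_f, U_f = rng.normal(size=(N * D, H)), rng.normal(size=(H, V))
def fnn_lm(history):
    h = np.tanh(np.concatenate([E[w] for w in history[-N:]]) @ W_f)
    return softmax(h @ U_f)

# RNN LM: next-word distribution from the last word plus a recurrent state.
W_x, W_h, U_r = rng.normal(size=(D, H)), rng.normal(size=(H, H)), rng.normal(size=(H, V))
def rnn_lm(word, state):
    state = np.tanh(E[word] @ W_x + state @ W_h)   # context cycles in the network
    return softmax(state @ U_r), state

p_fnn = fnn_lm([4, 7, 9])
p_rnn, s = rnn_lm(9, np.zeros(H))
print(p_fnn.shape, p_rnn.shape)      # both are distributions over V words
```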

Journal: CoRR 2014
Hasim Sak, Andrew W. Senior, Françoise Beaufays

Long Short-Term Memory (LSTM) is a recurrent neural network (RNN) architecture that has been designed to address the vanishing and exploding gradient problems of conventional RNNs. Unlike feedforward neural networks, RNNs have cyclic connections, making them powerful for modeling sequences. They have been successfully used for sequence labeling and sequence prediction tasks, such as handwriting ...
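
For reference, a minimal sketch of a single LSTM step with the standard input, forget, and output gates; shapes, initialization, and the toy sequence are illustrative assumptions. The additive cell update is the part that mitigates the vanishing-gradient problem mentioned above.

```python
# A minimal sketch of one LSTM step, under assumed weight shapes and a
# toy random input sequence.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, b):
    """One LSTM step. W maps [x; h] to the four gate pre-activations;
    the additive cell update c = f*c + i*g is what keeps gradients
    from vanishing the way they do in a plain tanh RNN."""
    z = W @ np.concatenate([x, h]) + b
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c = f * c + i * g               # gated, mostly-linear cell path
    h = o * np.tanh(c)              # exposed hidden state
    return h, c

D, H = 4, 8
rng = np.random.default_rng(1)
W, b = rng.normal(size=(4 * H, D + H)) * 0.1, np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, D)):   # run a short 5-step sequence
    h, c = lstm_step(x, h, c, W, b)
print(h.shape, c.shape)
```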

Journal: CoRR 2017
Tong Zhang, Wenming Zheng, Zhen Cui, Yuan Zong, Yang Li

Emotion analysis is a crucial problem for endowing artificial machines with real intelligence in many potential applications. As external manifestations of human emotions, electroencephalogram (EEG) signals and video face signals are widely used to track and analyze humans’ affective information. Given their common spatial-temporal volume structure, in this paper we propose a nove...

1999
Ekrem Varoglu, Kadri Hacioglu

A nonlinear predictive model of speech, based on the method of time-delay reconstruction, is presented and approximated using a fully connected recurrent neural network (RNN) followed by a linear combiner. This novel combination of well-established approaches for speech analysis and synthesis is compared to traditional techniques within a unified framework to illustrate the advantages of us...
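
A minimal sketch of the time-delay reconstruction idea: embed a scalar signal into delay vectors and pass them through a small (untrained) fully connected recurrent part followed by a linear combiner. The embedding dimension, delay, network sizes, and the toy sinusoid standing in for speech are all illustrative assumptions.

```python
# A minimal sketch of delay embedding plus an RNN-with-linear-combiner
# predictor, under assumed toy dimensions and untrained weights.
import numpy as np

def delay_embed(x, m=3, tau=2):
    """Build delay vectors [x_t, x_{t-tau}, ..., x_{t-(m-1)tau}] and
    the next-sample targets they should predict."""
    start = (m - 1) * tau
    X = np.stack([[x[t - k * tau] for k in range(m)]
                  for t in range(start, len(x) - 1)])
    return X, x[start + 1:]

rng = np.random.default_rng(2)
x = np.sin(0.3 * np.arange(200)) + 0.05 * rng.normal(size=200)  # toy "speech" signal
X, y = delay_embed(x)

H = 8
W_in, W_rec = rng.normal(size=(3, H)) * 0.3, rng.normal(size=(H, H)) * 0.3
w_out = rng.normal(size=H) * 0.3     # the linear combiner on top of the RNN

h = np.zeros(H)
preds = []
for v in X:                          # fully connected recurrent part
    h = np.tanh(v @ W_in + h @ W_rec)
    preds.append(h @ w_out)          # linear readout gives the prediction
print(len(preds), len(y))            # one prediction per target sample
```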

Journal: CoRR 2017
Shuai Li, Wanqing Li, Chris Cook, Ce Zhu, Yanbo Gao

Pooling is an important component in convolutional neural networks (CNNs) for aggregating features and reducing computational burden. Compared with other components, such as convolutional layers and fully connected layers, which are learned entirely from data, the pooling component still relies on handcrafted functions such as max pooling and average pooling. This paper proposes a learnable pooling function usi...
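
To ground the distinction, the sketch below contrasts handcrafted max and average pooling with one generic learnable alternative, a pool whose per-position weights are trainable parameters; this illustrates the idea of learnable pooling only and is not the paper's specific function.

```python
# A minimal sketch: handcrafted pooling versus a generic learnable
# softmax-weighted pool; the weighting scheme is an assumption for
# illustration, not the paper's method.
import numpy as np

def pool_windows(x, size=2):
    """Split a 1-D feature map into non-overlapping windows."""
    return x[: len(x) // size * size].reshape(-1, size)

def max_pool(x, size=2):
    return pool_windows(x, size).max(axis=1)        # handcrafted

def avg_pool(x, size=2):
    return pool_windows(x, size).mean(axis=1)       # handcrafted

def learnable_pool(x, w, size=2):
    """Pool with per-position weights w, normalized by softmax; w is a
    parameter that could be learned from data by backprop."""
    a = np.exp(w) / np.exp(w).sum()
    return pool_windows(x, size) @ a

x = np.array([1.0, 3.0, 2.0, 0.0, 5.0, 4.0])
w = np.zeros(2)                    # uniform weights reproduce average pooling
print(max_pool(x), avg_pool(x), learnable_pool(x, w))
```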

1998
Peter Tiño, Georg Dorffner

We suggest a recurrent neural network (RNN) model with a recurrent part corresponding to iterative function systems (IFS) introduced by Barnsley [1] as a fractal image compression mechanism. The key idea is that 1) in our model we avoid learning the RNN state part by having non-trainable connections between the context and recurrent layers (this makes the training process less problematic and f...
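
A minimal sketch of the non-trainable recurrent idea: drive the state with fixed contractive affine maps, one per input symbol, in the spirit of an IFS, so that only a readout on top would need training. The contraction factor and per-symbol offsets are illustrative assumptions, not the paper's construction.

```python
# A minimal sketch of a non-trainable, IFS-like recurrent state part;
# the specific maps are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(3)
n_symbols, H = 4, 2
k = 0.5                                        # contraction factor, 0 < k < 1
B = rng.uniform(-1, 1, size=(n_symbols, H))    # per-symbol offsets, never trained

def ifs_state(sequence):
    """Fold a symbol sequence into a state with fixed contractive maps
    s -> k*s + B[sym]; different histories land at different points of
    a fractal-like set, so a trainable readout can separate them."""
    s = np.zeros(H)
    for sym in sequence:
        s = k * s + B[sym]        # non-trainable recurrent dynamics
    return s

print(ifs_state([0, 1, 2]))
print(ifs_state([2, 1, 0]))       # a different sequence, a different state
```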

2015
Rafal Józefowicz, Wojciech Zaremba, Ilya Sutskever

The Recurrent Neural Network (RNN) is an extremely powerful sequence model that is often difficult to train. The Long Short-Term Memory (LSTM) is a specific RNN architecture whose design makes it much easier to train. While wildly successful in practice, the LSTM’s architecture appears to be ad hoc, so it is not clear whether it is optimal, and the significance of its individual components is unclear...

Journal: CoRR 2018
Andros Tjandra, Sakriani Sakti, Satoshi Nakamura

In machine learning, the Recurrent Neural Network (RNN) has become a popular algorithm for sequential data modeling. However, behind the impressive performance, RNNs require a large number of parameters for both training and inference. In this paper, we aim to reduce the number of parameters while simultaneously maintaining the expressive power of the RNN. We utilize several tensor decom...
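
As a simple stand-in for the tensor-decomposition idea, the sketch below factorizes an RNN input-to-hidden matrix into two low-rank factors and counts the parameter savings; the sizes, the rank, and the plain low-rank form (rather than the paper's specific decompositions) are illustrative assumptions.

```python
# A minimal sketch of parameter reduction via factorized RNN weights,
# using plain low-rank factorization as a stand-in for the paper's
# tensor decompositions; sizes and rank are assumptions.
import numpy as np

D, H, r = 256, 512, 16
rng = np.random.default_rng(4)

# Full input-to-hidden matrix: D*H parameters.
W_full = rng.normal(size=(D, H))

# Factored version W ~ A @ B: r*(D + H) parameters, far fewer when r is small.
A, B = rng.normal(size=(D, r)), rng.normal(size=(r, H))

def rnn_step_factored(x, h, W_h):
    """One recurrent step using the factored input projection."""
    return np.tanh((x @ A) @ B + h @ W_h)

W_h = rng.normal(size=(H, H)) * 0.1
h = rnn_step_factored(rng.normal(size=D), np.zeros(H), W_h)
print(W_full.size, A.size + B.size)   # 131072 vs 12288 parameters
print(h.shape)
```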
