Search results for: recurrent input

Number of results: 345825

2007
Jarosław DRAPAŁA Jerzy ŚWIĄTEK

In this paper, the problem of modeling and identification of complex input-output systems using recurrent neural networks is discussed. In such a system, we can distinguish subprocesses (elementary processes), each with its own inputs and outputs, that can operate separately. Connecting the inputs and outputs of these elements yields the complex system. Each element of the complex system is modeled by...
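A minimal sketch of the kind of structure this abstract describes: two hypothetical elementary processes, each modeled by a small recurrent network, connected in series so that the output of one drives the input of the other. The sizes, random weights, and the series topology are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

class ElementRNN:
    """A small recurrent model standing in for one elementary process of the complex system."""
    def __init__(self, n_in, n_hid, n_out):
        self.Wx = rng.normal(0, 0.3, (n_hid, n_in))
        self.Wh = rng.normal(0, 0.3, (n_hid, n_hid))
        self.Wo = rng.normal(0, 0.3, (n_out, n_hid))
        self.h = np.zeros(n_hid)

    def step(self, u):
        self.h = np.tanh(self.Wx @ u + self.Wh @ self.h)
        return self.Wo @ self.h

# Two elements connected in series: the output of element A drives element B,
# so identifying the complex system amounts to identifying both element models.
elem_a = ElementRNN(n_in=1, n_hid=4, n_out=1)
elem_b = ElementRNN(n_in=1, n_hid=4, n_out=1)

external_input = np.sin(0.1 * np.arange(50))
outputs = []
for u in external_input:
    y_a = elem_a.step(np.array([u]))
    y_b = elem_b.step(y_a)          # the interconnection between the two subprocesses
    outputs.append(y_b.item())
```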

2012
William Yang Wang

The task of part-of-speech (POS) language modeling typically involves a very small vocabulary, which differs significantly from traditional lexicalized language modeling tasks. In this project, we propose a high-order n-gram model and a state-of-the-art recurrent neural network model, aimed at minimizing the variance in this POS language modeling task. In our experiments, we show that the r...
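A toy illustration of the n-gram side of such a setup: a trigram language model over a small, hypothetical POS-tag corpus with add-alpha smoothing. The tag set, smoothing constant, and vocabulary size below are illustrative assumptions:

```python
from collections import Counter, defaultdict

# Hypothetical toy POS-tag corpus; the tag vocabulary is tiny compared to word-level LMs.
tagged = [["DET", "NOUN", "VERB", "DET", "ADJ", "NOUN"],
          ["PRON", "VERB", "DET", "NOUN"]]

n = 3  # trigram model over POS tags
counts, context_counts = defaultdict(Counter), Counter()
for sent in tagged:
    padded = ["<s>"] * (n - 1) + sent + ["</s>"]
    for i in range(n - 1, len(padded)):
        ctx = tuple(padded[i - n + 1:i])
        counts[ctx][padded[i]] += 1
        context_counts[ctx] += 1

def prob(tag, ctx, alpha=1.0, vocab_size=20):
    # Add-alpha smoothing keeps unseen tag/context pairs from getting zero probability.
    return (counts[ctx][tag] + alpha) / (context_counts[ctx] + alpha * vocab_size)

print(prob("NOUN", ("DET", "ADJ")))
```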

Journal: Neural Computation, 2012
Michiel Hermans Benjamin Schrauwen

Echo state networks (ESNs) are large, random recurrent neural networks with a single trained linear readout layer. Despite the untrained nature of the recurrent weights, they are capable of performing universal computations on temporal input data, which makes them interesting for both theoretical research and practical applications. The key to their success lies in the fact that the network com...
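A minimal echo state network sketch along these lines: a random, untrained reservoir with a leaky-integrator update, and only the linear readout fitted by ridge regression. The task, sizes, and hyperparameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res, T = 1, 200, 1000
leak, spectral_radius, ridge = 0.3, 0.9, 1e-6

# Untrained random weights: only the linear readout is ever fitted.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in + 1))             # +1 column for a bias input
W = rng.normal(0, 1, (n_res, n_res))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # scale toward the echo-state regime

# Toy temporal task: one-step-ahead prediction of a sine wave.
u = np.sin(0.2 * np.arange(T + 1))
inputs, targets = u[:-1], u[1:]

x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    pre = W_in @ np.array([1.0, inputs[t]]) + W @ x
    x = (1 - leak) * x + leak * np.tanh(pre)                  # leaky-integrator state update
    states[t] = x

# Ridge-regression readout on the collected reservoir states.
X = np.hstack([states, np.ones((T, 1))])
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ targets)
print("train MSE:", np.mean((X @ W_out - targets) ** 2))
```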

2015
Andrew M. Dai Quoc V. Le

We present two approaches that use unlabeled data to improve sequence learning with recurrent networks. The first approach is to predict what comes next in a sequence, which is a conventional language model in natural language processing. The second approach is to use a sequence autoencoder, which reads the input sequence into a vector and predicts the input sequence again. These two algorithms...
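A sketch of the second approach as described here: a sequence autoencoder that reads the input sequence into a single vector and then predicts the same sequence back. It is written with PyTorch GRUs; the layer sizes and the teacher-forced decoding are illustrative assumptions rather than details from the paper:

```python
import torch
import torch.nn as nn

class SeqAutoencoder(nn.Module):
    """Reads the input sequence into a vector, then tries to reproduce the sequence."""
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tokens):                    # tokens: (batch, seq_len)
        emb = self.embed(tokens)
        _, h = self.encoder(emb)                  # h: (1, batch, hid_dim) summary vector
        dec_out, _ = self.decoder(emb, h)         # teacher forcing: feed the same sequence back
        return self.out(dec_out)                  # logits for reconstructing each token

model = SeqAutoencoder(vocab_size=1000)
tokens = torch.randint(0, 1000, (8, 20))
logits = model(tokens)
loss = nn.functional.cross_entropy(logits.reshape(-1, 1000), tokens.reshape(-1))
loss.backward()   # after unsupervised pretraining, the encoder weights can initialize a supervised model
```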

2018
Brian DePasquale Christopher J. Cueva Kanaka Rajan G. Sean Escola L. F. Abbott

Trained recurrent networks are powerful tools for modeling dynamic neural computations. We present a target-based method for modifying the full connectivity matrix of a recurrent network to train it to perform tasks involving temporally complex input/output transformations. The method introduces a second network during training to provide suitable "target" dynamics useful for performing the tas...
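A heavily simplified sketch of the general idea: a second, driven network receives the desired output signal and produces "target" currents, and the task network's full connectivity matrix is then solved so that its recurrent input reproduces those currents. Batch least squares stands in here for the online recursive least squares of the original method, and the network sizes, dynamics, and single-pass fit are illustrative assumptions, not the authors' procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 100, 500
dt, tau, g = 0.1, 1.0, 1.5

J0 = g * rng.normal(0, 1 / np.sqrt(N), (N, N))     # random connectivity shared by both networks
u_fb = rng.uniform(-1, 1, (N, 1))                  # weights injecting the target signal into the driven net
f_out = np.sin(0.05 * np.arange(T))[:, None]       # desired temporally structured output

def run(J, drive):
    """Integrate rate dynamics  tau * dx/dt = -x + J @ r + drive(t)."""
    x, rates = np.zeros(N), np.zeros((T, N))
    for t in range(T):
        r = np.tanh(x)
        x = x + dt / tau * (-x + J @ r + drive[t])
        rates[t] = r
    return rates

# 1) Driven network: the target output is injected, producing target recurrent currents.
drive = (u_fb @ f_out.T).T
r_driven = run(J0, drive)
target_currents = r_driven @ J0.T + drive

# 2) Task network: solve for a full connectivity matrix J whose recurrent input
#    reproduces those target currents from the task network's own rates.
r_task = run(J0, np.zeros((T, N)))                  # initial pass with the untrained matrix
J = np.linalg.lstsq(r_task, target_currents, rcond=None)[0].T

# 3) Linear readout fitted on the retrained network's rates.
r_final = run(J, np.zeros((T, N)))
w_out = np.linalg.lstsq(r_final, f_out, rcond=None)[0]
```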

Journal: CoRR, 2017
Zhongliang Li Raymond Kulhanek Shaojun Wang Yunxin Zhao Shuang Wu

Recurrent neural language models are the state-of-the-art models for language modeling. When the vocabulary size is large, the space taken to store the model parameters becomes the bottleneck for the use of recurrent neural language models. In this paper, we introduce a simple space compression method that randomly shares the structured parameters at both the input and output embedding layers o...
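A small sketch of the general idea of randomly shared embedding parameters: each word's embedding is assembled from sub-vectors drawn from a much smaller shared pool, so storage grows with the pool rather than with the vocabulary. The block structure, pool size, and random assignment shown here are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, emb_dim = 50000, 256
n_blocks = 8                       # each embedding is built from this many sub-vectors
block_dim = emb_dim // n_blocks
pool_size = 2000                   # shared pool, far smaller than vocab_size * n_blocks

# Shared pool of sub-vectors plus a fixed random assignment of pool slots to each word.
pool = rng.normal(0, 0.1, (pool_size, block_dim))
assignment = rng.integers(0, pool_size, (vocab_size, n_blocks))

def embed(word_id):
    # A word's embedding is the concatenation of its randomly shared sub-vectors.
    return pool[assignment[word_id]].reshape(-1)

# (256,) per word, while storing only pool_size * block_dim = 64,000 parameters
# instead of vocab_size * emb_dim = 12.8 million.
print(embed(42).shape)
```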

1995
M W Mak

This paper demonstrates a speaker identification system based on recurrent neural networks trained with the Real-Time Recurrent Learning (RTRL) algorithm. A series of speaker identification experiments based on isolated digits has been conducted. The database contains four utterances of ten digits spoken by ten speakers over a period of nine months. The results suggest that recurrent networks c...
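A compact sketch of the Real-Time Recurrent Learning update itself, for a small fully recurrent network with a tanh nonlinearity. It shows only the forward step and the weight-sensitivity recursion, not the speaker-identification system; the sizes, toy target, and squared-error loss are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid = 3, 5            # acoustic features per frame, recurrent units (toy sizes)
n_z = n_in + n_hid + 1        # inputs + previous unit outputs + bias
W = rng.normal(0, 0.1, (n_hid, n_z))
lr = 0.01

def rtrl_step(x_t, y_prev, p_prev, target=None):
    """One RTRL step: forward pass plus the sensitivity recursion p[k,i,j] = dy_k/dW[i,j]."""
    z = np.concatenate([x_t, y_prev, [1.0]])
    y = np.tanh(W @ z)
    fprime = 1.0 - y ** 2
    rec = np.einsum('kl,lij->kij', W[:, n_in:n_in + n_hid], p_prev)
    kron = np.zeros((n_hid, n_hid, n_z))
    kron[np.arange(n_hid), np.arange(n_hid), :] = z        # delta_{ki} * z_j term
    p = fprime[:, None, None] * (rec + kron)
    grad = None
    if target is not None:
        e = y - target                                     # squared error on the unit outputs
        grad = np.einsum('k,kij->ij', e, p)
    return y, p, grad

# Online training over one toy "utterance": a gradient is available at every frame.
y, p = np.zeros(n_hid), np.zeros((n_hid, n_hid, n_z))
toy_target = np.zeros(n_hid)
for x_t in rng.normal(size=(20, n_in)):
    y, p, grad = rtrl_step(x_t, y, p, target=toy_target)
    W -= lr * grad
```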

1997
Timo Koskela Markus Varsta Jukka Heikkonen Kimmo Kaski

A newly proposed Recurrent Self-Organizing Map (RSOM) is studied in time series prediction. In this approach, the RSOM is used to cluster the data into local data sets, and local linear models corresponding to each of the map units are then estimated from these local data sets. A traditional way of clustering the data is to use a windowing technique to split it into input vectors of a certain length. In thi...
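A sketch of the windowing-plus-local-models pipeline described here, with plain k-means standing in for the RSOM clustering step and one least-squares linear model fitted per cluster. The toy series, window length, and number of clusters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy time series and windowing: each input vector holds the previous `window` values.
series = np.sin(0.1 * np.arange(1200)) + 0.05 * rng.normal(size=1200)
window = 8
X = np.array([series[t:t + window] for t in range(len(series) - window)])
y = series[window:]                          # one-step-ahead targets

# Cluster the windows into local data sets (plain k-means stands in for the RSOM here).
k, n_iter = 6, 30
centers = X[rng.choice(len(X), k, replace=False)]
for _ in range(n_iter):
    labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
    for c in range(k):
        if np.any(labels == c):
            centers[c] = X[labels == c].mean(axis=0)

# Fit one local linear model per cluster on its local data set.
models = {}
for c in range(k):
    mask = labels == c
    if mask.sum() > window:                  # skip degenerate clusters
        Xc = np.hstack([X[mask], np.ones((mask.sum(), 1))])
        models[c] = np.linalg.lstsq(Xc, y[mask], rcond=None)[0]

def predict(x_window):
    order = np.argsort(((centers - x_window) ** 2).sum(-1))
    c = next(c for c in order if c in models)   # nearest map unit with a fitted local model
    return np.append(x_window, 1.0) @ models[c]

print(predict(X[0]), y[0])
```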
