A generalized LSTM-like training algorithm for second-order recurrent neural networks


Similar articles

A generalized LSTM-like training algorithm for second-order recurrent neural networks

The long short-term memory (LSTM) is a second-order recurrent neural network architecture that excels at storing sequential short-term memories and retrieving them many time-steps later. LSTM's original training algorithm provides the important properties of spatial and temporal locality, which are missing from other training approaches, at the cost of limiting its applicability to a small set ...
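The abstract does not spell out why LSTM counts as a second-order architecture; as a rough illustration (layout and variable names are mine, not the paper's), here is a minimal NumPy sketch of one LSTM cell step, where the gates multiply other unit activations — a second-order (multiplicative) interaction.

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, b):
    """One forward step of a standard LSTM cell.

    W: weight matrix of shape (4*H, D+H); b: bias of shape (4*H,).
    The elementwise products below (gate * activation) are the
    second-order interactions that let the cell store and gate memory.
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = 1.0 / (1.0 + np.exp(-z[0 * H:1 * H]))   # input gate
    f = 1.0 / (1.0 + np.exp(-z[1 * H:2 * H]))   # forget gate
    o = 1.0 / (1.0 + np.exp(-z[2 * H:3 * H]))   # output gate
    g = np.tanh(z[3 * H:4 * H])                 # candidate cell update
    c = f * c_prev + i * g                      # gated (multiplicative) memory update
    h = o * np.tanh(c)                          # gated output
    return h, c
```

This is only a sketch of the standard cell, not the paper's generalized training algorithm.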


General Backpropagation Algorithm for Training Second-order Neural Networks

The artificial neural network is a popular framework in machine learning. To empower individual neurons, we recently suggested that the current type of neurons could be upgraded to second-order counterparts, in which the linear operation between inputs to a neuron and the associated weights is replaced with a nonlinear quadratic operation. A single second-order neuron already has a strong non...
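To make the "linear replaced with quadratic" idea concrete, here is a minimal sketch of such a neuron (the exact parameterization in the paper may differ; names are mine). A single quadratic neuron can compute XOR, which no single linear neuron can — one illustration of the extra power the abstract alludes to.

```python
import numpy as np

def second_order_neuron(x, W, w, b):
    """Quadratic neuron: the linear pre-activation w.x + b is replaced
    by the quadratic form x.W.x + w.x + b, followed by a sigmoid."""
    z = x @ W @ x + w @ x + b
    return 1.0 / (1.0 + np.exp(-z))

# Demo: XOR with one quadratic neuron. The cross term -8*x1*x2 cancels
# the linear part exactly when both inputs fire.
W_xor = np.array([[0.0, -4.0], [-4.0, 0.0]])
w_xor = np.array([4.0, 4.0])
xor = [second_order_neuron(np.array(p, dtype=float), W_xor, w_xor, -2.0)
       for p in [(0, 0), (0, 1), (1, 0), (1, 1)]]
```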


Training Second-Order Recurrent Neural Networks using Hints

We investigate a method for inserting rules into discrete-time second-order recurrent neural networks which are trained to recognize regular languages. The rules defining regular languages can be expressed in the form of transitions in the corresponding deterministic finite-state automaton. Inserting these rules as hints into networks with second-order connections is straightforward. Our simulati...
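The insertion is straightforward because in a second-order network each weight W[i, j, k] couples a state unit j with an input symbol k, which maps directly onto a DFA transition delta(q_j, a_k) = q_i. A hedged sketch of this encoding (constants and names are my own, not the paper's):

```python
import numpy as np

def insert_dfa_hints(delta, n_states, n_symbols, H=4.0):
    """Program a second-order weight tensor so that state unit i turns on
    when the automaton moves from state j on input symbol k.
    delta: dict mapping (state, symbol) -> next state."""
    W = -H * np.ones((n_states, n_states, n_symbols))  # inhibit by default
    for (j, k), i in delta.items():
        W[i, j, k] = H                                  # excite the hinted transition
    return W

def step(W, s, x):
    """One second-order state update: s_i' = sigmoid(sum_jk W[i,j,k] s_j x_k)."""
    z = np.einsum('ijk,j,k->i', W, s, x)
    return 1.0 / (1.0 + np.exp(-z))
```

With one-hot inputs and a near-one-hot state vector, the programmed network tracks the automaton's state before any training.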


A Cellular Genetic Algorithm for training Recurrent Neural Networks

Recurrent neural networks (RNNs), with the capability of dealing with spatio-temporal relationships, are more complex than feed-forward neural networks. Training of RNNs by gradient descent methods becomes more difficult. Therefore, another training method, which uses cellular genetic algorithms, is proposed. In this paper, the performance of training by a gradient descent method is compared with...
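In a cellular GA, individuals sit on a grid and recombine only with their neighbors, unlike a panmictic GA. The paper's exact operators are not given in this abstract, so the following is only a generic sketch (neighborhood, crossover, and replacement rules are my assumptions) of one generation over a grid of candidate RNN weight vectors:

```python
import numpy as np

rng = np.random.default_rng(0)

def cellular_ga_step(grid, fitness, sigma=0.1):
    """One synchronous generation of a cellular GA.
    grid: (R, C, n_weights) array of candidate RNN weight vectors.
    Each cell crosses over with its fittest 4-neighbor, mutates, and the
    child replaces the cell only if it is fitter (elitist replacement)."""
    R, C, n = grid.shape
    new = grid.copy()
    for r in range(R):
        for c in range(C):
            neigh = [grid[(r - 1) % R, c], grid[(r + 1) % R, c],
                     grid[r, (c - 1) % C], grid[r, (c + 1) % C]]
            mate = max(neigh, key=fitness)                 # local selection
            mask = rng.random(n) < 0.5                     # uniform crossover
            child = np.where(mask, grid[r, c], mate)
            child = child + rng.normal(0.0, sigma, n)      # Gaussian mutation
            if fitness(child) > fitness(grid[r, c]):
                new[r, c] = child
    return new
```

In practice `fitness` would evaluate an RNN built from the weight vector on the training sequences; the elitist replacement guarantees no cell ever gets worse.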


First-order versus second-order single-layer recurrent neural networks

We examine the representational capabilities of first-order and second-order single-layer recurrent neural networks (SLRNNs) with hard-limiting neurons. We show that a second-order SLRNN is strictly more powerful than a first-order SLRNN. However, if the first-order SLRNN is augmented with output layers of feedforward neurons, it can implement any finite-state recognizer, but only if state-spl...
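The first-order/second-order distinction the abstract draws can be stated in two lines of update rule: a first-order SLRNN's pre-activation is linear in state and input separately, while a second-order SLRNN's is bilinear, coupling each state unit with each input unit. A sketch with hard-limiting (threshold) neurons, as in the abstract (names are mine):

```python
import numpy as np

def first_order_step(s, x, Ws, Wx):
    """First-order SLRNN: pre-activation is linear, Ws.s + Wx.x."""
    return np.heaviside(Ws @ s + Wx @ x, 0.0)  # hard-limiting neurons

def second_order_step(s, x, W):
    """Second-order SLRNN: weight W[i, j, k] couples state unit j with
    input unit k, so the pre-activation is bilinear in (s, x)."""
    return np.heaviside(np.einsum('ijk,j,k->i', W, s, x), 0.0)
```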



Journal

Journal title: Neural Networks

Year: 2012

ISSN: 0893-6080

DOI: 10.1016/j.neunet.2011.07.003