Finite State Automata and Simple Recurrent Networks
Authors
Abstract
Similar articles
Finite State Automata and Simple Recurrent Networks
We explore a network architecture introduced by Elman (1988) for predicting successive elements of a sequence. The network uses the pattern of activation over a set of hidden units from time-step t-1, together with element t, to predict element t+1. When the network is trained with strings from a particular finite-state grammar, it can learn to be a perfect finite-state recognizer for the gra...
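The architecture the abstract describes can be sketched in a few lines: the hidden (context) state from step t-1 is combined with the current element t to produce a prediction for element t+1. This is a minimal illustrative sketch of an Elman-style simple recurrent network, not the paper's implementation; all sizes and variable names are assumptions.

```python
import numpy as np

# Illustrative Elman-style simple recurrent network (sketch):
# hidden state from time-step t-1 feeds back alongside input t
# to predict element t+1. Sizes below are arbitrary assumptions.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 5, 8, 5      # e.g. one-hot symbols of a small grammar

W_xh = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input  -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # context -> hidden
W_hy = rng.normal(scale=0.1, size=(n_out, n_hidden))     # hidden -> output

def step(x_t, h_prev):
    """One time step: current symbol plus previous hidden state."""
    h_t = np.tanh(W_xh @ x_t + W_hh @ h_prev)
    y_t = W_hy @ h_t                 # scores for the predicted next symbol
    return h_t, y_t

h = np.zeros(n_hidden)               # context units start at zero
sequence = np.eye(n_in)[[0, 2, 1, 3]]  # a toy one-hot symbol string
for x in sequence:
    h, y = step(x, h)                # h carries state across time steps
```

A network like this, trained with a next-symbol prediction objective on strings from a finite-state grammar, is the setting the abstract studies.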
Induction of Finite-State Automata Using Second-Order Recurrent Networks
Second-order recurrent networks that recognize simple finite state languages over {0,1}* are induced from positive and negative examples. Using the complete gradient of the recurrent network and sufficient training examples to constrain the definition of the language to be induced, solutions are obtained that correctly recognize strings of arbitrary length. A method for extracting a finite stat...
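A second-order recurrent unit, as referred to above, updates each state unit from every pair of (previous state unit, current input symbol) through a three-index weight tensor. The following is a hedged sketch of that state update only, not the induction or training procedure from the paper; the dimensions and the acceptance read-out are illustrative assumptions.

```python
import numpy as np

# Sketch of a second-order recurrent state update (illustrative):
# h_j(t+1) = sigmoid( sum_{i,k} W[j, i, k] * h_i(t) * x_k(t) )
rng = np.random.default_rng(1)
n_state, n_sym = 4, 2                # state units; input alphabet {0, 1}
W = rng.normal(scale=0.5, size=(n_state, n_state, n_sym))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def step(h, x_onehot):
    # Pairwise (state, input) products weighted by the 3-index tensor W.
    return sigmoid(np.einsum("jik,i,k->j", W, h, x_onehot))

h = np.zeros(n_state)
h[0] = 1.0                           # designated start state
for symbol in [0, 1, 1, 0]:          # a string over {0, 1}
    h = step(h, np.eye(n_sym)[symbol])
accept = h[-1] > 0.5                 # e.g. read acceptance off one unit
```

The multiplicative coupling between state and input is what makes it natural to read a finite-state transition function directly out of the trained weights.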
Injecting Nondeterministic Finite State Automata into Recurrent Neural Networks
In this paper we propose a method for injecting time-warping nondeterministic finite state automata into recurrent neural networks. The proposed algorithm takes as input a set of automata transition rules and produces a recurrent architecture. The resulting connection weights are specified by means of linear constraints. In this way, the network is guaranteed to carry out the assigned automata rul...
Recurrent Neural Networks and Finite Automata
This article studies finite size networks that consist of interconnections of synchronously evolving processors. Each processor updates its state by applying an activation function to a linear combination of the previous states of all units. We prove that any function for which the left and right limits exist and are different can be applied to the neurons to yield a network which is at least a...
Finite State Automata that Recurrent Cascade-Correlation Cannot Represent
This paper relates the computational power of Fahlman's Recurrent Cascade-Correlation (RCC) architecture to that of finite state automata (FSA). While some recurrent networks are FSA equivalent, RCC is not. The paper presents a theoretical analysis of the RCC architecture in the form of a proof describing a large class of FSA which cannot be realized by RCC.
Journal
Journal title: Neural Computation
Year: 1989
ISSN: 0899-7667, 1530-888X
DOI: 10.1162/neco.1989.1.3.372