Distributed Representations, Simple Recurrent Networks, and Grammatical Structure

Abstract


Similar articles

Neural Network Based Grammatical Learning and its Application for Structure Identification

Structure identification has been used widely in many contexts. Grammatical Learning methods are used to find structural information in sequences. Because of negative results, alternative representations have to be used for Grammatical Learning; one such representation is the recurrent neural network. Recurrent neural networks have been proposed as extended automata. In this chapter, we first summarize r...


Recent Advances of Grammatical Inference

In this paper, we provide a survey of recent advances in the field "Grammatical Inference", with a particular emphasis on results concerning the learnability of target classes represented by deterministic finite automata, context-free grammars, hidden Markov models, stochastic context-free grammars, simple recurrent neural networks, and case-based representations.


Solving Linear Semi-Infinite Programming Problems Using Recurrent Neural Networks

The linear semi-infinite programming problem is an important class of optimization problems that deals with infinitely many constraints. In this paper, to solve this problem, we combine a discretization method and a neural network method. By a simple discretization of the infinite constraints, we convert the linear semi-infinite programming problem into a linear programming problem. Then, we use...
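
As a rough illustration of the discretization step just described (not taken from the paper), the following Python sketch samples a hypothetical infinite constraint family t*x1 + (1-t)*x2 <= 1, t in [0, 1], on a finite grid and solves the resulting ordinary linear program with scipy's linprog; the paper itself goes on to handle this stage with a recurrent neural network.

import numpy as np
from scipy.optimize import linprog

# Hypothetical problem: maximize x1 + x2 subject to
#   t*x1 + (1 - t)*x2 <= 1  for all t in [0, 1],  x >= 0.
c = np.array([-1.0, -1.0])                   # linprog minimizes, so negate the objective

grid = np.linspace(0.0, 1.0, 101)            # finite sample of the index set [0, 1]
A_ub = np.column_stack([grid, 1.0 - grid])   # one inequality row per sampled t
b_ub = np.ones_like(grid)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("discretized LP solution:", res.x)     # approximately [1.0, 1.0]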


Tail-Recursive Distributed Representations and Simple Recurrent Networks

Representation poses important challenges to connectionism. The ability to structurally compose representations is critical in achieving the capability considered necessary for cognition. We are investigating distributed patterns that represent structure as part of a larger effort to develop a natural language processor. Recursive Auto-Associative Memory (RAAM) representations show unusual prom...
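
As a rough illustration of the RAAM idea mentioned above (not taken from the paper), the sketch below uses untrained random weights and a hypothetical 16-dimensional code purely to show the recursive composition: an encoder maps two child vectors onto one parent vector of the same width, so an entire binary tree collapses into a single fixed-width distributed representation. In the actual model the encoder and a matching decoder are trained together as an auto-associator.

import numpy as np

DIM = 16
rng = np.random.default_rng(0)
W_enc = rng.normal(scale=0.1, size=(DIM, 2 * DIM))   # hypothetical, untrained encoder weights
b_enc = np.zeros(DIM)

def encode(tree, lexicon):
    """Recursively compress a nested tuple of words into one DIM-dimensional vector."""
    if isinstance(tree, str):                        # leaf: look up the word vector
        return lexicon[tree]
    left, right = tree                               # internal node: encode the children first
    children = np.concatenate([encode(left, lexicon), encode(right, lexicon)])
    return np.tanh(W_enc @ children + b_enc)         # parent representation, same width as a leaf

lexicon = {w: rng.normal(size=DIM) for w in ["the", "dog", "chased", "cat"]}
sentence = (("the", "dog"), ("chased", ("the", "cat")))
print(encode(sentence, lexicon).shape)               # (16,): one vector for the whole tree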


Thematic Representation in Simple Recurrent Networks

Simple recurrent networks (SRNs) are able to learn and represent lexical classes (Elman, 1990) and grammatical knowledge, such as agreement and argument structure (Elman, 1991), on the basis of co-occurrence regularities embedded in simple and complex sentences. In the present study, we address the question of whether SRNs can represent differences in the thematic roles assigned by ve...
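
For reference, here is a minimal Python sketch (not from the study) of the forward pass of a simple recurrent (Elman) network; the vocabulary size, hidden size, and weight names are hypothetical, and no training is shown. The context layer is simply a copy of the previous hidden state, which is what allows the network to pick up co-occurrence regularities across a sentence.

import numpy as np

VOCAB, HIDDEN = 10, 8
rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(HIDDEN, VOCAB))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))  # context (previous hidden) -> hidden
W_hy = rng.normal(scale=0.1, size=(VOCAB, HIDDEN))   # hidden -> next-word output

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def srn_forward(word_ids):
    """Run a sequence of word ids through the SRN, predicting the next word at each step."""
    h = np.zeros(HIDDEN)                             # context starts empty
    predictions = []
    for w in word_ids:
        x = np.eye(VOCAB)[w]                         # one-hot input for the current word
        h = np.tanh(W_xh @ x + W_hh @ h)             # new hidden state uses the copied context
        predictions.append(softmax(W_hy @ h))        # distribution over the next word
    return predictions

print(srn_forward([1, 4, 2])[-1].round(3))           # next-word distribution after three words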



Journal:

Volume   Issue

Pages  -

Publication date: 1991