A Scalable Architecture for Binary Couplings Attractor Neural Networks
Author
Abstract
This paper presents a digital architecture with on-chip learning for Hopfield attractor neural networks with binary weights. A new learning rule for the binary-weight network is proposed that allows pattern storage up to a capacity of 0.4 and incurs very low hardware overhead. Due to the use of binary couplings, the network has minimal storage requirements. A flexible communication structure allows multiple chips to be cascaded in order to build fully connected, block-connected, or feed-forward networks. System performance and communication bandwidth scale linearly with the number of chips. A prototype chip has been fabricated and is fully functional. A pattern recognition application demonstrates the performance of the binary-couplings network.
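The abstract does not spell out the proposed learning rule, so the sketch below (in Python, purely for illustration, since the paper targets digital hardware) only shows the general idea of a Hopfield attractor network with binary couplings. It uses a plain clipped-Hebbian rule (the sign of the summed Hebbian outer products) as a stand-in; this yields couplings in {-1, 0, +1} but a considerably lower capacity than the 0.4 reported for the rule in the paper.

import numpy as np

def train_clipped_hebb(patterns):
    # Sum the Hebbian outer products of the +/-1 patterns, then keep only
    # the sign of each coupling so a single bit/trit per synapse suffices.
    n = patterns.shape[1]
    J = np.zeros((n, n))
    for xi in patterns:
        J += np.outer(xi, xi)
    np.fill_diagonal(J, 0)      # no self-couplings
    return np.sign(J)           # couplings clipped to {-1, 0, +1}

def recall(J, state, max_steps=100):
    # Asynchronous updates until a fixed point (an attractor) is reached.
    state = state.copy()
    for _ in range(max_steps):
        changed = False
        for i in np.random.permutation(len(state)):
            h = J[i] @ state                            # local field of neuron i
            s = state[i] if h == 0 else int(np.sign(h))
            if s != state[i]:
                state[i] = s
                changed = True
        if not changed:
            break
    return state

# Usage: store a few random patterns and recover one from a noisy probe.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(5, 64))                       # 5 patterns, 64 neurons
J = train_clipped_hebb(patterns)
noisy = patterns[0] * rng.choice([1, -1], size=64, p=[0.9, 0.1])   # ~10% bit flips
print(np.mean(recall(J, noisy) == patterns[0]))                    # fraction of correct bits

With binary couplings the local field reduces to additions and subtractions of the neuron states, which is consistent with the abstract's point that storage and hardware cost per synapse stay minimal.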
Similar Resources
Dynamic configuration and collaborative scheduling in supply chains based on scalable multi-agent architecture
Due to diversified and frequently changing demands from customers, technological advances and global competition, manufacturers rely on collaboration with their business partners to share costs, risks and expertise. How to take advantage of advancement of technologies to effectively support operations and create competitive advantage is critical for manufacturers to survive. To respond to these...
Prediction of true critical temperature and pressure of binary hydrocarbon mixtures: A Comparison between the artificial neural networks and the support vector machine
Two main objectives have been considered in this paper: providing a good model to predict the critical temperature and pressure of binary hydrocarbon mixtures, and comparing the efficiency of the artificial neural network algorithms and the support vector regression as two commonly used soft computing methods. In order to have a fair comparison and to achieve the highest efficiency, a comprehen...
Attractor Networks for Shape Recognition
We describe a system of thousands of binary perceptrons with coarse-oriented edges as input that is able to recognize shapes, even in a context with hundreds of classes. The perceptrons have randomized feedforward connections from the input layer and form a recurrent network among themselves. Each class is represented by a prelearned attractor (serving as an associative hook) in the recurrent n...
Structure and Dynamics of Random Recurrent Neural Networks
In contradiction with Hopfield-like networks, random recurrent neural networks (RRNN), where the couplings are random, exhibit complex dynamics (limit cycles, chaos). It is possible to store information in these networks through hebbian learning. Eventually, learning “destroys” the dynamics and leads to a fixed point attractor. We investigate here the structural change in the networks through l...
An exact learning algorithm for autoassociative neural networks with binary couplings
Exact solutions for the learning problem of autoassociative networks with binary couplings are determined by a new method: The use of a branch-and-bound algorithm leads to a substantial saving of computing time compared to complete enumeration. As a result, fully connected networks with up to 40 neurons could be investigated. The network capacity is found to be close to 0.83. PACS numbers: 84....
Publication year: 1996