Palimpsest Memories: A New High-Capacity Forgetful Learning Rule for Hopfield Networks

Author

  • Amos Storkey
Abstract

Palimpsest or forgetful learning rules for attractor neural networks do not suffer from catastrophic forgetting. Instead, they selectively forget older memories in order to store new patterns. Standard palimpsest learning algorithms have a capacity of up to 0.05n, where n is the size of the network. Here a new learning rule is introduced. This rule is local and incremental. It is shown that it has palimpsest properties, and that it has a palimpsest capacity of about 0.25n, much higher than the capacity of standard palimpsest schemes. It is shown that the algorithm acts as an iterated function sequence on the space of matrices, and this is used to illustrate the performance of the learning rule.
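The abstract does not quote the rule itself. As a hedged illustration, the sketch below implements the Storkey-style local, incremental update usually associated with this work, ΔW_ij = (ξ_i ξ_j − ξ_i h_ji − h_ij ξ_j)/n, where h_ij is the local field at neuron i with units i and j excluded; the function name and parameters are illustrative, and the paper's exact palimpsest rule may differ in detail.

```python
import numpy as np

def storkey_update(W, xi):
    """One local, incremental weight update for a pattern xi in {-1,+1}^n,
    in the style of the Storkey rule. A sketch only; the paper's exact
    palimpsest rule may differ in detail."""
    n = xi.size
    # h[i, j] = sum_{k != i, j} W[i, k] * xi[k]  (local field at i, omitting i and j)
    h = (W @ xi)[:, None] - (np.diag(W) * xi)[:, None] - W * xi[None, :]
    # Hebbian term minus presynaptic and postsynaptic local-field corrections
    dW = (np.outer(xi, xi) - xi[:, None] * h.T - h * xi[None, :]) / n
    return W + dW

# Present patterns as a stream; older memories are gradually overwritten.
rng = np.random.default_rng(0)
n = 100
W = np.zeros((n, n))
for _ in range(50):
    W = storkey_update(W, rng.choice([-1.0, 1.0], size=n))
```

Because each update uses only locally available quantities, patterns can be stored one at a time without revisiting earlier ones, which is what makes the rule incremental.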


Related articles

Efficient Covariance Matrix Methods for Bayesian Gaussian Processes and Hopfield Neural Networks

Covariance matrices are important in many areas of neural modelling. In Hopfield networks they are used to form the weight matrix which controls the autoassociative properties of the network. In Gaussian processes, which have been shown to be the infinite neuron limit of many regularised feedforward neural networks, covariance matrices control the form of the Bayesian prior distribution over function...
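For orientation, the Hopfield side of this connection can be sketched generically: the weight matrix is a covariance-like sum of outer products of the stored patterns. This is a standard construction, not the specific covariance-matrix methods of the cited paper.

```python
import numpy as np

def hebbian_weights(patterns):
    """Covariance-style Hopfield weight matrix from patterns in {-1,+1}^n:
    W = (1/n) * sum_mu xi^mu (xi^mu)^T, with self-couplings zeroed.
    A generic illustration only."""
    P = np.asarray(patterns, dtype=float)   # shape (num_patterns, n)
    W = P.T @ P / P.shape[1]                # sum of outer products / n
    np.fill_diagonal(W, 0.0)                # no self-connections
    return W
```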


Inhomogeneities in Heteroassociative Memories with Linear Learning Rules

We investigate how various inhomogeneities present in synapses and neurons affect the performance of feedforward associative memories with linear learning, a high-level network model of hippocampal circuitry and plasticity. The inhomogeneities incorporated into the model are differential input attenuation, stochastic synaptic transmission, and memories learned with varying intensity. For a clas...
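A hedged toy version of such a model is sketched below: a linear heteroassociative mapping trained with outer products, with illustrative parameters (`attenuation`, `intensity`, and transmission probability `p`) standing in for the three inhomogeneities named above; the paper's actual parameterisation is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out, n_mem = 50, 40, 20

X = rng.choice([-1.0, 1.0], size=(n_mem, n_in))   # input patterns
Y = rng.choice([-1.0, 1.0], size=(n_mem, n_out))  # target patterns

attenuation = rng.uniform(0.5, 1.0, size=n_in)    # differential input attenuation
intensity = rng.uniform(0.5, 1.5, size=n_mem)     # varying learning intensity per memory

# Linear (Hebbian) learning: W = sum_mu intensity_mu * y^mu (attenuated x^mu)^T
W = (Y * intensity[:, None]).T @ (X * attenuation[None, :])

# Stochastic synaptic transmission at recall: each synapse passes with probability p
p = 0.8
mask = rng.random(W.shape) < p
y_hat = np.sign((W * mask) @ (attenuation * X[0]))
```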


A palimpsest memory based on an incremental Bayesian learning rule

Capacity limited memory systems need to gradually forget old information in order to avoid catastrophic forgetting where all stored information is lost. This can be achieved by allowing new information to overwrite old, as in the so-called palimpsest memory. This paper describes a new such learning rule employed in an attractor neural network. The network does not exhibit catastrophic forgettin...
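The Bayesian rule itself is not given in this excerpt; the sketch below shows only the generic overwrite mechanism that defines a palimpsest memory, using a simple exponential decay of old couplings (the `decay` constant is illustrative, not the paper's).

```python
import numpy as np

def palimpsest_step(W, xi, decay=0.98):
    """Generic palimpsest update: exponentially decay old couplings, then
    add a Hebbian imprint of the new pattern xi in {-1,+1}^n. Illustrates
    overwrite-style forgetting only; the paper's incremental Bayesian
    rule differs."""
    n = xi.size
    W = decay * W + np.outer(xi, xi) / n
    np.fill_diagonal(W, 0.0)
    return W
```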


Multifractality in forgetful memories

Learning rules of a forgetful memory generate their synaptic efficacies through iterative procedures that operate on the input data, random patterns. We analyse invariant distributions of the synaptic couplings as they arise asymptotically and show that they exhibit fractal or multifractal properties. We also discuss their dependence upon the learning rule and the parameters specifying it, and ...
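As a hedged sketch of the kind of iterative procedure meant here, the following iterates a bounded ("learning within bounds") Hebbian rule on random patterns and collects the empirical coupling distribution; the bound `A` and step size are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(2)
n, steps, A = 200, 2000, 4.0          # A: hypothetical bound on the couplings

J = np.zeros((n, n))
for _ in range(steps):
    xi = rng.choice([-1.0, 1.0], size=n)
    # Bounded (clipped) Hebbian iteration, a standard forgetful-memory scheme
    J = np.clip(J + np.outer(xi, xi) / np.sqrt(n), -A, A)

# Empirical distribution of the couplings; its asymptotic invariant form is
# what the fractal/multifractal analysis in the cited paper concerns.
density, edges = np.histogram(J[np.triu_indices(n, 1)], bins=50, density=True)
```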


A Scalable Architecture for Binary Couplings Attractor Neural Networks

This paper presents a digital architecture with on-chip learning for Hopfield attractor neural networks with binary weights. A new learning rule for the binary-weights network is proposed that allows pattern storage up to a capacity of 0.4 and incurs very low hardware overhead. Due to the use of binary couplings the network has minimal storage requirements. A flexible communication structure allows t...
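A common way to obtain binary couplings, sketched below under stated assumptions, is to clip the sign of an accumulated Hebbian sum; this illustrates binary-weight storage generically and is not the on-chip rule proposed in the paper.

```python
import numpy as np

def binary_coupling_weights(patterns):
    """Binary (+1/-1) couplings as the sign of an accumulated Hebbian sum.
    A generic clipping scheme for illustration only."""
    P = np.asarray(patterns, dtype=float)   # shape (num_patterns, n)
    S = P.T @ P                             # accumulated Hebbian sums
    W = np.where(S >= 0.0, 1.0, -1.0)       # clip to binary couplings
    np.fill_diagonal(W, 0.0)                # zero out self-connections
    return W
```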




Publication date: 1998