Optimal computation with attractor networks.
Authors
Abstract
We investigate the ability of multi-dimensional attractor networks to perform reliable computations with noisy population codes. We show that such networks can perform computations as reliably as possible--meaning they can reach the Cramér-Rao bound--so long as the noise is small enough. "Small enough" depends on the properties of the noise, especially its correlational structure. For many correlational structures, noise in the range of what is observed in the cortex is sufficiently small that biologically plausible networks can compute optimally. We demonstrate that this result applies to computations that involve cues of varying reliability, such as the position of an object on the retina in bright versus dim light.
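The abstract's benchmark, the Cramér-Rao bound, can be made concrete with a small sketch. For a population of neurons with independent Poisson spike-count noise, the Fisher information is J(s) = Σ_i f_i'(s)² / f_i(s), and 1/J(s) lower-bounds the variance of any unbiased estimator of the stimulus s. The sketch below (all parameters, tuning-curve shapes, and the maximum-likelihood decoder are illustrative assumptions, not the paper's network model) compares the bound with the empirical variance of ML decoding:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy population code (all parameters are assumptions for illustration):
# N neurons with Gaussian tuning curves tiling the stimulus range [0, 1],
# independent Poisson spike-count noise.
N, width, peak_rate, base_rate = 50, 0.08, 40.0, 0.5
centers = np.linspace(0.0, 1.0, N)

def rates(s):
    """Mean firing rates f_i(s); broadcasts over an array of stimuli."""
    return base_rate + peak_rate * np.exp(-0.5 * ((s - centers) / width) ** 2)

def fisher_info(s, ds=1e-4):
    """Fisher information for independent Poisson noise:
    J(s) = sum_i f_i'(s)**2 / f_i(s), with f' by central differences."""
    df = (rates(s + ds) - rates(s - ds)) / (2.0 * ds)
    return float(np.sum(df ** 2 / rates(s)))

s_true = 0.5
cr_bound = 1.0 / fisher_info(s_true)  # Cramér-Rao bound on estimator variance

# Empirical check: maximum-likelihood decoding of simulated spike counts.
grid = np.linspace(0.3, 0.7, 801)
R = rates(grid[:, None])              # (grid points, N) mean rates
log_R, sum_R = np.log(R), R.sum(axis=1)

estimates = []
for _ in range(2000):
    counts = rng.poisson(rates(s_true))
    # Poisson log-likelihood up to terms independent of s.
    ll = log_R @ counts - sum_R
    estimates.append(grid[np.argmax(ll)])

emp_var = float(np.var(estimates))
print(f"CR bound: {cr_bound:.2e}, ML variance: {emp_var:.2e}")
```

With abundant spikes, the ML decoder's variance should approach the bound; the paper's claim is that attractor dynamics can match this same limit when the noise is small enough.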
Similar resources
Localist Attractor Networks (submitted to Neural Computation)
Attractor networks, which map an input space to a discrete output space, are useful for pattern completion--cleaning up noisy or missing input features. However, designing a net to have a given set of attractors is notoriously tricky; training procedures are CPU intensive and often produce spurious attractors and ill-conditioned attractor basins. These difficulties occur because each connection ...
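The cleanup behavior this entry describes can be made concrete with a minimal classical Hopfield network -- a sketch with assumed sizes and the standard Hebbian outer-product rule, not the localist construction the paper itself proposes:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy pattern-completion demo (sizes are assumptions for illustration):
# a classical Hopfield network storing 3 random +/-1 patterns of 64 units.
n_units, n_patterns = 64, 3
patterns = rng.choice([-1, 1], size=(n_patterns, n_units))

# Hebbian outer-product rule; no self-connections.
W = (patterns.T @ patterns) / n_units
np.fill_diagonal(W, 0.0)

# Corrupt a stored pattern by flipping 10 of its 64 units.
target = patterns[0]
state = target.copy()
state[rng.choice(n_units, size=10, replace=False)] *= -1

# Asynchronous updates, which are guaranteed to settle into an attractor.
for _ in range(10):                      # sweeps over all units
    for i in rng.permutation(n_units):
        state[i] = 1 if W[i] @ state >= 0 else -1

overlap = float(state @ target) / n_units  # 1.0 means perfect cleanup
print(f"overlap with stored pattern: {overlap:.2f}")
```

Well below capacity, the dynamics pull the corrupted state back to the stored pattern; the spurious attractors and ill-conditioned basins mentioned above appear as the number of stored patterns grows.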
Design of Continuous Attractor Networks with Monotonic Tuning Using a Symmetry Principle
Neurons that sustain elevated firing in the absence of stimuli have been found in many neural systems. In graded persistent activity, neurons can sustain firing at many levels, suggesting a widely found type of network dynamics in which networks can relax to any one of a continuum of stationary states. The reproduction of these findings in model networks of nonlinear neurons has turned out to b...
Network Capacity for Latent Attractor Computation
Attractor networks have been one of the most successful paradigms in neural computation and have been used as models of computation in the nervous system. Many experimentally observed phenomena, such as coherent population codes, contextual representations, and replay of learned neural activity patterns, are explained well by attractor dynamics. Recently we proposed a paradigm called latent attractor...
Competition Between Synaptic Depression and Facilitation in Attractor Neural Networks
We study the effect of competition between short-term synaptic depression and facilitation on the dynamic properties of attractor neural networks, using Monte Carlo simulation and a mean-field analysis. Depending on the balance of depression, facilitation, and the underlying noise, the network displays different behaviors, including associative memory and switching of activity between different...
Journal: Journal of physiology, Paris
Volume 97, Issue 4-6
Pages: -
Published: 2003