Printed from https://ideas.repec.org/a/plo/pcbi00/1011354.html

Attractor neural networks with double well synapses

Author

Listed:
  • Yu Feng
  • Nicolas Brunel

Abstract

It is widely believed that memory storage depends on activity-dependent synaptic modifications. Classical studies of learning and memory in neural networks describe synaptic efficacy either as continuous or as discrete. However, recent results suggest an intermediate scenario in which synaptic efficacy can be described by a continuous variable, but whose distribution is peaked around a small set of discrete values. Motivated by these results, we explored a model in which each synapse is described by a continuous variable that evolves in a potential with multiple minima. External inputs to the network can switch synapses from one potential well to another. Our analytical and numerical results show that this model can interpolate between models with discrete synapses, which correspond to the deep-potential limit, and models in which synapses evolve in a single quadratic potential. We find that the storage capacity of the network with double well synapses exhibits a power law dependence on the network size, rather than the logarithmic dependence observed in models with single well synapses. In addition, synapses with deeper potential wells lead to more robust information storage in the presence of noise. When memories are sparsely encoded, the scaling of the capacity with network size is similar to previously studied network models in the sparse coding limit.

Author summary: A long-standing question in neuroscience is whether synaptic efficacies should be described as continuous or discrete variables. Recent experiments indicate that it is a combination of both: synaptic efficacy changes continuously, but its distribution peaks at several discrete values. In this study, we introduce a synapse model described by a double well potential, and investigate the memory properties of networks of neurons connected by such synapses. Our results show that in networks with a bimodal weight distribution, the storage capacity depends on network size as a power law. In addition, we demonstrate that networks with such synapses store information more robustly in the presence of noise, compared to networks with synapses with a single well potential.
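The core ingredient of the model described above can be illustrated with a minimal sketch: a synaptic weight is a continuous variable relaxing in a double well potential, and a sufficiently strong transient input can push it over the barrier into the other well, where it then remains. The quartic potential U(w) = (w² − 1)²/4, the input amplitude, and the integration scheme below are illustrative assumptions, not the specific parameterization used in the paper.

```python
import numpy as np

def double_well_grad(w):
    """Gradient of the illustrative potential U(w) = (w^2 - 1)^2 / 4,
    which has stable wells at w = -1 (weak) and w = +1 (strong)."""
    return w**3 - w

def simulate_synapse(w0, inputs, dt=0.01, sigma=0.0, seed=0):
    """Euler-Maruyama integration of dw = (-U'(w) + I(t)) dt + sigma dW.
    With sigma = 0 the dynamics are deterministic gradient descent on U
    plus the external drive I(t)."""
    rng = np.random.default_rng(seed)
    w = w0
    traj = [w]
    for I in inputs:
        noise = sigma * np.sqrt(dt) * rng.standard_normal()
        w = w + (-double_well_grad(w) + I) * dt + noise
        traj.append(w)
    return np.array(traj)

# A transient input (amplitude 2 for 5 time units, chosen here only for
# illustration) switches the synapse from the well at w = -1 to the well
# at w = +1; after the input is removed, the synapse stays in the new well.
inputs = np.concatenate([np.full(500, 2.0), np.zeros(2000)])
traj = simulate_synapse(w0=-1.0, inputs=inputs)
```

Deeper wells (a larger barrier in U) would require a stronger or longer input to switch the synapse, which is the mechanism behind the more robust storage in noise reported in the abstract.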

Suggested Citation

  • Yu Feng & Nicolas Brunel, 2024. "Attractor neural networks with double well synapses," PLOS Computational Biology, Public Library of Science, vol. 20(2), pages 1-25, February.
  • Handle: RePEc:plo:pcbi00:1011354
    DOI: 10.1371/journal.pcbi.1011354

    Download full text from publisher

    File URL: https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1011354
    Download Restriction: no

    File URL: https://journals.plos.org/ploscompbiol/article/file?id=10.1371/journal.pcbi.1011354&type=printable
    Download Restriction: no




    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.