
Introducing principles of synaptic integration in the optimization of deep neural networks

Authors

Listed:
  • Giorgia Dellaferrera

    (IBM Research - Zurich
    University of Zurich and ETH Zurich)

  • Stanisław Woźniak

    (IBM Research - Zurich)

  • Giacomo Indiveri

    (University of Zurich and ETH Zurich)

  • Angeliki Pantazi

    (IBM Research - Zurich)

  • Evangelos Eleftheriou

    (IBM Research - Zurich
    Axelera AI)

Abstract

Plasticity circuits in the brain are known to be influenced by the distribution of the synaptic weights through the mechanisms of synaptic integration and local regulation of synaptic strength. However, the complex interplay of stimulation-dependent plasticity with local learning signals is disregarded by most of the artificial neural network training algorithms devised so far. Here, we propose a novel biologically inspired optimizer for artificial and spiking neural networks that incorporates key principles of synaptic plasticity observed in cortical dendrites: GRAPES (Group Responsibility for Adjusting the Propagation of Error Signals). GRAPES implements a weight-distribution-dependent modulation of the error signal at each node of the network. We show that this biologically inspired mechanism leads to a substantial improvement of the performance of artificial and spiking networks with feedforward, convolutional, and recurrent architectures, it mitigates catastrophic forgetting, and it is optimally suited for dedicated hardware implementations. Overall, our work indicates that reconciling neurophysiology insights with machine intelligence is key to boosting the performance of neural networks.
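
As a rough illustration of the weight-distribution-dependent modulation of the error signal described in the abstract, the sketch below (Python/NumPy) rescales the backpropagated delta of each hidden node by a per-node factor derived from its incoming weights. The specific factor used here, the summed absolute incoming weight normalized by the layer mean, is an illustrative assumption and not necessarily the exact formulation used in GRAPES.

    # Minimal NumPy sketch of weight-distribution-dependent error modulation,
    # in the spirit of GRAPES; the modulation formula is an illustrative assumption.
    import numpy as np

    rng = np.random.default_rng(0)

    def node_importance(W):
        # W has shape (n_in, n_out); return one factor per output node,
        # based on the summed |weight| converging on that node.
        col_strength = np.abs(W).sum(axis=0)
        return col_strength / col_strength.mean()

    # Toy two-layer network x -> h -> y trained with a squared-error signal.
    n_in, n_hid, n_out = 8, 16, 4
    W1 = rng.normal(scale=0.3, size=(n_in, n_hid))
    W2 = rng.normal(scale=0.3, size=(n_hid, n_out))
    x = rng.normal(size=(1, n_in))
    target = rng.normal(size=(1, n_out))

    h = np.tanh(x @ W1)
    y = h @ W2
    err_out = y - target                           # output-layer error signal

    # Standard backprop delta for the hidden layer ...
    delta_hid = (err_out @ W2.T) * (1.0 - h ** 2)
    # ... modulated node-wise by the weight-distribution factor.
    delta_hid_mod = delta_hid * node_importance(W1)

    lr = 0.1
    W2 -= lr * h.T @ err_out
    W1 -= lr * x.T @ delta_hid_mod

In this toy example, nodes whose incoming synapses carry more total strength receive a proportionally larger error update, which is the qualitative behaviour the abstract attributes to GRAPES.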

Suggested Citation

  • Giorgia Dellaferrera & Stanisław Woźniak & Giacomo Indiveri & Angeliki Pantazi & Evangelos Eleftheriou, 2022. "Introducing principles of synaptic integration in the optimization of deep neural networks," Nature Communications, Nature, vol. 13(1), pages 1-14, December.
  • Handle: RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-022-29491-2
    DOI: 10.1038/s41467-022-29491-2

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-022-29491-2
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-022-29491-2?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source where you can use your library subscription to access this item

    References listed on IDEAS

    1. Gina G. Turrigiano & Kenneth R. Leslie & Niraj S. Desai & Lana C. Rutherford & Sacha B. Nelson, 1998. "Activity-dependent scaling of quantal amplitude in neocortical neurons," Nature, Nature, vol. 391(6670), pages 892-896, February.
    2. Sébastien Royer & Denis Paré, 2003. "Conservation of total synaptic weight through balanced synaptic depression and potentiation," Nature, Nature, vol. 422(6931), pages 518-522, April.
    3. Timothy P. Lillicrap & Daniel Cownden & Douglas B. Tweed & Colin J. Akerman, 2016. "Random synaptic feedback weights support error backpropagation for deep learning," Nature Communications, Nature, vol. 7(1), pages 1-10, December.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Matteo Saponati & Martin Vinck, 2023. "Sequence anticipation and spike-timing-dependent plasticity emerge from a predictive learning rule," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    2. Niranjan Chakravarthy & Shivkumar Sabesan & Kostas Tsakalis & Leon Iasemidis, 2009. "Controlling epileptic seizures in a neural mass model," Journal of Combinatorial Optimization, Springer, vol. 17(1), pages 98-116, January.
    3. Sacha Jennifer van Albada & Moritz Helias & Markus Diesmann, 2015. "Scalability of Asynchronous Networks Is Limited by One-to-One Mapping between Effective Connectivity and Correlations," PLOS Computational Biology, Public Library of Science, vol. 11(9), pages 1-37, September.
    4. Aseel Shomar & Lukas Geyrhofer & Noam E Ziv & Naama Brenner, 2017. "Cooperative stochastic binding and unbinding explain synaptic size dynamics and statistics," PLOS Computational Biology, Public Library of Science, vol. 13(7), pages 1-24, July.
    5. Robert Rosenbaum, 2022. "On the relationship between predictive coding and backpropagation," PLOS ONE, Public Library of Science, vol. 17(3), pages 1-27, March.
    6. Navid Shervani-Tabar & Robert Rosenbaum, 2023. "Meta-learning biologically plausible plasticity rules with random feedback pathways," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    7. Juan Prada & Manju Sasi & Corinna Martin & Sibylle Jablonka & Thomas Dandekar & Robert Blum, 2018. "An open source tool for automatic spatiotemporal assessment of calcium transients and local ‘signal-close-to-noise’ activity in calcium imaging data," PLOS Computational Biology, Public Library of Science, vol. 14(3), pages 1-34, March.
    8. Kendra S Burbank, 2015. "Mirrored STDP Implements Autoencoder Learning in a Network of Spiking Neurons," PLOS Computational Biology, Public Library of Science, vol. 11(12), pages 1-25, December.
    9. Damien M O’Halloran, 2020. "Simulation model of CA1 pyramidal neurons reveal opposing roles for the Na+/Ca2+ exchange current and Ca2+-activated K+ current during spike-timing dependent synaptic plasticity," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-12, March.
    10. Christian Keck & Cristina Savin & Jörg Lücke, 2012. "Feedforward Inhibition and Synaptic Scaling – Two Sides of the Same Coin?," PLOS Computational Biology, Public Library of Science, vol. 8(3), pages 1-15, March.
    11. Iris Reuveni & Sourav Ghosh & Edi Barkai, 2017. "Real Time Multiplicative Memory Amplification Mediated by Whole-Cell Scaling of Synaptic Response in Key Neurons," PLOS Computational Biology, Public Library of Science, vol. 13(1), pages 1-31, January.
    12. Mizusaki, Beatriz E.P. & Agnes, Everton J. & Erichsen, Rubem & Brunnet, Leonardo G., 2017. "Learning and retrieval behavior in recurrent neural networks with pre-synaptic dependent homeostatic plasticity," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 479(C), pages 279-286.
    13. Keitaro Obara & Teppei Ebina & Shin-Ichiro Terada & Takanori Uka & Misako Komatsu & Masafumi Takaji & Akiya Watakabe & Kenta Kobayashi & Yoshito Masamizu & Hiroaki Mizukami & Tetsuo Yamamori & Kiyoto , 2023. "Change detection in the primate auditory cortex through feedback of prediction error signals," Nature Communications, Nature, vol. 14(1), pages 1-17, December.
    14. John Palmer & Adam Keane & Pulin Gong, 2017. "Learning and executing goal-directed choices by internally generated sequences in spiking neural circuits," PLOS Computational Biology, Public Library of Science, vol. 13(7), pages 1-23, July.
    15. Mitsumasa Nakajima & Katsuma Inoue & Kenji Tanaka & Yasuo Kuniyoshi & Toshikazu Hashimoto & Kohei Nakajima, 2022. "Physical deep learning with biologically inspired training method: gradient-free approach for physical hardware," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    16. Ertam, Fatih, 2019. "An efficient hybrid deep learning approach for internet security," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 535(C).
    17. Angulo-Garcia, David & Torcini, Alessandro, 2014. "Stable chaos in fluctuation driven neural circuits," Chaos, Solitons & Fractals, Elsevier, vol. 69(C), pages 233-245.
    18. Tiziano D’Albis & Richard Kempter, 2017. "A single-cell spiking model for the origin of grid-cell patterns," PLOS Computational Biology, Public Library of Science, vol. 13(10), pages 1-41, October.
    19. Maxime Lemieux & Narges Karimi & Frederic Bretzner, 2024. "Functional plasticity of glutamatergic neurons of medullary reticular nuclei after spinal cord injury in mice," Nature Communications, Nature, vol. 15(1), pages 1-15, December.
    20. Pierre Yger & Kenneth D Harris, 2013. "The Convallis Rule for Unsupervised Learning in Cortical Networks," PLOS Computational Biology, Public Library of Science, vol. 9(10), pages 1-16, October.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-022-29491-2. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.