
Deep physical neural networks trained with backpropagation

Authors

  • Logan G. Wright

    (Cornell University; NTT Research, Inc.)

  • Tatsuhiro Onodera

    (Cornell University; NTT Research, Inc.)

  • Martin M. Stein

    (Cornell University)

  • Tianyu Wang

    (Cornell University)

  • Darren T. Schachter

    (Cornell University)

  • Zoey Hu

    (Cornell University)

  • Peter L. McMahon

    (Cornell University)

Abstract

Deep-learning models have become pervasive tools in science and engineering. However, their energy requirements now increasingly limit their scalability [1]. Deep-learning accelerators [2–9] aim to perform deep learning energy-efficiently, usually targeting the inference phase and often by exploiting physical substrates beyond conventional electronics. Approaches so far [10–22] have been unable to apply the backpropagation algorithm to train unconventional novel hardware in situ. The advantages of backpropagation have made it the de facto training method for large-scale neural networks, so this deficiency constitutes a major impediment. Here we introduce a hybrid in situ–in silico algorithm, called physics-aware training, that applies backpropagation to train controllable physical systems. Just as deep learning realizes computations with deep neural networks made from layers of mathematical functions, our approach allows us to train deep physical neural networks made from layers of controllable physical systems, even when the physical layers lack any mathematical isomorphism to conventional artificial neural network layers. To demonstrate the universality of our approach, we train diverse physical neural networks based on optics, mechanics and electronics to experimentally perform audio and image classification tasks. Physics-aware training combines the scalability of backpropagation with the automatic mitigation of imperfections and noise achievable with in situ algorithms. Physical neural networks have the potential to perform machine learning faster and more energy-efficiently than conventional electronic processors and, more broadly, can endow physical systems with automatically designed physical functionalities, for example, for robotics [23–26], materials [27–29] and smart sensors [30–32].
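
The abstract describes physics-aware training as a hybrid in situ–in silico procedure: the forward pass is carried out by the controllable physical system itself, while backpropagation is performed digitally. The Python (PyTorch) sketch below illustrates that general idea only; it is not the authors' implementation, and the names PhysicsAwareLayer, physical_system and digital_model are hypothetical placeholders. Here a noisy matrix multiply stands in for the physical layer, and its noiseless digital twin supplies the gradients.

    import torch

    class PhysicsAwareLayer(torch.autograd.Function):
        """Conceptual hybrid in situ-in silico layer.

        Forward: query the (noisy, imperfect) physical system.
        Backward: differentiate a digital model of that system instead.
        """

        @staticmethod
        def forward(ctx, x, params, physical_system, digital_model):
            ctx.save_for_backward(x, params)
            ctx.digital_model = digital_model
            # In situ step: the real physical transformation (not differentiable).
            return physical_system(x, params)

        @staticmethod
        def backward(ctx, grad_output):
            x, params = ctx.saved_tensors
            # In silico step: rerun a differentiable simulation to obtain gradients.
            with torch.enable_grad():
                x_sim = x.detach().requires_grad_(True)
                p_sim = params.detach().requires_grad_(True)
                y_sim = ctx.digital_model(x_sim, p_sim)
                grad_x, grad_p = torch.autograd.grad(
                    y_sim, (x_sim, p_sim), grad_outputs=grad_output)
            # One gradient per forward() argument; the two callables get None.
            return grad_x, grad_p, None, None

    # Hypothetical stand-ins: a "physical" layer modelled as a noisy matrix
    # multiply, and its idealized digital twin used only for the backward pass.
    def physical_system(x, w):
        return x @ w + 0.01 * torch.randn(x.shape[0], w.shape[1])

    def digital_model(x, w):
        return x @ w

    w = torch.randn(4, 3, requires_grad=True)   # controllable physical parameters
    x = torch.randn(8, 4)                        # a batch of inputs
    target = torch.zeros(8, 3)

    y = PhysicsAwareLayer.apply(x, w, physical_system, digital_model)
    loss = ((y - target) ** 2).mean()
    loss.backward()                              # gradients flow through the digital model
    print(w.grad.shape)                          # torch.Size([4, 3])

In this sketch the parameter gradient comes entirely from the digital model, so training is executed on the physical forward pass while remaining trainable by backpropagation; how well mismatch and noise are tolerated depends on how faithful the digital model is, which is the assumption this toy example makes.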

Suggested Citation

  • Logan G. Wright & Tatsuhiro Onodera & Martin M. Stein & Tianyu Wang & Darren T. Schachter & Zoey Hu & Peter L. McMahon, 2022. "Deep physical neural networks trained with backpropagation," Nature, Nature, vol. 601(7894), pages 549-555, January.
  • Handle: RePEc:nat:nature:v:601:y:2022:i:7894:d:10.1038_s41586-021-04223-6
    DOI: 10.1038/s41586-021-04223-6

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41586-021-04223-6
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.


    As access to this document is restricted, you may want to search for a different version of it.

