
Hundred-layer photonic deep learning

Authors

Listed:
  • Tiankuang Zhou

    (Tsinghua University, Department of Electronic Engineering
    Tsinghua University, Beijing National Research Center for Information Science and Technology (BNRIST))

  • Yizhou Jiang

    (Tsinghua University, Department of Electronic Engineering
    Tsinghua University, Beijing National Research Center for Information Science and Technology (BNRIST))

  • Zhihao Xu

    (Tsinghua University, Department of Electronic Engineering
    Tsinghua University, Beijing National Research Center for Information Science and Technology (BNRIST))

  • Zhiwei Xue

    (Tsinghua University, Department of Electronic Engineering
    Tsinghua University, Beijing National Research Center for Information Science and Technology (BNRIST))

  • Lu Fang

    (Tsinghua University, Department of Electronic Engineering
    Tsinghua University, Beijing National Research Center for Information Science and Technology (BNRIST)
    Tsinghua University, Institute for Brain and Cognitive Sciences)

Abstract

In the artificial intelligence era propelled by complex computational models, photonic computing is a promising approach to energy-efficient machine learning; however, error accumulation inherent to its analog nature limits photonic neural networks to a depth of around ten layers, restricting progress towards advanced models such as large language models (LLMs). In this study, we identify that this error accumulation arises from propagation redundancies. By introducing on-chip perturbations that decouple computational correlations, we eliminate the redundancy and realize deep photonic learning with an error-tolerant single-layer photonic computing (SLiM) chip. The SLiM chip overcomes the depth limitations of optical neural networks, keeping error rates bounded across more than 200 layers, and extends spatial depth from the millimeter scale to the hundred-meter scale, enabling a three-dimensional chip cluster. We experimentally constructed a 100-layer neural network for image classification, a 0.345-billion-parameter, 384-layer LLM for text generation, and a 0.192-billion-parameter, 640-layer LLM for image generation, all achieving performance comparable to ideal simulations at a 10-GHz data rate. This error-tolerant single-layer chip paves the way for state-of-the-art deep learning models on efficient analog computing hardware.
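
The abstract's key claim is that decoupling correlated errors is what lets analog depth scale far beyond ten layers. The sketch below is not the paper's on-chip perturbation scheme; it is only a toy numerical illustration in Python, with hypothetical names such as accumulated_error, of why decorrelation helps: a fixed systematic error repeated at every layer accumulates roughly linearly with depth, while a perturbation that changes from layer to layer accumulates only on the order of the square root of depth.

    import numpy as np

    def accumulated_error(depth, correlated, width=64, noise=0.01, seed=0):
        """Toy model of analog error build-up across repeated layers.

        correlated=True  -> the same systematic offset repeats every layer,
                            so the total error grows roughly linearly with depth.
        correlated=False -> a fresh random perturbation decorrelates the error
                            from layer to layer, so it grows only ~ sqrt(depth).
        """
        rng = np.random.default_rng(seed)
        systematic = noise * rng.standard_normal(width)  # fixed per-channel analog offset
        err = np.zeros(width)
        for _ in range(depth):
            err += systematic if correlated else noise * rng.standard_normal(width)
        return np.linalg.norm(err)

    for depth in (10, 100, 200):
        print(f"depth {depth:3d}: "
              f"correlated {accumulated_error(depth, True):.3f}  "
              f"decorrelated {accumulated_error(depth, False):.3f}")

Under these toy assumptions, going from 10 to 200 layers multiplies the correlated error by about 20 but the decorrelated error by only about 4.5 (roughly the square root of 20), which matches the qualitative behaviour the abstract attributes to removing propagation redundancies.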

Suggested Citation

  • Tiankuang Zhou & Yizhou Jiang & Zhihao Xu & Zhiwei Xue & Lu Fang, 2025. "Hundred-layer photonic deep learning," Nature Communications, Nature, vol. 16(1), pages 1-13, December.
  • Handle: RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-65356-0
    DOI: 10.1038/s41467-025-65356-0

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-025-65356-0
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-025-65356-0?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-65356-0. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.