
An end-to-end attention-based approach for learning on graphs

Authors

Listed:
  • David Buterez (University of Cambridge)
  • Jon Paul Janet (BioPharmaceuticals R&D, AstraZeneca)
  • Dino Oglic (BioPharmaceuticals R&D, AstraZeneca)
  • Pietro Liò (University of Cambridge)

Abstract

There has been a recent surge in transformer-based architectures for learning on graphs, mainly motivated by attention as an effective learning mechanism and the desire to supersede the hand-crafted operators characteristic of message passing schemes. However, concerns have been raised over their empirical effectiveness, scalability, and the complexity of their pre-processing steps, especially in relation to much simpler graph neural networks that typically perform on par with them across a wide range of benchmarks. To address these shortcomings, we consider graphs as sets of edges and propose a purely attention-based approach consisting of an encoder and an attention pooling mechanism. The encoder vertically interleaves masked and vanilla self-attention modules to learn an effective representation of edges while allowing the model to handle possible misspecifications in input graphs. Despite its simplicity, the approach outperforms fine-tuned message passing baselines and recently proposed transformer-based methods on more than 70 node and graph-level tasks, including challenging long-range benchmarks. Moreover, we demonstrate state-of-the-art performance across different tasks, ranging from molecular to vision graphs, and heterophilous node classification. The approach also outperforms graph neural networks and transformers in transfer learning settings and scales much better than alternatives with a similar performance level or expressive power.
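
The following is a minimal, hypothetical PyTorch sketch of the kind of architecture the abstract describes: edge tokens processed by interleaved masked and vanilla self-attention layers, followed by attention pooling with a learned query. It is not the authors' implementation; the module names, layer count, and the particular masking scheme (in the masked layers, edges attend only to edges sharing an endpoint) are assumptions made here for illustration.

import torch
import torch.nn as nn


class InterleavedEdgeEncoder(nn.Module):
    """Alternates masked self-attention (restricted to edges that share a
    node) with vanilla self-attention over the full edge set."""

    def __init__(self, dim: int, heads: int = 4, layers: int = 4):
        super().__init__()
        self.blocks = nn.ModuleList(
            nn.TransformerEncoderLayer(dim, heads, dim * 2, batch_first=True)
            for _ in range(layers)
        )

    def forward(self, edge_feats: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # edge_feats: (1, E, dim) edge tokens; edge_index: (2, E) node ids.
        src, dst = edge_index
        # Boolean mask blocking attention between edges with no common node
        # (an assumed masking scheme, used only to illustrate "masked"
        # versus "vanilla" layers).
        share = (
            (src[:, None] == src[None, :]) | (src[:, None] == dst[None, :])
            | (dst[:, None] == src[None, :]) | (dst[:, None] == dst[None, :])
        )
        local_mask = ~share  # True = attention blocked
        h = edge_feats
        for i, block in enumerate(self.blocks):
            # Even layers use the graph-aware mask; odd layers attend globally.
            h = block(h, src_mask=local_mask if i % 2 == 0 else None)
        return h


class AttentionPooling(nn.Module):
    """Learned-query attention pooling from a set of edge embeddings to a
    single graph-level embedding."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.query = nn.Parameter(torch.randn(1, 1, dim))
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        q = self.query.expand(h.size(0), -1, -1)
        pooled, _ = self.attn(q, h, h)
        return pooled.squeeze(1)


# Tiny usage example on a random 4-node graph with 6 edges.
if __name__ == "__main__":
    dim, num_edges = 32, 6
    edge_feats = torch.randn(1, num_edges, dim)
    edge_index = torch.randint(0, 4, (2, num_edges))
    encoder, pool = InterleavedEdgeEncoder(dim), AttentionPooling(dim)
    graph_embedding = pool(encoder(edge_feats, edge_index))
    print(graph_embedding.shape)  # torch.Size([1, 32])

Using a single learned query for the readout keeps the pooling permutation-invariant over the edge set, which is consistent with the set-of-edges view taken in the abstract; the actual masking, depth, and pooling details are those given in the paper.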

Suggested Citation

  • David Buterez & Jon Paul Janet & Dino Oglic & Pietro Liò, 2025. "An end-to-end attention-based approach for learning on graphs," Nature Communications, Nature, vol. 16(1), pages 1-16, December.
  • Handle: RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-60252-z
    DOI: 10.1038/s41467-025-60252-z

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-025-60252-z
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-025-60252-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source where you can use your library subscription to access this item

    References listed on IDEAS

    1. Shuqi Lu & Zhifeng Gao & Di He & Linfeng Zhang & Guolin Ke, 2024. "Data-driven quantum chemical property prediction leveraging 3D conformations with Uni-Mol+," Nature Communications, Nature, vol. 15(1), pages 1-11, December.
    2. Amil Merchant & Simon Batzner & Samuel S. Schoenholz & Muratahan Aykol & Gowoon Cheon & Ekin Dogus Cubuk, 2023. "Scaling deep learning for materials discovery," Nature, Nature, vol. 624(7990), pages 80-85, December.
    3. David Buterez & Jon Paul Janet & Steven J. Kiddle & Dino Oglic & Pietro Liò, 2024. "Transfer learning with graph neural networks for improved molecular property prediction in the multi-fidelity setting," Nature Communications, Nature, vol. 15(1), pages 1-18, December.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Gaétan de Rassenfosse & Adam B. Jaffe & Joel Waldfogel, 2025. "Intellectual Property and Creative Machines," Entrepreneurship and Innovation Policy and the Economy, University of Chicago Press, vol. 4(1), pages 47-79.
    2. Wang, Zixuan & Chen, Zijian & Wang, Boyuan & Wu, Chuang & Zhou, Chao & Peng, Yang & Zhang, Xinyu & Ni, Zongming & Chung, Chi-yung & Chan, Ching-chuen & Yang, Jian & Zhao, Haitao, 2025. "Digital manufacturing of perovskite materials and solar cells," Applied Energy, Elsevier, vol. 377(PB).
    3. Keke Song & Rui Zhao & Jiahui Liu & Yanzhou Wang & Eric Lindgren & Yong Wang & Shunda Chen & Ke Xu & Ting Liang & Penghua Ying & Nan Xu & Zhiqiang Zhao & Jiuyang Shi & Junjie Wang & Shuang Lyu & Zezhu, 2024. "General-purpose machine-learned potential for 16 elemental metals and their alloys," Nature Communications, Nature, vol. 15(1), pages 1-15, December.
    4. Daniel Schwalbe-Koda & Sebastien Hamel & Babak Sadigh & Fei Zhou & Vincenzo Lordi, 2025. "Model-free estimation of completeness, uncertainties, and outliers in atomistic machine learning using information theory," Nature Communications, Nature, vol. 16(1), pages 1-13, December.
    5. Luis M. Antunes & Keith T. Butler & Ricardo Grau-Crespo, 2024. "Crystal structure generation with autoregressive large language modeling," Nature Communications, Nature, vol. 15(1), pages 1-16, December.
    6. Jingbo Liu & Fan Jiang & Shinichi Tashiro & Shujun Chen & Manabu Tanaka, 2025. "A physics-informed and data-driven framework for robotic welding in manufacturing," Nature Communications, Nature, vol. 16(1), pages 1-18, December.
    7. Grigorii Skorupskii & Fabio Orlandi & Iñigo Robredo & Milena Jovanovic & Rinsuke Yamada & Fatmagül Katmer & Maia G. Vergniory & Pascal Manuel & Max Hirschberger & Leslie M. Schoop, 2024. "Designing giant Hall response in layered topological semimetals," Nature Communications, Nature, vol. 15(1), pages 1-11, December.
    8. David Buterez & Jon Paul Janet & Steven J. Kiddle & Dino Oglic & Pietro Liò, 2024. "Transfer learning with graph neural networks for improved molecular property prediction in the multi-fidelity setting," Nature Communications, Nature, vol. 15(1), pages 1-18, December.
    9. Jianbo Qiao & Junru Jin & Ding Wang & Saisai Teng & Junyu Zhang & Xuetong Yang & Yuhang Liu & Yu Wang & Lizhen Cui & Quan Zou & Ran Su & Leyi Wei, 2025. "A self-conformation-aware pre-training framework for molecular property prediction with substructure interpretability," Nature Communications, Nature, vol. 16(1), pages 1-16, December.
    10. Junwu Chen & Xu Huang & Cheng Hua & Yulian He & Philippe Schwaller, 2025. "A multi-modal transformer for predicting global minimum adsorption energy," Nature Communications, Nature, vol. 16(1), pages 1-12, December.
    11. Ziduo Yang & Yi-Ming Zhao & Xian Wang & Xiaoqing Liu & Xiuying Zhang & Yifan Li & Qiujie Lv & Calvin Yu-Chian Chen & Lei Shen, 2024. "Scalable crystal structure relaxation using an iteration-free deep generative model with uncertainty quantification," Nature Communications, Nature, vol. 15(1), pages 1-15, December.
    12. Chang Jiang & Hongyuan He & Hongquan Guo & Xiaoxin Zhang & Qingyang Han & Yanhong Weng & Xianzhu Fu & Yinlong Zhu & Ning Yan & Xin Tu & Yifei Sun, 2024. "Transfer learning guided discovery of efficient perovskite oxide for alkaline water oxidation," Nature Communications, Nature, vol. 15(1), pages 1-15, December.
    13. Petersen, Alexander Michael & Arroyave, Felber J. & Pammolli, Fabio, 2025. "The disruption index suffers from citation inflation: Re-analysis of temporal CD trend and relationship with team size reveal discrepancies," Journal of Informetrics, Elsevier, vol. 19(1).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-60252-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.