Printed from https://ideas.repec.org/a/plo/pcbi00/1010219.html

Interpretable pairwise distillations for generative protein sequence models

Author

Listed:
  • Christoph Feinauer
  • Barthelemy Meynard-Piganeau
  • Carlo Lucibello

Abstract

Many different types of generative models for protein sequences have been proposed in the literature. Their uses include the prediction of mutational effects, protein design and the prediction of structural properties. Neural network (NN) architectures have shown strong performance, commonly attributed to their capacity to extract non-trivial higher-order interactions from the data. In this work, we analyze two different NN models and assess how close they are to simple pairwise distributions, which have been used in the past for similar problems. We present an approach for extracting pairwise models from more complex ones using an energy-based modeling framework. We show that, for the tested models, the extracted pairwise models can replicate the energies of the original models and come close in performance on tasks like mutational effect prediction. In addition, we show that even simpler, factorized models often come close in performance to the original models.

Author summary: Complex neural networks trained on large biological datasets have recently shown powerful capabilities in tasks like predicting protein structure, assessing the effect of mutations on protein fitness, and even designing completely novel proteins with desired characteristics. The enthralling prospect of leveraging these advances in fields like medicine and synthetic biology has created a large amount of interest in academic research and industry. The connected question of what biological insights these methods actually gain during training has, however, received less attention. In this work, we systematically investigate to what extent neural networks capture information that could not be captured by simpler models. To this end, we develop a method to train simpler models to imitate more complex ones and compare their performance to that of the original neural network models. Surprisingly, we find that the simpler models thus trained often perform on par with the neural networks while having a considerably simpler structure. This highlights the importance of finding ways to interpret the predictions of neural networks in these fields, which could inform the creation of better models, improve methods for their assessment and ultimately also increase our understanding of the underlying biology.
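The distillation idea described above can be illustrated with a toy sketch: because a pairwise (Potts-like) energy is linear in one-hot outer-product features, fitting a pairwise student to the energies of an arbitrary teacher model reduces to a least-squares regression. The code below is a minimal illustration under assumed conventions, not the authors' implementation; the toy `teacher_energy` (a pairwise term plus a small higher-order nonlinearity) and all variable names are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
L, q, n = 6, 4, 3000                      # sequence length, alphabet size, samples

# --- toy "teacher": pairwise couplings plus a small higher-order term ---
W = rng.normal(size=(L * q, L * q))
v = rng.normal(size=L * q)

def one_hot(s):
    return np.eye(q)[s].ravel()          # (L,) integer sequence -> (L*q,) one-hot

def teacher_energy(s):
    x = one_hot(s)
    return x @ W @ x + 0.5 * np.tanh(x @ v) ** 3   # not exactly pairwise

# --- sample sequences and query the teacher for their energies ---
seqs = rng.integers(0, q, size=(n, L))
E = np.array([teacher_energy(s) for s in seqs])

# --- pairwise student: its energy is LINEAR in outer-product features
#     (fields are absorbed into the diagonal blocks), so distillation
#     on energies is an ordinary least-squares fit ---
X = np.array([np.outer(one_hot(s), one_hot(s)).ravel() for s in seqs])
theta, *_ = np.linalg.lstsq(X, E, rcond=None)

E_student = X @ theta
r = np.corrcoef(E, E_student)[0, 1]
print(f"teacher/student energy correlation: {r:.3f}")
```

Since the teacher here is dominated by its pairwise part, the fitted student reproduces its energies almost exactly; the interesting empirical finding of the paper is that something similar holds for real NN models trained on protein data.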

Suggested Citation

  • Christoph Feinauer & Barthelemy Meynard-Piganeau & Carlo Lucibello, 2022. "Interpretable pairwise distillations for generative protein sequence models," PLOS Computational Biology, Public Library of Science, vol. 18(6), pages 1-20, June.
  • Handle: RePEc:plo:pcbi00:1010219
    DOI: 10.1371/journal.pcbi.1010219

    Download full text from publisher

    File URL: https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1010219
    Download Restriction: no

    File URL: https://journals.plos.org/ploscompbiol/article/file?id=10.1371/journal.pcbi.1010219&type=printable
    Download Restriction: no



    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Kevin E. Wu & Kevin K. Yang & Rianne Berg & Sarah Alamdari & James Y. Zou & Alex X. Lu & Ava P. Amini, 2024. "Protein structure generation via folding diffusion," Nature Communications, Nature, vol. 15(1), pages 1-12, December.
    2. Fatma-Elzahraa Eid & Albert T. Chen & Ken Y. Chan & Qin Huang & Qingxia Zheng & Isabelle G. Tobey & Simon Pacouret & Pamela P. Brauer & Casey Keyes & Megan Powell & Jencilin Johnston & Binhui Zhao & K, 2024. "Systematic multi-trait AAV capsid engineering for efficient gene delivery," Nature Communications, Nature, vol. 15(1), pages 1-14, December.
    3. Chu, Xiaolei & Wang, Ziqi, 2025. "Maximum entropy-based modeling of community-level hazard responses for civil infrastructures," Reliability Engineering and System Safety, Elsevier, vol. 254(PA).
    4. Mireia Seuma & Ben Lehner & Benedetta Bolognesi, 2022. "An atlas of amyloid aggregation: the impact of substitutions, insertions, deletions and truncations on amyloid beta fibril nucleation," Nature Communications, Nature, vol. 13(1), pages 1-13, December.
    5. Andrew F Neuwald & Stephen F Altschul, 2016. "Inference of Functionally-Relevant N-acetyltransferase Residues Based on Statistical Correlations," PLOS Computational Biology, Public Library of Science, vol. 12(12), pages 1-30, December.
    6. Ziyi Zhou & Liang Zhang & Yuanxi Yu & Banghao Wu & Mingchen Li & Liang Hong & Pan Tan, 2024. "Enhancing efficiency of protein language models with minimal wet-lab data through few-shot learning," Nature Communications, Nature, vol. 15(1), pages 1-13, December.
    7. Md Tauhidul Islam & Lei Xing, 2023. "Cartography of Genomic Interactions Enables Deep Analysis of Single-Cell Expression Data," Nature Communications, Nature, vol. 14(1), pages 1-17, December.
    8. Anand Ramachandran & Steven S Lumetta & Deming Chen, 2024. "PandoGen: Generating complete instances of future SARS-CoV-2 sequences using Deep Learning," PLOS Computational Biology, Public Library of Science, vol. 20(1), pages 1-31, January.
    9. Jeffrey A. Ruffolo & Lee-Shin Chu & Sai Pooja Mahajan & Jeffrey J. Gray, 2023. "Fast, accurate antibody structure prediction from deep learning on massive set of natural antibodies," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    10. Lin Li & Esther Gupta & John Spaeth & Leslie Shing & Rafael Jaimes & Emily Engelhart & Randolph Lopez & Rajmonda S. Caceres & Tristan Bepler & Matthew E. Walsh, 2023. "Machine learning optimization of candidate antibody yields highly diverse sub-nanomolar affinity antibody libraries," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    11. Evgenii Lobzaev & Michael A. Herrera & Martyna Kasprzyk & Giovanni Stracquadanio, 2024. "Protein engineering using variational free energy approximation," Nature Communications, Nature, vol. 15(1), pages 1-11, December.
    12. Evgenii Lobzaev & Giovanni Stracquadanio, 2024. "Dirichlet latent modelling enables effective learning and sampling of the functional protein design space," Nature Communications, Nature, vol. 15(1), pages 1-11, December.
    13. Erik Aurell, 2016. "The Maximum Entropy Fallacy Redux?," PLOS Computational Biology, Public Library of Science, vol. 12(5), pages 1-7, May.
    14. Karol Buda & Charlotte M. Miton & Nobuhiko Tokuriki, 2023. "Pervasive epistasis exposes intramolecular networks in adaptive enzyme evolution," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    15. Nicki Skafte Detlefsen & Søren Hauberg & Wouter Boomsma, 2022. "Learning meaningful representations of protein sequences," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    16. Emily K. Makowski & Patrick C. Kinnunen & Jie Huang & Lina Wu & Matthew D. Smith & Tiexin Wang & Alec A. Desai & Craig N. Streu & Yulei Zhang & Jennifer M. Zupancic & John S. Schardt & Jennifer J. Lin, 2022. "Co-optimization of therapeutic antibody affinity and specificity using machine learning models that generalize to novel mutational space," Nature Communications, Nature, vol. 13(1), pages 1-14, December.
    17. Haohuai He & Bing He & Lei Guan & Yu Zhao & Feng Jiang & Guanxing Chen & Qingge Zhu & Calvin Yu-Chian Chen & Ting Li & Jianhua Yao, 2024. "De novo generation of SARS-CoV-2 antibody CDRH3 with a pre-trained generative large language model," Nature Communications, Nature, vol. 15(1), pages 1-19, December.
    18. Erika Erickson & Japheth E. Gado & Luisana Avilán & Felicia Bratti & Richard K. Brizendine & Paul A. Cox & Raj Gill & Rosie Graham & Dong-Jin Kim & Gerhard König & William E. Michener & Saroj Poudel &, 2022. "Sourcing thermotolerant poly(ethylene terephthalate) hydrolase scaffolds from natural diversity," Nature Communications, Nature, vol. 13(1), pages 1-15, December.
    19. Jonathan Parkinson & Ryan Hard & Wei Wang, 2023. "The RESP AI model accelerates the identification of tight-binding antibodies," Nature Communications, Nature, vol. 14(1), pages 1-18, December.
    20. Wenkang Wang & Yunyan Shuai & Min Zeng & Wei Fan & Min Li, 2025. "DPFunc: accurately predicting protein function via deep learning with domain-guided structure information," Nature Communications, Nature, vol. 16(1), pages 1-13, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pcbi00:1010219. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: ploscompbiol (email available below). General contact details of provider: https://journals.plos.org/ploscompbiol/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.