Author
Listed:
- Yannis Assael
(Google DeepMind)
- Thea Sommerschield
(University of Nottingham)
- Alison Cooley
(University of Warwick)
- Brendan Shillingford
(Google DeepMind)
- John Pavlopoulos
(Athens University of Economics and Business)
- Priyanka Suresh
(Google DeepMind)
- Bailey Herms
(Google)
- Justin Grayston
(Google)
- Benjamin Maynard
(Google)
- Nicholas Dietrich
(Google DeepMind)
- Robbe Wulgaert
(Sint-Lievenscollege)
- Jonathan Prag
(University of Oxford)
- Alex Mullen
(University of Nottingham)
- Shakir Mohamed
(Google DeepMind)
Abstract
Human history is born in writing. Inscriptions are among the earliest written forms, and offer direct insights into the thought, language and history of ancient civilizations. Historians capture these insights by identifying parallels—inscriptions with shared phrasing, function or cultural setting—to enable the contextualization of texts within broader historical frameworks, and perform key tasks such as restoration and geographical or chronological attribution. However, current digital methods are restricted to literal matches and narrow historical scopes. Here we introduce Aeneas, a generative neural network for contextualizing ancient texts. Aeneas retrieves textual and contextual parallels, leverages visual inputs, handles arbitrary-length text restoration, and advances the state of the art in key tasks. To evaluate its impact, we conduct a large study with historians using outputs from Aeneas as research starting points. The historians find the parallels retrieved by Aeneas to be useful research starting points in 90% of cases, improving their confidence in key tasks by 44%. Restoration and geographical attribution tasks yielded superior results when historians were paired with Aeneas, outperforming both humans and artificial intelligence alone. For dating, Aeneas achieved a 13-year distance from ground-truth ranges. We demonstrate Aeneas’ contribution to historical workflows through analysis of key traits in the renowned Roman inscription Res Gestae Divi Augusti, showing how integrating science and humanities can create transformative tools to assist historians and advance our understanding of the past.
Suggested Citation
Yannis Assael & Thea Sommerschield & Alison Cooley & Brendan Shillingford & John Pavlopoulos & Priyanka Suresh & Bailey Herms & Justin Grayston & Benjamin Maynard & Nicholas Dietrich & Robbe Wulgaert & Jonathan Prag & Alex Mullen & Shakir Mohamed, 2025.
"Contextualizing ancient texts with generative neural networks,"
Nature, Nature, vol. 645(8079), pages 141-147, September.
Handle:
RePEc:nat:nature:v:645:y:2025:i:8079:d:10.1038_s41586-025-09292-5
DOI: 10.1038/s41586-025-09292-5
Download full text from publisher
As access to this document is restricted, you may want to look for a different version of it.
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:nature:v:645:y:2025:i:8079:d:10.1038_s41586-025-09292-5. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com.
Please note that corrections may take a couple of weeks to filter through the various RePEc services.