Printed from https://ideas.repec.org/a/gam/jeners/v18y2025i9p2286-d1646200.html

Transformers and Long Short-Term Memory Transfer Learning for GenIV Reactor Temperature Time Series Forecasting

Author

Listed:
  • Stella Pantopoulou

    (Nuclear Science and Engineering Division, Argonne National Laboratory, Argonne, IL 60439, USA
    School of Nuclear Engineering, Purdue University, West Lafayette, IN 47906, USA)

  • Anthonie Cilliers

    (Kairos Power, Alameda, CA 94501, USA)

  • Lefteri H. Tsoukalas

    (School of Nuclear Engineering, Purdue University, West Lafayette, IN 47906, USA)

  • Alexander Heifetz

    (Nuclear Science and Engineering Division, Argonne National Laboratory, Argonne, IL 60439, USA)

Abstract

Automated monitoring of the coolant temperature can enable autonomous operation of generation IV (GenIV) reactors, thus reducing their operating and maintenance costs. Automation can be accomplished with machine learning (ML) models trained on historical sensor data. However, ML performance usually depends on the availability of a large amount of training data, which is difficult to obtain for GenIV reactors because this technology is still under development. We propose the use of transfer learning (TL), which transfers knowledge across different domains, to compensate for this lack of training data. TL can be used to create pre-trained ML models with data from small-scale research facilities, which can then be fine-tuned to monitor GenIV reactors. In this work, we develop pre-trained Transformer and long short-term memory (LSTM) networks by training them on temperature measurements from thermal hydraulic flow loops operating with water and Galinstan fluids at room temperature at Argonne National Laboratory. The pre-trained models are then fine-tuned and re-trained with minimal additional data to predict the time series of high-temperature measurements obtained from the Engineering Test Unit (ETU) at Kairos Power. The performance of the LSTM and Transformer networks is investigated by varying the size of the lookback window and the forecast horizon. The results of this study show that the LSTM networks have lower prediction errors than the Transformers, but the LSTM errors increase more rapidly with increasing lookback window size and forecast horizon than the Transformer errors.
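The abstract evaluates the networks over different lookback window sizes and forecast horizons. As a minimal illustration of what those two parameters mean in practice (this is a hypothetical sketch, not the authors' code; the function name `make_windows` and the example values are assumptions), a temperature series can be sliced into supervised (input, target) pairs like so:

```python
# Hypothetical sketch: turning a 1-D temperature series into supervised
# samples for a forecaster, given a lookback window and a forecast horizon.

def make_windows(series, lookback, horizon):
    """Each input is `lookback` consecutive readings; the corresponding
    target is the next `horizon` readings immediately following it."""
    X, y = [], []
    for i in range(len(series) - lookback - horizon + 1):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback:i + lookback + horizon])
    return X, y

# Example: 10 synthetic readings, lookback of 4, horizon of 2
temps = [300 + t for t in range(10)]
X, y = make_windows(temps, lookback=4, horizon=2)
print(len(X))       # -> 5 training samples
print(X[0], y[0])   # -> [300, 301, 302, 303] [304, 305]
```

Larger lookback windows and longer horizons shrink the number of available samples and, as the study reports, affect the two architectures' errors differently.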

Suggested Citation

  • Stella Pantopoulou & Anthonie Cilliers & Lefteri H. Tsoukalas & Alexander Heifetz, 2025. "Transformers and Long Short-Term Memory Transfer Learning for GenIV Reactor Temperature Time Series Forecasting," Energies, MDPI, vol. 18(9), pages 1-18, April.
  • Handle: RePEc:gam:jeners:v:18:y:2025:i:9:p:2286-:d:1646200

    Download full text from publisher

    File URL: https://www.mdpi.com/1996-1073/18/9/2286/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/1996-1073/18/9/2286/
    Download Restriction: no

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jeners:v:18:y:2025:i:9:p:2286-:d:1646200. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.