Printed from https://ideas.repec.org/a/eee/appene/v402y2026ipbs0306261925016368.html

Unified system intelligence: Learning energy strategies for optimizing operations, maintenance, and market outcomes

Authors

  • Pylorof, Dimitrios
  • Garcia, Humberto E.

Abstract

In view of the multitude of efforts to realize technologically and economically viable, long-term sustainable energy technologies and solutions, we develop a Reinforcement Learning (RL) approach to intelligent energy market bidding and broader plant operations and maintenance (O&M). Our approach is cognizant of the current and future performance, maintenance, and economic aspects of the supervised system and its operational environment. Regardless of their generation modality (e.g., fossil, nuclear), the relatively centralized systems that will complement distributed renewable energy in contemporary power grids will not only be subject to the complexity and the typical operational, maintenance, and economic nuances of any complex production facility, but will also be based on comparatively new technologies that have yet to prove competitive in increasingly tight markets. The manner in which a production facility is run can have a profound effect on its operational and maintenance costs, while the multi-party bidding and market dynamics of contemporary energy markets largely dictate the operational envelopes and accrued costs for the entire facility. Our methodology establishes a long-horizon-aware intelligent feedback loop that bids strategically in day-ahead energy markets and supervises other operations and maintenance aspects (e.g., maintenance action selection and scheduling, slowdown or downtime considerations) in a way that maximizes plant profitability by increasing revenues while controlling and distributing maintenance costs. The foundations of our approach rest not only on RL techniques that periodically (re-)construct stochastic, strongly coupled bidding and plant supervision policies with receding reasoning horizons, but also on operationally-leaning learning and inference RL workflows.
In addition to establishing the interfaces and mechanics of our RL agent, we prototype key aspects of the underlying techno-economic environment, along with the relevant algorithmic and numerical tools and approximations that enable the sought-after reasoning. In contrast to isolated bidding algorithms that operate under the premise of statistically or offline-computed marginal costs disconnected from actual, day-to-day operations, our techniques learn to address the coupled problem of bidding and plant supervision holistically, within any particular operational environment defined by the local energy grid and market, and in view of their probabilistic behavior and future evolution.

Suggested Citation

  • Pylorof, Dimitrios & Garcia, Humberto E., 2026. "Unified system intelligence: Learning energy strategies for optimizing operations, maintenance, and market outcomes," Applied Energy, Elsevier, vol. 402(PB).
  • Handle: RePEc:eee:appene:v:402:y:2026:i:pb:s0306261925016368
    DOI: 10.1016/j.apenergy.2025.126906

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261925016368
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2025.126906?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zhang, Qin & Liu, Yu & Xiang, Yisha & Xiahou, Tangfan, 2024. "Reinforcement learning in reliability and maintenance optimization: A tutorial," Reliability Engineering and System Safety, Elsevier, vol. 251(C).
    2. Østergaard, P.A. & Lund, H. & Thellufsen, J.Z. & Sorknæs, P. & Mathiesen, B.V., 2022. "Review and validation of EnergyPLAN," Renewable and Sustainable Energy Reviews, Elsevier, vol. 168(C).
    3. Mazaher Haji Bashi & Gholamreza Yousefi & Claus Leth Bak & Jayakrishnan Radhakrishna Pillai, 2016. "Long Term Expected Revenue of Wind Farms Considering the Bidding Admission Uncertainty," Energies, MDPI, vol. 9(11), pages 1-17, November.
    4. Rancilio, G. & Rossi, A. & Falabretti, D. & Galliani, A. & Merlo, M., 2022. "Ancillary services markets in europe: Evolution and regulatory trade-offs," Renewable and Sustainable Energy Reviews, Elsevier, vol. 154(C).
    5. Zhang, Huixian & Wei, Xiukun & Liu, Zhiqiang & Ding, Yaning & Guan, Qingluan, 2025. "Condition-based maintenance for multi-state systems with prognostic and deep reinforcement learning," Reliability Engineering and System Safety, Elsevier, vol. 255(C).
    6. Østergaard, Poul Alberg & Andersen, Anders N., 2018. "Economic feasibility of booster heat pumps in heat pump-based district heating systems," Energy, Elsevier, vol. 155(C), pages 921-929.
    7. Ma, Tao & Østergaard, Poul Alberg & Lund, Henrik & Yang, Hongxing & Lu, Lin, 2014. "An energy system model for Hong Kong in 2020," Energy, Elsevier, vol. 68(C), pages 301-310.
    8. Mathiesen, B.V. & Lund, H. & Connolly, D. & Wenzel, H. & Østergaard, P.A. & Möller, B. & Nielsen, S. & Ridjan, I. & Karnøe, P. & Sperling, K. & Hvelplund, F.K., 2015. "Smart Energy Systems for coherent 100% renewable energy and transport solutions," Applied Energy, Elsevier, vol. 145(C), pages 139-154.
    9. Esmaeili Aliabadi, Danial & Wulff, Niklas & Lehneis, Reinhold & Sadr, Mohammad & Gutjahr, Sandra & Reutter, Felix Jonas & Jordan, Matthias & Lehmann, Paul & Thrän, Daniela, 2025. "Climate change may impair the transition to a fully renewable energy system: A German case study," Energy, Elsevier, vol. 338(C).
    10. Ribeiro, Vitor Miguel & Soares, Isabel, 2025. "A primer on verti-zontally differentiated peer-to-peer energy intraday trading platforms with and without customization," Energy, Elsevier, vol. 320(C).
    11. Tseremoglou, Iordanis & Santos, Bruno F., 2024. "Condition-Based Maintenance scheduling of an aircraft fleet under partial observability: A Deep Reinforcement Learning approach," Reliability Engineering and System Safety, Elsevier, vol. 241(C).
    12. Blumberga, Dagnija & Blumberga, Andra & Barisa, Aiga & Rosa, Marika & Lauka, Dace, 2016. "Modelling the Latvian power market to evaluate its environmental long-term performance," Applied Energy, Elsevier, vol. 162(C), pages 1593-1600.
    13. Abdulla, Hind & Sleptchenko, Andrei & Nayfeh, Ammar, 2024. "Photovoltaic systems operation and maintenance: A review and future directions," Renewable and Sustainable Energy Reviews, Elsevier, vol. 195(C).
    14. Zheng, Yi & Wang, Jian & Wang, Chengmin & Huang, Chunyi & Yang, Jingfei & Xie, Ning, 2025. "Strategic bidding of wind farms in medium-to-long-term rolling transactions: A bi-level multi-agent deep reinforcement learning approach," Applied Energy, Elsevier, vol. 383(C).
    15. Zhao, Yunfei & Vaddi, Pavan Kumar & Pietrykowski, Michael & Khafizov, Marat & Smidts, Carol, 2023. "An empirical study of the added value of the sequential learning of model parameters to industrial system health monitoring," Reliability Engineering and System Safety, Elsevier, vol. 240(C).
    16. Mikhail, Mina & Ouali, Mohamed-Salah & Yacout, Soumaya, 2024. "A data-driven methodology with a nonparametric reliability method for optimal condition-based maintenance strategies," Reliability Engineering and System Safety, Elsevier, vol. 241(C).
    17. Pinciroli, Luca & Baraldi, Piero & Zio, Enrico, 2023. "Maintenance optimization in industry 4.0," Reliability Engineering and System Safety, Elsevier, vol. 234(C).
    18. He, Rui & Tian, Zhigang & Wang, Yifei & Zuo, Mingjian & Guo, Ziwei, 2023. "Condition-based maintenance optimization for multi-component systems considering prognostic information and degraded working efficiency," Reliability Engineering and System Safety, Elsevier, vol. 234(C).
    19. Hao, Zhaojun & Di Maio, Francesco & Zio, Enrico, 2023. "A sequential decision problem formulation and deep reinforcement learning solution of the optimization of O&M of cyber-physical energy systems (CPESs) for reliable and safe power production and supply," Reliability Engineering and System Safety, Elsevier, vol. 235(C).
    20. Hendradewa, Andrie Pasca & Yin, Shen, 2025. "Comparative analysis of offshore wind turbine blade maintenance: RL-based and classical strategies for sustainable approach," Reliability Engineering and System Safety, Elsevier, vol. 253(C).


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:402:y:2026:i:pb:s0306261925016368. See the general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.