Printed from https://ideas.repec.org/a/eee/appene/v379y2025ics0306261924022141.html

Enhancing cyber-resilience in integrated energy system scheduling with demand response using deep reinforcement learning

Author

Listed:
  • Li, Yang
  • Ma, Wenjie
  • Li, Yuanzheng
  • Li, Sen
  • Chen, Zhe
  • Shahidehpour, Mohammad

Abstract

Optimally scheduling multi-energy flow is an effective way to utilize renewable energy sources (RES) and improve the stability and economy of integrated energy systems (IES). However, the stable demand-supply balance of IES is challenged by uncertainties arising from RES and loads, as well as by the growing impact of cyber-attacks that accompanies the adoption of advanced information and communication technologies. To address these challenges, this paper proposes an innovative model-free resilient scheduling method based on state-adversarial deep reinforcement learning (DRL) for integrated demand response (IDR)-enabled IES. The proposed method designs an IDR program to exploit the interaction capability of electricity-gas-heat flexible loads. Additionally, a state-adversarial Markov decision process (SA-MDP) model characterizes the energy scheduling problem of IES under cyber-attack, incorporating cyber-attacks as adversaries directly into the scheduling process. A state-adversarial soft actor-critic (SA-SAC) algorithm is proposed to mitigate the impact of cyber-attacks on the scheduling strategy, integrating adversarial training into the learning process to defend against cyber-attacks. Simulation results demonstrate that our method adequately addresses the uncertainties resulting from RES and loads, mitigates the impact of cyber-attacks on the scheduling strategy, and ensures a stable demand supply across energy sources. Moreover, the proposed method demonstrates resilience against cyber-attacks: compared to the original soft actor-critic (SAC) algorithm, it achieves a 10% improvement in economic performance under cyber-attack scenarios.
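As a rough illustration of the state-adversarial idea described in the abstract (a minimal sketch with hypothetical names and a toy value function, not the authors' SA-SAC implementation): in an SA-MDP, an adversary perturbs the agent's observed state within a bounded set, typically an l-infinity ball of radius eps, to degrade the value the agent can achieve, and the agent is trained against that worst-case observation.

```python
import numpy as np

def value(state, w):
    """Stand-in critic: a fixed quadratic value function (toy assumption)."""
    return -np.sum(w * state ** 2)

def worst_case_state(state, w, eps, iters=20, lr=0.05):
    """Adversary from the SA-MDP formulation (sketch): projected
    sign-gradient descent on the critic's value with respect to the
    observation, constrained to the l-infinity ball of radius eps."""
    s_adv = state.copy()
    for _ in range(iters):
        grad = -2.0 * w * s_adv                       # analytic gradient of value()
        s_adv = s_adv - lr * np.sign(grad) * eps      # steepest-descent step
        s_adv = np.clip(s_adv, state - eps, state + eps)  # project back into the ball
    return s_adv

state = np.array([0.5, -0.3])   # true observation (toy)
w = np.array([1.0, 2.0])        # critic weights (toy)
attacked = worst_case_state(state, w, eps=0.1)
```

Adversarial training in this spirit then optimizes the policy against `attacked` rather than `state`, so the learned scheduling strategy remains useful when observations are falsified by a cyber-attack. All names and the quadratic critic here are illustrative assumptions.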

Suggested Citation

  • Li, Yang & Ma, Wenjie & Li, Yuanzheng & Li, Sen & Chen, Zhe & Shahidehpour, Mohammad, 2025. "Enhancing cyber-resilience in integrated energy system scheduling with demand response using deep reinforcement learning," Applied Energy, Elsevier, vol. 379(C).
  • Handle: RePEc:eee:appene:v:379:y:2025:i:c:s0306261924022141
    DOI: 10.1016/j.apenergy.2024.124831

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261924022141
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2024.124831?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Hua, Haochen & Qin, Yuchao & Hao, Chuantong & Cao, Junwei, 2019. "Optimal energy management strategies for energy Internet via deep reinforcement learning approach," Applied Energy, Elsevier, vol. 239(C), pages 598-609.
    2. Liu, Liu & Wang, Dan & Hou, Kai & Jia, Hong-jie & Li, Si-yuan, 2020. "Region model and application of regional integrated energy system security analysis," Applied Energy, Elsevier, vol. 260(C).
    3. Li, Yang & Bu, Fanjin & Li, Yuanzheng & Long, Chao, 2023. "Optimal scheduling of island integrated energy systems considering multi-uncertainties and hydrothermal simultaneous transmission: A deep reinforcement learning approach," Applied Energy, Elsevier, vol. 333(C).
    4. Mao, Ning & Hao, Jingyu & He, Tianbiao & Song, Mengjie & Xu, Yingjie & Deng, Shiming, 2019. "PMV-based dynamic optimization of energy consumption for a residential task/ambient air conditioning system in different climate zones," Renewable Energy, Elsevier, vol. 142(C), pages 41-54.
    5. Ding, Shixing & Gu, Wei & Lu, Shuai & Yu, Ruizhi & Sheng, Lina, 2022. "Cyber-attack against heating system in integrated energy systems: Model and propagation mechanism," Applied Energy, Elsevier, vol. 311(C).
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Li, Zhongping & Xiang, Yue & Liu, Junyong, 2025. "Forecasting error-aware optimal dispatch of wind-storage integrated power systems: A soft-actor-critic deep reinforcement learning approach," Energy, Elsevier, vol. 318(C).
    2. Dongnyok Shim, 2025. "Quantifying Social Benefits of Virtual Power Plants (VPPs) in South Korea: Contingent Valuation Method," Energies, MDPI, vol. 18(12), pages 1-16, June.
    3. Ramul, Ali Rashid & Shahraki, Atefeh Salimi & Bachache, Nasseer K. & Sadeghi, Ramtin, 2025. "Cyberspace enhancement of electric vehicle charging stations in smart grids based on detection and resilience measures against hybrid cyberattacks: A multi-agent deep reinforcement learning approach," Energy, Elsevier, vol. 325(C).
    4. Li, Yang & Zhang, Shitu & Li, Yuanzheng, 2025. "AI-enhanced resilience in power systems: Adversarial deep learning for robust short-term voltage stability assessment under cyber-attacks," Chaos, Solitons & Fractals, Elsevier, vol. 196(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wen Fan & Qing Liu & Mingyu Wang, 2021. "Bi-Level Multi-Objective Optimization Scheduling for Regional Integrated Energy Systems Based on Quantum Evolutionary Algorithm," Energies, MDPI, vol. 14(16), pages 1-15, August.
    2. Yin, Linfei & Xiong, Yi, 2024. "Incremental learning user profile and deep reinforcement learning for managing building energy in heating water," Energy, Elsevier, vol. 313(C).
    3. Zhu, Jiaoyiling & Hu, Weihao & Xu, Xiao & Liu, Haoming & Pan, Li & Fan, Haoyang & Zhang, Zhenyuan & Chen, Zhe, 2022. "Optimal scheduling of a wind energy dominated distribution network via a deep reinforcement learning approach," Renewable Energy, Elsevier, vol. 201(P1), pages 792-801.
    4. Gao, Xianhui & Wang, Sheng & Sun, Ying & Zhai, Junyi & Chen, Nan & Zhang, Xiao-Ping, 2024. "Low-carbon energy scheduling for integrated energy systems considering offshore wind power hydrogen production and dynamic hydrogen doping strategy," Applied Energy, Elsevier, vol. 376(PA).
    5. Li, Yang & Wang, Bin & Yang, Zhen & Li, Jiazheng & Chen, Chen, 2022. "Hierarchical stochastic scheduling of multi-community integrated energy systems in uncertain environments via Stackelberg game," Applied Energy, Elsevier, vol. 308(C).
    6. Zhang, Yijie & Ma, Tao & Elia Campana, Pietro & Yamaguchi, Yohei & Dai, Yanjun, 2020. "A techno-economic sizing method for grid-connected household photovoltaic battery systems," Applied Energy, Elsevier, vol. 269(C).
    7. Ahmad, Tanveer & Chen, Huanxin, 2019. "Deep learning for multi-scale smart energy forecasting," Energy, Elsevier, vol. 175(C), pages 98-112.
    8. Zeyue Sun & Mohsen Eskandari & Chaoran Zheng & Ming Li, 2022. "Handling Computation Hardness and Time Complexity Issue of Battery Energy Storage Scheduling in Microgrids by Deep Reinforcement Learning," Energies, MDPI, vol. 16(1), pages 1-20, December.
    9. Kandasamy, Jeevitha & Ramachandran, Rajeswari & Veerasamy, Veerapandiyan & Irudayaraj, Andrew Xavier Raj, 2024. "Distributed leader-follower based adaptive consensus control for networked microgrids," Applied Energy, Elsevier, vol. 353(PA).
    10. Zhang, Chenwei & Wang, Ying & Zheng, Tao & Zhang, Kaifeng, 2024. "Complex network theory-based optimization for enhancing resilience of large-scale multi-energy system," Applied Energy, Elsevier, vol. 370(C).
    11. Fathy, Ahmed, 2023. "Bald eagle search optimizer-based energy management strategy for microgrid with renewable sources and electric vehicles," Applied Energy, Elsevier, vol. 334(C).
    12. Wu, Yuankai & Tan, Huachun & Peng, Jiankun & Zhang, Hailong & He, Hongwen, 2019. "Deep reinforcement learning of energy management with continuous control strategy and traffic information for a series-parallel plug-in hybrid electric bus," Applied Energy, Elsevier, vol. 247(C), pages 454-466.
    13. Buyak, Nadia & Deshko, Valeriy & Bilous, Inna & Pavlenko, Anatoliy & Sapunov, Anatoliy & Biriukov, Dmytro, 2023. "Dynamic interdependence of comfortable thermal conditions and energy efficiency increase in a nursery school building for heating and cooling period," Energy, Elsevier, vol. 283(C).
    14. Chen, Dongyu & Sun, Qun Zhou & Qiao, Yiyuan, 2025. "Defending against cyber-attacks in building HVAC systems through energy performance evaluation using a physics-informed dynamic Bayesian network (PIDBN)," Energy, Elsevier, vol. 322(C).
    15. Zhu, Ziqing & Hu, Ze & Chan, Ka Wing & Bu, Siqi & Zhou, Bin & Xia, Shiwei, 2023. "Reinforcement learning in deregulated energy market: A comprehensive review," Applied Energy, Elsevier, vol. 329(C).
    16. Tahir, Muhammad Faizan & Haoyong, Chen & Guangze, Han, 2022. "Evaluating individual heating alternatives in integrated energy system by employing energy and exergy analysis," Energy, Elsevier, vol. 249(C).
    17. Qi, Chunyang & Zhu, Yiwen & Song, Chuanxue & Yan, Guangfu & Xiao, Feng & Wang, Da & Zhang, Xu & Cao, Jingwei & Song, Shixin, 2022. "Hierarchical reinforcement learning based energy management strategy for hybrid electric vehicle," Energy, Elsevier, vol. 238(PA).
    18. Ding, Shixing & Gu, Wei & Lu, Shuai & Yu, Ruizhi & Sheng, Lina, 2022. "Cyber-attack against heating system in integrated energy systems: Model and propagation mechanism," Applied Energy, Elsevier, vol. 311(C).
    19. Akhil Joseph & Patil Balachandra, 2020. "Energy Internet, the Future Electricity System: Overview, Concept, Model Structure, and Mechanism," Energies, MDPI, vol. 13(16), pages 1-26, August.
    20. Xu, Xuesong & Xu, Kai & Zeng, Ziyang & Tang, Jiale & He, Yuanxing & Shi, Guangze & Zhang, Tao, 2024. "Collaborative optimization of multi-energy multi-microgrid system: A hierarchical trust-region multi-agent reinforcement learning approach," Applied Energy, Elsevier, vol. 375(C).

    More about this item


    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:379:y:2025:i:c:s0306261924022141. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.