Hybrid multi-agent deep reinforcement learning for multi-type mobile resources dispatching under transportation and power network recovery
Suggested Citation
DOI: 10.1016/j.apenergy.2025.126423
As access to this document is restricted, you may want to search for a different version of it.
References listed on IDEAS
- Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
- Wang, Yi & Qiu, Dawei & Strbac, Goran, 2022. "Multi-agent deep reinforcement learning for resilience-driven routing and scheduling of mobile energy storage systems," Applied Energy, Elsevier, vol. 310(C).
- Du, Ying & Zhang, Junxiang & Chen, Yuntian & Liu, Zhengguang & Zhang, Haoran & Ji, Haoran & Wang, Chengshan & Yan, Jinyue, 2025. "Impact of electric vehicles on post-disaster power supply restoration of urban distribution systems," Applied Energy, Elsevier, vol. 383(C).
- Zhang, Wangxin & Han, Qiang & Shang, Wen-Long & Xu, Chengshun, 2024. "Seismic resilience assessment of interdependent urban transportation-electric power system under uncertainty," Transportation Research Part A: Policy and Practice, Elsevier, vol. 183(C).
- Shang, Yuwei & Wu, Wenchuan & Guo, Jianbo & Ma, Zhao & Sheng, Wanxing & Lv, Zhe & Fu, Chenran, 2020. "Stochastic dispatch of energy storage in microgrids: An augmented reinforcement learning approach," Applied Energy, Elsevier, vol. 261(C).
- Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
- Sun, Shaohua & Li, Gengfeng & Yang, Qiming & Bie, Zhaohong, 2024. "Co-optimize recovery modeling for transportation and power network with multi-type mobile resources dispatching," Applied Energy, Elsevier, vol. 366(C).
- Alqahtani, Mohammed & Hu, Mengqi, 2022. "Dynamic energy scheduling and routing of multiple electric vehicles using deep reinforcement learning," Energy, Elsevier, vol. 244(PA).
- Qiu, Dawei & Wang, Yi & Zhang, Tingqi & Sun, Mingyang & Strbac, Goran, 2023. "Hierarchical multi-agent reinforcement learning for repair crews dispatch control towards multi-energy microgrid resilience," Applied Energy, Elsevier, vol. 336(C).
- Juan P. Montoya-Rincon & Said A. Mejia-Manrique & Shams Azad & Masoud Ghandehari & Eric W. Harmsen & Reza Khanbilvardi & Jorge E. Gonzalez-Cruz, 2023. "A socio-technical approach for the assessment of critical infrastructure system vulnerability in extreme weather events," Nature Energy, Nature, vol. 8(9), pages 1002-1012, September.
- Zhong, Jian & Chen, Chen & Zhang, Haochen & Shen, Wentao & Fan, Zhong & Qiu, Dawei & Bie, Zhaohong, 2025. "Resilient mobile energy storage resources-based microgrid formation considering power-transportation-information network interdependencies," Applied Energy, Elsevier, vol. 389(C).
Most related items
These are the items that most often cite the same works as this one and are cited by the same works as this one.
- Li, Sichen & Hu, Weihao & Cao, Di & Chen, Zhe & Huang, Qi & Blaabjerg, Frede & Liao, Kaiji, 2023. "Physics-model-free heat-electricity energy management of multiple microgrids based on surrogate model-enabled multi-agent deep reinforcement learning," Applied Energy, Elsevier, vol. 346(C).
- Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
- Su, Chutian & Wang, Yi & Strbac, Goran, 2025. "Coordinated electric vehicles dispatch for multi-service provisions: A comprehensive review of modelling and coordination approaches," Renewable and Sustainable Energy Reviews, Elsevier, vol. 223(C).
- Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
- Zhu, Ziqing & Hu, Ze & Chan, Ka Wing & Bu, Siqi & Zhou, Bin & Xia, Shiwei, 2023. "Reinforcement learning in deregulated energy market: A comprehensive review," Applied Energy, Elsevier, vol. 329(C).
- Harrold, Daniel J.B. & Cao, Jun & Fan, Zhong, 2022. "Renewable energy integration and microgrid energy trading using multi-agent deep reinforcement learning," Applied Energy, Elsevier, vol. 318(C).
- Xu, Hairun & Zhang, Ao & Wang, Qingle & Hu, Yang & Fang, Fang & Cheng, Long, 2025. "Quantum Reinforcement Learning for real-time optimization in Electric Vehicle charging systems," Applied Energy, Elsevier, vol. 383(C).
- Yang, Ruizhang & Xiao, Zhuang & Xiong, Wei & Hou, Yunhe, 2026. "Coordinative multi-stage approach to railway energy system resilience enhancement: From risk-aware FTPSS planning to emergency energy management and adaptive train control," Applied Energy, Elsevier, vol. 402(PB).
- Mahmud, Sakib & Sayed, Aya Nabil & Himeur, Yassine & Nhlabatsi, Armstrong & Bensaali, Faycal, 2026. "A comprehensive review of deep reinforcement learning applications from centralized power generation to modern energy internet frameworks," Renewable and Sustainable Energy Reviews, Elsevier, vol. 226(PE).
- Zhou, Xinlei & Du, Han & Xue, Shan & Ma, Zhenjun, 2024. "Recent advances in data mining and machine learning for enhanced building energy management," Energy, Elsevier, vol. 307(C).
- Lu, Qing-Chang & Wang, Shixin & Xu, Peng-Cheng & Li, Jing & Meng, Xu & Hussain, Adil, 2025. "Modeling the dependency relationship of coupled power and transportation networks," Energy, Elsevier, vol. 320(C).
- Qiu, Dawei & Wang, Yi & Zhang, Tingqi & Sun, Mingyang & Strbac, Goran, 2023. "Hierarchical multi-agent reinforcement learning for repair crews dispatch control towards multi-energy microgrid resilience," Applied Energy, Elsevier, vol. 336(C).
- Hu, Xiaorui & Guo, Haotian & Lao, Keng-Weng & Hao, Junkun & Liu, Fengrui & Ren, Zhongyu, 2025. "MiniRocket-MARL synergy for storm tide resilience: MESS-DV enhanced recovery in coastal distribution networks," Applied Energy, Elsevier, vol. 401(PB).
- Wang, Yi & Qiu, Dawei & He, Yinglong & Zhou, Quan & Strbac, Goran, 2023. "Multi-agent reinforcement learning for electric vehicle decarbonized routing and scheduling," Energy, Elsevier, vol. 284(C).
- Song, Yingjie & Ngoduy, Dong & Ding, Chuan, 2026. "Coordinated emergency resource allocation for resilience enhancement in post-disaster electrified transportation networks," Journal of Transport Geography, Elsevier, vol. 130(C).
- Yao, Ganzhou & Luo, Zirong & Lu, Zhongyue & Wang, Mangkuan & Shang, Jianzhong & Guerrero, Josep M., 2023. "Unlocking the potential of wave energy conversion: A comprehensive evaluation of advanced maximum power point tracking techniques and hybrid strategies for sustainable energy harvesting," Renewable and Sustainable Energy Reviews, Elsevier, vol. 185(C).
- Liangcai Zhou & Long Huo & Linlin Liu & Hao Xu & Rui Chen & Xin Chen, 2025. "Optimal Power Flow for High Spatial and Temporal Resolution Power Systems with High Renewable Energy Penetration Using Multi-Agent Deep Reinforcement Learning," Energies, MDPI, vol. 18(7), pages 1-14, April.
- An, Sihai & Qiu, Jing & Lin, Jiafeng & Yao, Zongyu & Liang, Qijun & Lu, Xin, 2025. "Planning of a multi-agent mobile robot-based adaptive charging network for enhancing power system resilience under extreme conditions," Applied Energy, Elsevier, vol. 395(C).
- Lan, Penghang & Chen, She & Li, Qihang & Li, Kelin & Wang, Feng & Zhao, Yaoxun, 2024. "Intelligent hydrogen-ammonia combined energy storage system with deep reinforcement learning," Renewable Energy, Elsevier, vol. 237(PB).
- Ahmad, Tanveer & Madonski, Rafal & Zhang, Dongdong & Huang, Chao & Mujeeb, Asad, 2022. "Data-driven probabilistic machine learning in sustainable smart energy/smart energy systems: Key developments, challenges, and future research opportunities in the context of smart grid paradigm," Renewable and Sustainable Energy Reviews, Elsevier, vol. 160(C).
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:399:y:2025:i:c:s0306261925011535. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.
Please note that corrections may take a couple of weeks to filter through the various RePEc services.
Printed from https://ideas.repec.org/a/eee/appene/v399y2025ics0306261925011535.html