
Multi-agent deep reinforcement learning based demand response and energy management for heavy industries with discrete manufacturing systems

Author

Listed:
  • Bashyal, Atit
  • Boroukhian, Tina
  • Veerachanchai, Pakin
  • Naransukh, Myanganbayar
  • Wicaksono, Hendro

Abstract

Energy-centric decarbonization of heavy industries, such as steel and cement, requires their participation in Renewable Energy Sources (RES) integration and effective Demand Response (DR) programs. This situation has created opportunities to research control algorithms in diverse DR scenarios. At the same time, the industrial sector’s distinctive characteristics, including the diversity of operations and the need for uninterrupted production, pose unique challenges in designing and implementing such control algorithms. Reinforcement learning (RL) methods offer practical solutions to these challenges. Nevertheless, research on RL for industrial demand response has not yet achieved the level of standardization seen in other areas of RL research, which hinders broader progress. To advance this research, we propose a multi-agent reinforcement learning (MARL)-based energy management system designed to optimize energy consumption in energy-intensive industrial settings by leveraging dynamic-pricing DR schemes. The study details the creation of a MARL environment and addresses these challenges through a general framework that allows researchers to replicate and implement MARL environments for industrial sectors. The framework incorporates a Partially Observable Markov Decision Process (POMDP) to model energy consumption and production processes, introduces buffer storage constraints, and uses a flexible reward function that balances production efficiency and cost reduction. We evaluate the framework through experimental validation in a steel powder manufacturing facility. The results validate the framework and demonstrate the effectiveness of the MARL-based energy management system.
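The abstract describes agents that observe only part of the plant state, respect buffer storage constraints, and receive a reward trading off production against time-varying energy cost. A minimal sketch of that kind of environment is below; all names, parameters, and dynamics are illustrative assumptions, not the paper's actual model.

```python
class MachineAgentEnv:
    """One production machine with an upstream buffer and on/off control.

    Hypothetical single-agent building block for a MARL environment:
    the agent sees only its local buffer (partial observability), and
    the reward balances parts produced against dynamic-price energy cost.
    """

    def __init__(self, buffer_capacity=10.0, power_kw=50.0,
                 throughput=2.0, cost_weight=0.5):
        self.buffer_capacity = buffer_capacity  # max parts the buffer holds
        self.power_kw = power_kw                # power draw when running (kW)
        self.throughput = throughput            # parts consumed per step
        self.cost_weight = cost_weight          # trades cost vs. production
        self.buffer = buffer_capacity / 2       # start half full

    def observe(self):
        # Partial observation: only the local buffer level, not the
        # full plant state or other agents' machines.
        return (self.buffer,)

    def step(self, run, price, inflow):
        """run: 0/1 action; price: cost per kWh this step; inflow: parts arriving."""
        produced = min(self.throughput, self.buffer) if run else 0.0
        energy = self.power_kw * run            # kWh consumed this step
        # Buffer dynamics with a hard capacity constraint.
        self.buffer = min(self.buffer - produced + inflow,
                          self.buffer_capacity)
        # Reward balances production value against energy cost.
        reward = produced - self.cost_weight * price * energy
        return self.observe(), reward
```

Under these assumed dynamics, an agent learns to run when the price is low and idle (letting the buffer refill) when the price is high, which is the load-shifting behavior a dynamic-pricing DR scheme is meant to elicit.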

Suggested Citation

  • Bashyal, Atit & Boroukhian, Tina & Veerachanchai, Pakin & Naransukh, Myanganbayar & Wicaksono, Hendro, 2025. "Multi-agent deep reinforcement learning based demand response and energy management for heavy industries with discrete manufacturing systems," Applied Energy, Elsevier, vol. 392(C).
  • Handle: RePEc:eee:appene:v:392:y:2025:i:c:s0306261925007202
    DOI: 10.1016/j.apenergy.2025.125990

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261925007202
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2025.125990?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ajagekar, Akshay & Decardi-Nelson, Benjamin & You, Fengqi, 2024. "Energy management for demand response in networked greenhouses with multi-agent deep reinforcement learning," Applied Energy, Elsevier, vol. 355(C).
    2. Lu, Renzhi & Li, Yi-Chang & Li, Yuting & Jiang, Junhui & Ding, Yuemin, 2020. "Multi-agent deep reinforcement learning based demand response for discrete manufacturing systems energy management," Applied Energy, Elsevier, vol. 276(C).
    3. Lu, Renzhi & Bai, Ruichang & Huang, Yuan & Li, Yuting & Jiang, Junhui & Ding, Yuemin, 2021. "Data-driven real-time price-based demand response for industrial facilities energy management," Applied Energy, Elsevier, vol. 283(C).
    4. Ruiz Duarte, José Luis & Fan, Neng & Jin, Tongdan, 2020. "Multi-process production scheduling with variable renewable integration and demand response," European Journal of Operational Research, Elsevier, vol. 281(1), pages 186-200.
    5. Vázquez-Canteli, José R. & Nagy, Zoltán, 2019. "Reinforcement learning for demand response: A review of algorithms and modeling techniques," Applied Energy, Elsevier, vol. 235(C), pages 1072-1089.
    6. Ye, Jin & Wang, Xianlian & Hua, Qingsong & Sun, Li, 2024. "Deep reinforcement learning based energy management of a hybrid electricity-heat-hydrogen energy system with demand response," Energy, Elsevier, vol. 305(C).
    7. Yang, Shiyu & Oliver Gao, H. & You, Fengqi, 2022. "Model predictive control in phase-change-material-wallboard-enhanced building energy management considering electricity price dynamics," Applied Energy, Elsevier, vol. 326(C).
    8. Babonneau, Frédéric & Caramanis, Michael & Haurie, Alain, 2016. "A linear programming model for power distribution with demand response and variable renewable energy," Applied Energy, Elsevier, vol. 181(C), pages 83-95.
    9. Xie, Jiahan & Ajagekar, Akshay & You, Fengqi, 2023. "Multi-Agent attention-based deep reinforcement learning for demand response in grid-responsive buildings," Applied Energy, Elsevier, vol. 342(C).
    10. Zhu, Dafeng & Yang, Bo & Liu, Yuxiang & Wang, Zhaojian & Ma, Kai & Guan, Xinping, 2022. "Energy management based on multi-agent deep reinforcement learning for a multi-energy industrial park," Applied Energy, Elsevier, vol. 311(C).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ajagekar, Akshay & Decardi-Nelson, Benjamin & You, Fengqi, 2024. "Energy management for demand response in networked greenhouses with multi-agent deep reinforcement learning," Applied Energy, Elsevier, vol. 355(C).
    2. Muhammad Ikram & Daryoush Habibi & Asma Aziz, 2025. "Networked Multi-Agent Deep Reinforcement Learning Framework for the Provision of Ancillary Services in Hybrid Power Plants," Energies, MDPI, vol. 18(10), pages 1-34, May.
    3. Xie, Jiahan & Ajagekar, Akshay & You, Fengqi, 2023. "Multi-Agent attention-based deep reinforcement learning for demand response in grid-responsive buildings," Applied Energy, Elsevier, vol. 342(C).
    4. Leherbauer, Dominik & Schulz, Julia & Egyed, Alexander & Hehenberger, Peter, 2025. "Demand-side management in less energy-intensive industries: A systematic mapping study," Renewable and Sustainable Energy Reviews, Elsevier, vol. 212(C).
    5. Savino, Sabrina & Minella, Tommaso & Nagy, Zoltán & Capozzoli, Alfonso, 2025. "A scalable demand-side energy management control strategy for large residential districts based on an attention-driven multi-agent DRL approach," Applied Energy, Elsevier, vol. 393(C).
    6. Pavirani, Fabio & Van Gompel, Jonas & Karimi Madahi, Seyed Soroush & Claessens, Bert & Develder, Chris, 2025. "Predicting and publishing accurate imbalance prices using Monte Carlo Tree Search," Applied Energy, Elsevier, vol. 392(C).
    7. Liu, Jiejie & Ma, Yanan & Chen, Ying & Zhao, Chunlu & Meng, Xianyang & Wu, Jiangtao, 2025. "Multi-agent deep reinforcement learning-based cooperative energy management for regional integrated energy system incorporating active demand-side management," Energy, Elsevier, vol. 319(C).
    8. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    9. Golmohamadi, Hessam, 2022. "Demand-side management in industrial sector: A review of heavy industries," Renewable and Sustainable Energy Reviews, Elsevier, vol. 156(C).
    10. Lu, Renzhi & Bai, Ruichang & Ding, Yuemin & Wei, Min & Jiang, Junhui & Sun, Mingyang & Xiao, Feng & Zhang, Hai-Tao, 2021. "A hybrid deep learning-based online energy management scheme for industrial microgrid," Applied Energy, Elsevier, vol. 304(C).
    11. Magdalena Krystyna Wyrwicka & Ewa Więcek-Janka & Łukasz Brzeziński, 2023. "Transition to Sustainable Energy System for Smart Cities—Literature Review," Energies, MDPI, vol. 16(21), pages 1-26, October.
    12. Máximo A. Domínguez-Garabitos & Víctor S. Ocaña-Guevara & Félix Santos-García & Adriana Arango-Manrique & Miguel Aybar-Mejía, 2022. "A Methodological Proposal for Implementing Demand-Shifting Strategies in the Wholesale Electricity Market," Energies, MDPI, vol. 15(4), pages 1-28, February.
    13. Yun, Lingxiang & Li, Lin & Ma, Shuaiyin, 2022. "Demand response for manufacturing systems considering the implications of fast-charging battery powered material handling equipment," Applied Energy, Elsevier, vol. 310(C).
    14. Eduardo J. Salazar & Mauro Jurado & Mauricio E. Samper, 2023. "Reinforcement Learning-Based Pricing and Incentive Strategy for Demand Response in Smart Grids," Energies, MDPI, vol. 16(3), pages 1-33, February.
    15. Sun, Fangyuan & Kong, Xiangyu & Wu, Jianzhong & Gao, Bixuan & Chen, Ke & Lu, Ning, 2022. "DSM pricing method based on A3C and LSTM under cloud-edge environment," Applied Energy, Elsevier, vol. 315(C).
    16. Mohammed Jasim M. Al Essa, 2025. "A review on price-driven energy management systems and demand response programs in smart grids," Environment Systems and Decisions, Springer, vol. 45(1), pages 1-22, March.
    17. Pinto, Giuseppe & Kathirgamanathan, Anjukan & Mangina, Eleni & Finn, Donal P. & Capozzoli, Alfonso, 2022. "Enhancing energy management in grid-interactive buildings: A comparison among cooperative and coordinated architectures," Applied Energy, Elsevier, vol. 310(C).
    18. Zheng, Zhuang & Shafique, Muhammad & Luo, Xiaowei & Wang, Shengwei, 2024. "A systematic review towards integrative energy management of smart grids and urban energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 189(PB).
    19. Zeng, Lanting & Qiu, Dawei & Sun, Mingyang, 2022. "Resilience enhancement of multi-agent reinforcement learning-based demand response against adversarial attacks," Applied Energy, Elsevier, vol. 324(C).
    20. Chen, Wei-Han & You, Fengqi, 2024. "Sustainable energy management and control for Decarbonization of complex multi-zone buildings with renewable solar and geothermal energies using machine learning, robust optimization, and predictive c," Applied Energy, Elsevier, vol. 372(C).


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:392:y:2025:i:c:s0306261925007202. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.