
Dual-Layer Q-Learning Strategy for Energy Management of Battery Storage in Grid-Connected Microgrids

Author

Listed:
  • Khawaja Haider Ali

    (Faculty of Environment, Science and Economy, University of Exeter, Penryn Campus, Cornwall TR10 9FE, UK;
    Department of Electrical Engineering, Sukkur IBA University, Sukkur 65200, Pakistan)

  • Mohammad Abusara

    (Faculty of Environment, Science and Economy, University of Exeter, Penryn Campus, Cornwall TR10 9FE, UK)

  • Asif Ali Tahir

    (Faculty of Environment, Science and Economy, University of Exeter, Penryn Campus, Cornwall TR10 9FE, UK)

  • Saptarshi Das

    (Faculty of Environment, Science and Economy, University of Exeter, Penryn Campus, Cornwall TR10 9FE, UK)

Abstract

Real-time energy management of battery storage in grid-connected microgrids can be very challenging due to the intermittent nature of renewable energy sources (RES), load variations, and variable grid tariffs. Two reinforcement learning (RL)-based energy management systems have been previously used, namely, offline and online methods. In offline RL, the agent learns the optimum policy using forecasted generation and load data. Once convergence is achieved, battery commands are dispatched in real time. The performance of this strategy depends heavily on the accuracy of the forecasted data. An agent in online RL learns the best policy by interacting with the system in real time using real data. Online RL deals better with forecast errors but can take longer to converge. This paper proposes a novel dual-layer Q-learning strategy to address this challenge. The first (upper) layer runs offline to produce directive commands for the battery system over a 24 h horizon, using forecasted generation and load data. The second (lower) Q-learning-based layer refines these battery commands every 15 min by accounting for real-time changes in the RES and load demand. This reduces the overall operating cost of the microgrid compared with online RL by shortening the convergence time. The superiority of the proposed strategy (dual-layer RL) has been verified by simulation results comparing it with individual offline and online RL algorithms.
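As a rough illustration of the offline (upper) layer, the sketch below trains a tabular Q-learning agent for battery dispatch over a 24 h horizon. Every quantity here (tariff schedule, net load, state-of-charge discretization, reward function, hyperparameters) is an illustrative assumption, not the paper's actual model; in the proposed strategy, a second layer would refine the resulting commands every 15 min against real measurements.

```python
import random

# Assumed discretization: state = (hour of day, battery state-of-charge level);
# actions move the SoC down, hold it, or move it up by one level per hour.
HOURS = 24
SOC_LEVELS = 5            # 0 (empty) .. 4 (full)
ACTIONS = [-1, 0, +1]     # discharge, idle, charge
TARIFF = [0.10 if h < 8 or h >= 20 else 0.30 for h in range(HOURS)]  # assumed prices
NET_LOAD = [2.0] * HOURS  # assumed forecasted net load (kW) after RES generation

ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1   # learning rate, discount, exploration

def step(hour, soc, action):
    """Apply an action; return next state and a cost-based reward (assumed model)."""
    soc2 = min(max(soc + action, 0), SOC_LEVELS - 1)
    grid_import = NET_LOAD[hour] + (soc2 - soc)   # charging buys extra energy
    reward = -TARIFF[hour] * grid_import          # reward = negative energy cost
    return (hour + 1) % HOURS, soc2, reward

def train(episodes=2000, seed=0):
    rng = random.Random(seed)
    Q = {(h, s): [0.0] * len(ACTIONS)
         for h in range(HOURS) for s in range(SOC_LEVELS)}
    for _ in range(episodes):
        soc = SOC_LEVELS // 2                     # start each day at mid SoC
        for hour in range(HOURS):
            if rng.random() < EPS:                # epsilon-greedy exploration
                a = rng.randrange(len(ACTIONS))
            else:
                a = max(range(len(ACTIONS)), key=lambda i: Q[(hour, soc)][i])
            nh, nsoc, r = step(hour, soc, ACTIONS[a])
            best_next = max(Q[(nh, nsoc)])        # standard Q-learning update
            Q[(hour, soc)][a] += ALPHA * (r + GAMMA * best_next - Q[(hour, soc)][a])
            soc = nsoc
    return Q

Q = train()
# Greedy hourly action at mid SoC: the directive command schedule for the day.
policy = {h: ACTIONS[max(range(len(ACTIONS)), key=lambda i: Q[(h, 2)][i])]
          for h in range(HOURS)}
```

In a dual-layer arrangement, the converged Q-table (or the `policy` schedule derived from it) would serve as the starting point for the lower layer, which could repeat the same update rule every 15 min using measured rather than forecasted generation and load.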

Suggested Citation

  • Khawaja Haider Ali & Mohammad Abusara & Asif Ali Tahir & Saptarshi Das, 2023. "Dual-Layer Q-Learning Strategy for Energy Management of Battery Storage in Grid-Connected Microgrids," Energies, MDPI, vol. 16(3), pages 1-17, January.
  • Handle: RePEc:gam:jeners:v:16:y:2023:i:3:p:1334-:d:1048005

    Download full text from publisher

    File URL: https://www.mdpi.com/1996-1073/16/3/1334/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/1996-1073/16/3/1334/
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bio Gassi, Karim & Baysal, Mustafa, 2023. "Improving real-time energy decision-making model with an actor-critic agent in modern microgrids with energy storage devices," Energy, Elsevier, vol. 263(PE).
    2. Grace Muriithi & Sunetra Chowdhury, 2021. "Optimal Energy Management of a Grid-Tied Solar PV-Battery Microgrid: A Reinforcement Learning Approach," Energies, MDPI, vol. 14(9), pages 1-24, May.
    3. Zhu, Ziqing & Hu, Ze & Chan, Ka Wing & Bu, Siqi & Zhou, Bin & Xia, Shiwei, 2023. "Reinforcement learning in deregulated energy market: A comprehensive review," Applied Energy, Elsevier, vol. 329(C).
    4. Lilia Tightiz & Joon Yoo, 2022. "A Review on a Data-Driven Microgrid Management System Integrating an Active Distribution Network: Challenges, Issues, and New Trends," Energies, MDPI, vol. 15(22), pages 1-24, November.
    5. Ritu Kandari & Neeraj Neeraj & Alexander Micallef, 2022. "Review on Recent Strategies for Integrating Energy Storage Systems in Microgrids," Energies, MDPI, vol. 16(1), pages 1-24, December.
    6. Khawaja Haider Ali & Marvin Sigalo & Saptarshi Das & Enrico Anderlini & Asif Ali Tahir & Mohammad Abusara, 2021. "Reinforcement Learning for Energy-Storage Systems in Grid-Connected Microgrids: An Investigation of Online vs. Offline Implementation," Energies, MDPI, vol. 14(18), pages 1-18, September.
    7. Dimitrios Vamvakas & Panagiotis Michailidis & Christos Korkas & Elias Kosmatopoulos, 2023. "Review and Evaluation of Reinforcement Learning Frameworks on Smart Grid Applications," Energies, MDPI, vol. 16(14), pages 1-38, July.
    8. Mudhafar Al-Saadi & Maher Al-Greer & Michael Short, 2021. "Strategies for Controlling Microgrid Networks with Energy Storage Systems: A Review," Energies, MDPI, vol. 14(21), pages 1-45, November.
    9. Alqahtani, Mohammed & Hu, Mengqi, 2022. "Dynamic energy scheduling and routing of multiple electric vehicles using deep reinforcement learning," Energy, Elsevier, vol. 244(PA).
    10. Wang, Yi & Qiu, Dawei & Sun, Mingyang & Strbac, Goran & Gao, Zhiwei, 2023. "Secure energy management of multi-energy microgrid: A physical-informed safe reinforcement learning approach," Applied Energy, Elsevier, vol. 335(C).
    11. Alexander N. Kozlov & Nikita V. Tomin & Denis N. Sidorov & Electo E. S. Lora & Victor G. Kurbatsky, 2020. "Optimal Operation Control of PV-Biomass Gasifier-Diesel-Hybrid Systems Using Reinforcement Learning Techniques," Energies, MDPI, vol. 13(10), pages 1-20, May.
    12. Pinciroli, Luca & Baraldi, Piero & Compare, Michele & Zio, Enrico, 2023. "Optimal operation and maintenance of energy storage systems in grid-connected microgrids by deep reinforcement learning," Applied Energy, Elsevier, vol. 352(C).
    13. Ahmed M. Abed & Ali AlArjani, 2022. "The Neural Network Classifier Works Efficiently on Searching in DQN Using the Autonomous Internet of Things Hybridized by the Metaheuristic Techniques to Reduce the EVs’ Service Scheduling Time," Energies, MDPI, vol. 15(19), pages 1-25, September.
    14. Anis ur Rehman & Muhammad Ali & Sheeraz Iqbal & Aqib Shafiq & Nasim Ullah & Sattam Al Otaibi, 2022. "Artificial Intelligence-Based Control and Coordination of Multiple PV Inverters for Reactive Power/Voltage Control of Power Distribution Networks," Energies, MDPI, vol. 15(17), pages 1-13, August.
    15. Van-Hai Bui & Akhtar Hussain & Hak-Man Kim, 2019. "Q-Learning-Based Operation Strategy for Community Battery Energy Storage System (CBESS) in Microgrid System," Energies, MDPI, vol. 12(9), pages 1-17, May.
    16. Harri Aaltonen & Seppo Sierla & Rakshith Subramanya & Valeriy Vyatkin, 2021. "A Simulation Environment for Training a Reinforcement Learning Agent Trading a Battery Storage," Energies, MDPI, vol. 14(17), pages 1-20, September.
    17. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    18. Yujian Ye & Dawei Qiu & Huiyu Wang & Yi Tang & Goran Strbac, 2021. "Real-Time Autonomous Residential Demand Response Management Based on Twin Delayed Deep Deterministic Policy Gradient Learning," Energies, MDPI, vol. 14(3), pages 1-22, January.
    19. Ying Ji & Jianhui Wang & Jiacan Xu & Xiaoke Fang & Huaguang Zhang, 2019. "Real-Time Energy Management of a Microgrid Using Deep Reinforcement Learning," Energies, MDPI, vol. 12(12), pages 1-21, June.
    20. Zhou, Yanting & Ma, Zhongjing & Zhang, Jinhui & Zou, Suli, 2022. "Data-driven stochastic energy management of multi energy system using deep reinforcement learning," Energy, Elsevier, vol. 261(PA).


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.