
Battery Energy Management in a Microgrid Using Batch Reinforcement Learning

Authors

Listed:
  • Brida V. Mbuwir

    (ESAT/Electa, KU Leuven, Kasteelpark Arenberg 10 bus 2445, BE-3001 Leuven, Belgium
    Energy Department, EnergyVille, Thor Park, Poort Genk 8130, 3600 Genk, Belgium)

  • Frederik Ruelens

    (ESAT/Electa, KU Leuven, Kasteelpark Arenberg 10 bus 2445, BE-3001 Leuven, Belgium
    Energy Department, EnergyVille, Thor Park, Poort Genk 8130, 3600 Genk, Belgium)

  • Fred Spiessens

    (Energy Department, EnergyVille, Thor Park, Poort Genk 8130, 3600 Genk, Belgium
    Energy Department, Vlaamse Instelling voor Technologisch Onderzoek (VITO), Boeretang 200, B-2400 Mol, Belgium)

  • Geert Deconinck

    (ESAT/Electa, KU Leuven, Kasteelpark Arenberg 10 bus 2445, BE-3001 Leuven, Belgium
    Energy Department, EnergyVille, Thor Park, Poort Genk 8130, 3600 Genk, Belgium)

Abstract

Motivated by recent developments in batch Reinforcement Learning (RL), this paper contributes to the application of batch RL to energy management in microgrids. We tackle the challenge of finding a closed-loop control policy that optimally schedules the operation of a storage device in order to maximize self-consumption of local photovoltaic production in a microgrid. In this work, fitted Q-iteration, a standard batch RL technique, is used by an RL agent to construct a control policy. The proposed method is data-driven and uses a state-action value function to find an optimal scheduling plan for a battery. The battery’s charge and discharge efficiencies and the nonlinearity introduced by the inverter’s efficiency are taken into account. The approach has been tested by simulation in a residential setting using data from Belgian residential consumers. The developed framework is benchmarked against a model-based technique, and the simulation results show a performance gap of 19%. The results provide insight for developing optimal policies in more realistically scaled and interconnected microgrids, and for including uncertainties in generation and consumption for which white-box models become inaccurate or infeasible.
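
To make the method concrete, the sketch below outlines the core of a fitted Q-iteration loop applied to battery scheduling. It is a minimal illustration under stated assumptions, not the authors' implementation: the state features, action set, discount factor, reward definition and the use of extremely randomized trees as the regressor are assumptions, chosen because they are common choices in the batch RL literature.

    # Minimal fitted Q-iteration sketch for battery scheduling (illustrative only).
    # Assumptions (not taken from the paper): state = e.g. (time of day, PV forecast,
    # load forecast, battery state of charge); actions = {discharge, idle, charge};
    # reward = value of self-consumed PV minus cost of residual grid import.
    import numpy as np
    from sklearn.ensemble import ExtraTreesRegressor

    ACTIONS = np.array([-1.0, 0.0, 1.0])  # discharge / idle / charge (normalized power)
    GAMMA = 0.98                          # discount factor (assumed)

    def fitted_q_iteration(transitions, n_iterations=50):
        """transitions: list of (state, action, reward, next_state) tuples
        collected from historical (batch) data."""
        states = np.array([t[0] for t in transitions])
        actions = np.array([[t[1]] for t in transitions])
        rewards = np.array([t[2] for t in transitions])
        next_states = np.array([t[3] for t in transitions])

        X = np.hstack([states, actions])      # regressor input: (state, action)
        q_model = None
        for _ in range(n_iterations):
            if q_model is None:
                targets = rewards             # first iteration: Q_1(s, a) = r
            else:
                # Bellman backup: Q_N(s, a) = r + gamma * max_a' Q_{N-1}(s', a')
                q_next = np.column_stack([
                    q_model.predict(np.hstack([next_states,
                                               np.full((len(next_states), 1), a)]))
                    for a in ACTIONS
                ])
                targets = rewards + GAMMA * q_next.max(axis=1)
            q_model = ExtraTreesRegressor(n_estimators=50).fit(X, targets)
        return q_model

    def greedy_action(q_model, state):
        """Closed-loop policy: pick the action with the highest estimated Q-value."""
        q_values = [q_model.predict(np.hstack([state, [a]]).reshape(1, -1))[0]
                    for a in ACTIONS]
        return ACTIONS[int(np.argmax(q_values))]

Given a batch of historical transitions, the returned model defines a closed-loop policy through greedy_action, which could then be benchmarked against a model-based controller in the spirit of the paper.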

Suggested Citation

  • Brida V. Mbuwir & Frederik Ruelens & Fred Spiessens & Geert Deconinck, 2017. "Battery Energy Management in a Microgrid Using Batch Reinforcement Learning," Energies, MDPI, vol. 10(11), pages 1-19, November.
  • Handle: RePEc:gam:jeners:v:10:y:2017:i:11:p:1846-:d:118541

    Download full text from publisher

    File URL: https://www.mdpi.com/1996-1073/10/11/1846/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/1996-1073/10/11/1846/
    Download Restriction: no

    References listed on IDEAS

    1. Frederik Ruelens & Sandro Iacovella & Bert J. Claessens & Ronnie Belmans, 2015. "Learning Agent for a Heat-Pump Thermostat with a Set-Back Strategy Using Model-Free Reinforcement Learning," Energies, MDPI, vol. 8(8), pages 1-19, August.
    2. Kuznetsova, Elizaveta & Li, Yan-Fu & Ruiz, Carlos & Zio, Enrico & Ault, Graham & Bell, Keith, 2013. "Reinforcement learning for microgrid energy management," Energy, Elsevier, vol. 59(C), pages 133-146.
    3. Zhao, Bo & Xue, Meidong & Zhang, Xuesong & Wang, Caisheng & Zhao, Junhui, 2015. "An MAS based energy management system for a stand-alone microgrid at high altitude," Applied Energy, Elsevier, vol. 143(C), pages 251-261.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Nyong-Bassey, Bassey Etim & Giaouris, Damian & Patsios, Charalampos & Papadopoulou, Simira & Papadopoulos, Athanasios I. & Walker, Sara & Voutetakis, Spyros & Seferlis, Panos & Gadoue, Shady, 2020. "Reinforcement learning based adaptive power pinch analysis for energy management of stand-alone hybrid energy storage systems considering uncertainty," Energy, Elsevier, vol. 193(C).
    2. Yassine Chemingui & Adel Gastli & Omar Ellabban, 2020. "Reinforcement Learning-Based School Energy Management System," Energies, MDPI, vol. 13(23), pages 1-21, December.
    3. Yang, Ting & Zhao, Liyuan & Li, Wei & Zomaya, Albert Y., 2021. "Dynamic energy dispatch strategy for integrated energy system based on improved deep reinforcement learning," Energy, Elsevier, vol. 235(C).
    4. Hao Liang & Weihua Zhuang, 2014. "Stochastic Modeling and Optimization in a Microgrid: A Survey," Energies, MDPI, vol. 7(4), pages 1-24, March.
    5. Bhowmik, Chiranjib & Bhowmik, Sumit & Ray, Amitava & Pandey, Krishna Murari, 2017. "Optimal green energy planning for sustainable development: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 71(C), pages 796-813.
    6. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    7. Correa-Jullian, Camila & López Droguett, Enrique & Cardemil, José Miguel, 2020. "Operation scheduling in a solar thermal system: A reinforcement learning-based framework," Applied Energy, Elsevier, vol. 268(C).
8. Waseem Akram & Muaz Niazi & Laszlo Barna Iantovics & Athanasios V. Vasilakos, 2019. "Towards Agent-Based Model Specification of Smart Grid: A Cognitive Agent-Based Computing Approach," Interdisciplinary Description of Complex Systems, Croatian Interdisciplinary Society (http://indecs.eu), vol. 17(3-B), pages 546-585.
    9. Boukettaya, Ghada & Krichen, Lotfi, 2014. "A dynamic power management strategy of a grid connected hybrid generation system using wind, photovoltaic and Flywheel Energy Storage System in residential applications," Energy, Elsevier, vol. 71(C), pages 148-159.
    10. Ahmad Khan, Aftab & Naeem, Muhammad & Iqbal, Muhammad & Qaisar, Saad & Anpalagan, Alagan, 2016. "A compendium of optimization objectives, constraints, tools and algorithms for energy management in microgrids," Renewable and Sustainable Energy Reviews, Elsevier, vol. 58(C), pages 1664-1683.
    11. Clarke, Will Challis & Brear, Michael John & Manzie, Chris, 2020. "Control of an isolated microgrid using hierarchical economic model predictive control," Applied Energy, Elsevier, vol. 280(C).
    12. Harrold, Daniel J.B. & Cao, Jun & Fan, Zhong, 2022. "Data-driven battery operation for energy arbitrage using rainbow deep reinforcement learning," Energy, Elsevier, vol. 238(PC).
    13. Li, Jianwei & Xiong, Rui & Mu, Hao & Cornélusse, Bertrand & Vanderbemden, Philippe & Ernst, Damien & Yuan, Weijia, 2018. "Design and real-time test of a hybrid energy storage system in the microgrid with the benefit of improving the battery lifetime," Applied Energy, Elsevier, vol. 218(C), pages 470-478.
    14. Zhang, Li & Gao, Yan & Zhu, Hongbo & Tao, Li, 2022. "Bi-level stochastic real-time pricing model in multi-energy generation system: A reinforcement learning approach," Energy, Elsevier, vol. 239(PA).
    15. Weitzel, Timm & Glock, Christoph H., 2018. "Energy management for stationary electric energy storage systems: A systematic literature review," European Journal of Operational Research, Elsevier, vol. 264(2), pages 582-606.
    16. Chen, Pengzhan & Liu, Mengchao & Chen, Chuanxi & Shang, Xin, 2019. "A battery management strategy in microgrid for personalized customer requirements," Energy, Elsevier, vol. 189(C).
    17. Gruber, J.K. & Huerta, F. & Matatagui, P. & Prodanović, M., 2015. "Advanced building energy management based on a two-stage receding horizon optimization," Applied Energy, Elsevier, vol. 160(C), pages 194-205.
    18. Muhammad Asghar Majeed & Furqan Asghar & Muhammad Imtiaz Hussain & Waseem Amjad & Anjum Munir & Hammad Armghan & Jun-Tae Kim, 2022. "Adaptive Dynamic Control Based Optimization of Renewable Energy Resources for Grid-Tied Microgrids," Sustainability, MDPI, vol. 14(3), pages 1-14, February.
    19. Sabarathinam Srinivasan & Suresh Kumarasamy & Zacharias E. Andreadakis & Pedro G. Lind, 2023. "Artificial Intelligence and Mathematical Models of Power Grids Driven by Renewable Energy Sources: A Survey," Energies, MDPI, vol. 16(14), pages 1-56, July.
    20. Zhu, Ziqing & Hu, Ze & Chan, Ka Wing & Bu, Siqi & Zhou, Bin & Xia, Shiwei, 2023. "Reinforcement learning in deregulated energy market: A comprehensive review," Applied Energy, Elsevier, vol. 329(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jeners:v:10:y:2017:i:11:p:1846-:d:118541. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.