
Review and Evaluation of Reinforcement Learning Frameworks on Smart Grid Applications

Author

Listed:
  • Dimitrios Vamvakas

    (Center for Research and Technology Hellas, 57001 Thessaloniki, Greece
    Department of Electrical and Computer Engineering, Democritus University of Thrace, 67100 Xanthi, Greece
    These authors contributed equally to this work.)

  • Panagiotis Michailidis

    (Center for Research and Technology Hellas, 57001 Thessaloniki, Greece
    Department of Electrical and Computer Engineering, Democritus University of Thrace, 67100 Xanthi, Greece
    These authors contributed equally to this work.)

  • Christos Korkas

    (Center for Research and Technology Hellas, 57001 Thessaloniki, Greece
    Department of Electrical and Computer Engineering, Democritus University of Thrace, 67100 Xanthi, Greece)

  • Elias Kosmatopoulos

    (Center for Research and Technology Hellas, 57001 Thessaloniki, Greece
    Department of Electrical and Computer Engineering, Democritus University of Thrace, 67100 Xanthi, Greece)

Abstract

With the rise in electricity, gas and oil prices and the persistently high levels of carbon emissions, there is an increasing demand for effective energy management in energy systems, including electrical grids. Recent literature demonstrates considerable potential for optimizing the behavior of such systems with respect to energy performance, reducing peak loads and exploiting environmentally friendly means of energy production. However, the primary challenge lies in the optimization of such systems, which introduces significant complexity since they exhibit highly dynamic behavior. Such cyberphysical frameworks usually integrate multiple interconnected components such as power plants, transmission lines, distribution networks and various types of energy-storage systems, while the behavior of these components is affected by external factors such as individual user requirements, weather conditions, energy demand and market prices. Consequently, traditional optimal control approaches—such as Rule-Based Control (RBC)—prove inadequate for dealing with the diverse dynamics that govern the behavior of such complex frameworks. Moreover, even sophisticated techniques—such as Model Predictive Control (MPC)—exhibit model-related limitations that hinder the applicability of an optimal control scheme. To this end, model-free AI techniques such as Reinforcement Learning (RL) offer considerable potential for embedding efficient optimal control in energy systems. Recent studies present promising results in various fields of engineering, indicating that RL frameworks may prove to be the key element for delivering efficient optimal control in smart buildings, electric vehicle charging and smart grid applications. The current paper provides a comprehensive review of RL implementations in energy system frameworks—such as Renewable Energy Sources (RESs), Building Energy-Management Systems (BEMSs) and Electric Vehicle Charging Stations (EVCSs)—illustrating the benefits and opportunities of such approaches. The work examines more than 80 highly cited papers focusing on recent RL research applications—published between 2015 and 2023—and analyzes the potential of model-free RL for the future control optimization of energy systems.
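As a purely illustrative sketch (not taken from the reviewed article), the snippet below shows the kind of model-free, tabular Q-learning loop that underlies many of the RL frameworks surveyed, applied to a toy battery-dispatch task with a synthetic tariff. The environment, price curve, state/action discretization and all parameter names are assumptions made for illustration only.

```python
# Illustrative sketch only: tabular Q-learning for a toy battery-dispatch task.
# The environment, price signal and hyperparameters are assumptions, not the paper's.
import numpy as np

rng = np.random.default_rng(0)

HOURS = 24                    # one episode = one day
CAPACITY = 4                  # discretized battery levels 0..4
ACTIONS = (-1, 0, 1)          # discharge one level, idle, charge one level
PRICE = 0.10 + 0.08 * np.sin(np.linspace(0.0, 2.0 * np.pi, HOURS))  # synthetic tariff

Q = np.zeros((HOURS, CAPACITY + 1, len(ACTIONS)))   # Q[hour, state_of_charge, action]
alpha, gamma, eps = 0.1, 0.95, 0.1                  # learning rate, discount, exploration

def step(hour, soc, a_idx):
    """Apply an action; reward is the negative cost of energy bought (selling earns)."""
    new_soc = int(np.clip(soc + ACTIONS[a_idx], 0, CAPACITY))
    energy_bought = new_soc - soc                   # +1 when charging, -1 when discharging
    reward = -PRICE[hour] * energy_bought
    return new_soc, reward

for episode in range(2000):
    soc = CAPACITY // 2
    for hour in range(HOURS):
        # epsilon-greedy action selection
        if rng.random() < eps:
            a = int(rng.integers(len(ACTIONS)))
        else:
            a = int(np.argmax(Q[hour, soc]))
        new_soc, reward = step(hour, soc, a)
        # bootstrapped Q-learning target; the last hour of the day is terminal
        target = reward if hour == HOURS - 1 else reward + gamma * Q[hour + 1, new_soc].max()
        Q[hour, soc, a] += alpha * (target - Q[hour, soc, a])
        soc = new_soc

# Greedy policy after training: charge at low prices, discharge at high prices.
policy = Q.argmax(axis=2)
print(policy[:, CAPACITY // 2])   # chosen action index per hour at mid state of charge
```

The point of the sketch is the model-free property discussed in the abstract: the agent never sees an explicit system model, only observed transitions and rewards, which is what distinguishes RL from RBC and MPC approaches.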

Suggested Citation

  • Dimitrios Vamvakas & Panagiotis Michailidis & Christos Korkas & Elias Kosmatopoulos, 2023. "Review and Evaluation of Reinforcement Learning Frameworks on Smart Grid Applications," Energies, MDPI, vol. 16(14), pages 1-38, July.
  • Handle: RePEc:gam:jeners:v:16:y:2023:i:14:p:5326-:d:1192246

    Download full text from publisher

    File URL: https://www.mdpi.com/1996-1073/16/14/5326/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/1996-1073/16/14/5326/
    Download Restriction: no

    References listed on IDEAS

    1. Aktas, Ahmet & Erhan, Koray & Özdemir, Sule & Özdemir, Engin, 2018. "Dynamic energy management for photovoltaic power system including hybrid energy storage in smart grid applications," Energy, Elsevier, vol. 162(C), pages 72-82.
    2. Wang, Zhe & Hong, Tianzhen, 2020. "Reinforcement learning for building controls: The opportunities and challenges," Applied Energy, Elsevier, vol. 269(C).
    3. Seppo Sierla & Heikki Ihasalo & Valeriy Vyatkin, 2022. "A Review of Reinforcement Learning Applications to Control of Heating, Ventilation and Air Conditioning Systems," Energies, MDPI, vol. 15(10), pages 1-25, May.
    4. Panagiotis Michailidis & Paschalis Pelitaris & Christos Korkas & Iakovos Michailidis & Simone Baldi & Elias Kosmatopoulos, 2021. "Enabling Optimal Energy Management with Minimal IoT Requirements: A Legacy A/C Case Study," Energies, MDPI, vol. 14(23), pages 1-25, November.
    5. Pinto, Giuseppe & Deltetto, Davide & Capozzoli, Alfonso, 2021. "Data-driven district energy management with surrogate models and deep reinforcement learning," Applied Energy, Elsevier, vol. 304(C).
    6. Jerry L. Holechek & Hatim M. E. Geli & Mohammed N. Sawalhah & Raul Valdez, 2022. "A Global Assessment: Can Renewable Energy Replace Fossil Fuels by 2050?," Sustainability, MDPI, vol. 14(8), pages 1-22, April.
    7. George E. Halkos & Eleni-Christina Gkampoura, 2020. "Reviewing Usage, Potentials, and Limitations of Renewable Energy Sources," Energies, MDPI, vol. 13(11), pages 1-19, June.
    8. Dorokhova, Marina & Martinson, Yann & Ballif, Christophe & Wyrsch, Nicolas, 2021. "Deep reinforcement learning control of electric vehicle charging in the presence of photovoltaic generation," Applied Energy, Elsevier, vol. 301(C).
    9. Jeyhun I. Mikayilov & Shahriyar Mukhtarov & Hasan Dinçer & Serhat Yüksel & Rıdvan Aydın, 2020. "Elasticity Analysis of Fossil Energy Sources for Sustainable Economies: A Case of Gasoline Consumption in Turkey," Energies, MDPI, vol. 13(3), pages 1-15, February.
    10. Iakovos T. Michailidis & Roozbeh Sangi & Panagiotis Michailidis & Thomas Schild & Johannes Fuetterer & Dirk Mueller & Elias B. Kosmatopoulos, 2020. "Balancing Energy Efficiency with Indoor Comfort Using Smart Control Agents: A Simulative Case Study," Energies, MDPI, vol. 13(23), pages 1-28, November.
    11. Ying Ji & Jianhui Wang & Jiacan Xu & Xiaoke Fang & Huaguang Zhang, 2019. "Real-Time Energy Management of a Microgrid Using Deep Reinforcement Learning," Energies, MDPI, vol. 12(12), pages 1-21, June.
    12. Amir Mosavi & Pedram Ghamisi & Yaser Faghan & Puhong Duan, 2020. "Comprehensive Review of Deep Reinforcement Learning Methods and Applications in Economics," Papers 2004.01509, arXiv.org.
    13. Ayas Shaqour & Aya Hagishima, 2022. "Systematic Review on Deep Reinforcement Learning-Based Energy Management for Different Building Types," Energies, MDPI, vol. 15(22), pages 1-27, November.
    14. Jeong, Jaeik & Kim, Hongseok, 2021. "DeepComp: Deep reinforcement learning based renewable energy error compensable forecasting," Applied Energy, Elsevier, vol. 294(C).
    15. Pinto, Giuseppe & Piscitelli, Marco Savino & Vázquez-Canteli, José Ramón & Nagy, Zoltán & Capozzoli, Alfonso, 2021. "Coordinated energy management for a cluster of buildings through deep reinforcement learning," Energy, Elsevier, vol. 229(C).
    16. Kazmi, Hussain & Suykens, Johan & Balint, Attila & Driesen, Johan, 2019. "Multi-agent reinforcement learning for modeling and control of thermostatically controlled loads," Applied Energy, Elsevier, vol. 238(C), pages 1022-1035.
    17. Lu, Renzhi & Hong, Seung Ho & Zhang, Xiongfeng, 2018. "A Dynamic pricing demand response algorithm for smart grid: Reinforcement learning approach," Applied Energy, Elsevier, vol. 220(C), pages 220-230.
    18. Florinda Martins & Carlos Felgueiras & Miroslava Smitkova & Nídia Caetano, 2019. "Analysis of Fossil Fuel Energy Consumption and Environmental Impacts in European Countries," Energies, MDPI, vol. 12(6), pages 1-11, March.
    19. Shafiee, Shahriar & Topal, Erkan, 2009. "When will fossil fuel reserves be diminished?," Energy Policy, Elsevier, vol. 37(1), pages 181-189, January.
    20. Dominković, D.F. & Bačeković, I. & Pedersen, A.S. & Krajačić, G., 2018. "The future of transportation in sustainable energy systems: Opportunities and barriers in a clean energy transition," Renewable and Sustainable Energy Reviews, Elsevier, vol. 82(P2), pages 1823-1838.
    21. Rocchetta, R. & Bellani, L. & Compare, M. & Zio, E. & Patelli, E., 2019. "A reinforcement learning framework for optimal operation and maintenance of power grids," Applied Energy, Elsevier, vol. 241(C), pages 291-301.
    22. Volodymyr Mnih & Koray Kavukcuoglu & David Silver & Andrei A. Rusu & Joel Veness & Marc G. Bellemare & Alex Graves & Martin Riedmiller & Andreas K. Fidjeland & Georg Ostrovski & Stig Petersen & Charle, 2015. "Human-level control through deep reinforcement learning," Nature, Nature, vol. 518(7540), pages 529-533, February.
    23. Kofinas, P. & Doltsinis, S. & Dounis, A.I. & Vouros, G.A., 2017. "A reinforcement learning approach for MPPT control method of photovoltaic sources," Renewable Energy, Elsevier, vol. 108(C), pages 461-473.
    24. Fu, Yangyang & Xu, Shichao & Zhu, Qi & O’Neill, Zheng & Adetola, Veronica, 2023. "How good are learning-based control v.s. model-based control for load shifting? Investigations on a single zone building energy system," Energy, Elsevier, vol. 273(C).
    25. Amir Mosavi & Mohsen Salimi & Sina Faizollahzadeh Ardabili & Timon Rabczuk & Shahaboddin Shamshirband & Annamaria R. Varkonyi-Koczy, 2019. "State of the Art of Machine Learning Models in Energy Systems, a Systematic Review," Energies, MDPI, vol. 12(7), pages 1-42, April.
    26. Brida V. Mbuwir & Frederik Ruelens & Fred Spiessens & Geert Deconinck, 2017. "Battery Energy Management in a Microgrid Using Batch Reinforcement Learning," Energies, MDPI, vol. 10(11), pages 1-19, November.
    27. Jaehyun Lee & Eunjung Lee & Jinho Kim, 2020. "Electric Vehicle Charging and Discharging Algorithm Based on Reinforcement Learning with Data-Driven Approach in Dynamic Pricing Scheme," Energies, MDPI, vol. 13(8), pages 1-18, April.
    28. Liu, Hui & Yu, Chengqing & Wu, Haiping & Duan, Zhu & Yan, Guangxi, 2020. "A new hybrid ensemble deep reinforcement learning model for wind speed short term forecasting," Energy, Elsevier, vol. 202(C).
    29. Kuznetsova, Elizaveta & Li, Yan-Fu & Ruiz, Carlos & Zio, Enrico & Ault, Graham & Bell, Keith, 2013. "Reinforcement learning for microgrid energy management," Energy, Elsevier, vol. 59(C), pages 133-146.
    30. Tuchnitz, Felix & Ebell, Niklas & Schlund, Jonas & Pruckner, Marco, 2021. "Development and Evaluation of a Smart Charging Strategy for an Electric Vehicle Fleet Based on Reinforcement Learning," Applied Energy, Elsevier, vol. 285(C).
    31. Du, Yan & Zandi, Helia & Kotevska, Olivera & Kurte, Kuldeep & Munk, Jeffery & Amasyali, Kadir & Mckee, Evan & Li, Fangxing, 2021. "Intelligent multi-zone residential HVAC control strategy based on deep reinforcement learning," Applied Energy, Elsevier, vol. 281(C).
    32. Aitor Saenz-Aguirre & Ekaitz Zulueta & Unai Fernandez-Gamiz & Javier Lozano & Jose Manuel Lopez-Guede, 2019. "Artificial Neural Network Based Reinforcement Learning for Wind Turbine Yaw Control," Energies, MDPI, vol. 12(3), pages 1-17, January.
    33. Guo, Chenyu & Wang, Xin & Zheng, Yihui & Zhang, Feng, 2022. "Real-time optimal energy management of microgrid with uncertainties based on deep reinforcement learning," Energy, Elsevier, vol. 238(PC).
    34. Mosavi, Amir & Faghan, Yaser & Ghamisi, Pedram & Duan, Puhong & Ardabili, Sina Faizollahzadeh & Hassan, Salwana & Band, Shahab S., 2020. "Comprehensive Review of Deep Reinforcement Learning Methods and Applications in Economics," OSF Preprints jrc58, Center for Open Science.
    35. Amirhosein Mosavi & Yaser Faghan & Pedram Ghamisi & Puhong Duan & Sina Faizollahzadeh Ardabili & Ely Salwana & Shahab S. Band, 2020. "Comprehensive Review of Deep Reinforcement Learning Methods and Applications in Economics," Mathematics, MDPI, vol. 8(10), pages 1-42, September.
    36. Michailidis, Iakovos T. & Schild, Thomas & Sangi, Roozbeh & Michailidis, Panagiotis & Korkas, Christos & Fütterer, Johannes & Müller, Dirk & Kosmatopoulos, Elias B., 2018. "Energy-efficient HVAC management using cooperative, self-trained, control agents: A real-life German building case study," Applied Energy, Elsevier, vol. 211(C), pages 113-125.
    37. Younes Zahraoui & Mohammed Reyasudin Basir Khan & Ibrahim AlHamrouni & Saad Mekhilef & Mahrous Ahmed, 2021. "Current Status, Scenario, and Prospective of Renewable Energy in Algeria: A Review," Energies, MDPI, vol. 14(9), pages 1-28, April.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Panagiotis Michailidis & Iakovos Michailidis & Dimitrios Vamvakas & Elias Kosmatopoulos, 2023. "Model-Free HVAC Control in Buildings: A Review," Energies, MDPI, vol. 16(20), pages 1-45, October.
    2. Charalampos Rafail Lazaridis & Iakovos Michailidis & Georgios Karatzinis & Panagiotis Michailidis & Elias Kosmatopoulos, 2024. "Evaluating Reinforcement Learning Algorithms in Residential Energy Saving and Comfort Management," Energies, MDPI, vol. 17(3), pages 1-33, January.
    3. Panagiotis Michailidis & Iakovos Michailidis & Socratis Gkelios & Elias Kosmatopoulos, 2024. "Artificial Neural Network Applications for Energy Management in Buildings: Current Trends and Future Directions," Energies, MDPI, vol. 17(3), pages 1-47, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    2. Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
    3. Lilia Tightiz & Joon Yoo, 2022. "A Review on a Data-Driven Microgrid Management System Integrating an Active Distribution Network: Challenges, Issues, and New Trends," Energies, MDPI, vol. 15(22), pages 1-24, November.
    4. Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
    5. Charalampos Rafail Lazaridis & Iakovos Michailidis & Georgios Karatzinis & Panagiotis Michailidis & Elias Kosmatopoulos, 2024. "Evaluating Reinforcement Learning Algorithms in Residential Energy Saving and Comfort Management," Energies, MDPI, vol. 17(3), pages 1-33, January.
    6. Zhu, Ziqing & Hu, Ze & Chan, Ka Wing & Bu, Siqi & Zhou, Bin & Xia, Shiwei, 2023. "Reinforcement learning in deregulated energy market: A comprehensive review," Applied Energy, Elsevier, vol. 329(C).
    7. Coraci, Davide & Brandi, Silvio & Hong, Tianzhen & Capozzoli, Alfonso, 2023. "Online transfer learning strategy for enhancing the scalability and deployment of deep reinforcement learning control in smart buildings," Applied Energy, Elsevier, vol. 333(C).
    8. Panagiotis Michailidis & Iakovos Michailidis & Dimitrios Vamvakas & Elias Kosmatopoulos, 2023. "Model-Free HVAC Control in Buildings: A Review," Energies, MDPI, vol. 16(20), pages 1-45, October.
    9. Brini, Alessio & Tedeschi, Gabriele & Tantari, Daniele, 2023. "Reinforcement learning policy recommendation for interbank network stability," Journal of Financial Stability, Elsevier, vol. 67(C).
    10. Pinto, Giuseppe & Kathirgamanathan, Anjukan & Mangina, Eleni & Finn, Donal P. & Capozzoli, Alfonso, 2022. "Enhancing energy management in grid-interactive buildings: A comparison among cooperative and coordinated architectures," Applied Energy, Elsevier, vol. 310(C).
    11. Zheng, Lingwei & Wu, Hao & Guo, Siqi & Sun, Xinyu, 2023. "Real-time dispatch of an integrated energy system based on multi-stage reinforcement learning with an improved action-choosing strategy," Energy, Elsevier, vol. 277(C).
    12. Grace Muriithi & Sunetra Chowdhury, 2021. "Optimal Energy Management of a Grid-Tied Solar PV-Battery Microgrid: A Reinforcement Learning Approach," Energies, MDPI, vol. 14(9), pages 1-24, May.
    13. Song, Yuguang & Xia, Mingchao & Chen, Qifang & Chen, Fangjian, 2023. "A data-model fusion dispatch strategy for the building energy flexibility based on the digital twin," Applied Energy, Elsevier, vol. 332(C).
    14. Tsianikas, Stamatis & Yousefi, Nooshin & Zhou, Jian & Rodgers, Mark D. & Coit, David, 2021. "A storage expansion planning framework using reinforcement learning and simulation-based optimization," Applied Energy, Elsevier, vol. 290(C).
    15. Tian Zhu & Wei Zhu, 2022. "Quantitative Trading through Random Perturbation Q-Network with Nonlinear Transaction Costs," Stats, MDPI, vol. 5(2), pages 1-15, June.
    16. Guo, Chenyu & Wang, Xin & Zheng, Yihui & Zhang, Feng, 2022. "Real-time optimal energy management of microgrid with uncertainties based on deep reinforcement learning," Energy, Elsevier, vol. 238(PC).
    17. Ahmed Ismail & Mustafa Baysal, 2023. "Dynamic Pricing Based on Demand Response Using Actor–Critic Agent Reinforcement Learning," Energies, MDPI, vol. 16(14), pages 1-19, July.
    18. Fatemehsadat Mirshafiee & Emad Shahbazi & Mohadeseh Safi & Rituraj Rituraj, 2023. "Predicting Power and Hydrogen Generation of a Renewable Energy Converter Utilizing Data-Driven Methods: A Sustainable Smart Grid Case Study," Energies, MDPI, vol. 16(1), pages 1-20, January.
    19. Soleimanzade, Mohammad Amin & Kumar, Amit & Sadrzadeh, Mohtada, 2022. "Novel data-driven energy management of a hybrid photovoltaic-reverse osmosis desalination system using deep reinforcement learning," Applied Energy, Elsevier, vol. 317(C).
    20. Nyong-Bassey, Bassey Etim & Giaouris, Damian & Patsios, Charalampos & Papadopoulou, Simira & Papadopoulos, Athanasios I. & Walker, Sara & Voutetakis, Spyros & Seferlis, Panos & Gadoue, Shady, 2020. "Reinforcement learning based adaptive power pinch analysis for energy management of stand-alone hybrid energy storage systems considering uncertainty," Energy, Elsevier, vol. 193(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jeners:v:16:y:2023:i:14:p:5326-:d:1192246. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.