Printed from https://ideas.repec.org/a/eee/appene/v235y2019icp1072-1089.html

Reinforcement learning for demand response: A review of algorithms and modeling techniques

Author

Listed:
  • Vázquez-Canteli, José R.
  • Nagy, Zoltán

Abstract

Buildings account for about 40% of global energy consumption. Renewable energy resources are one way to reduce the dependence of residential buildings on the electrical grid. However, their integration into the existing grid infrastructure must be done carefully to avoid instability and to guarantee availability and security of supply. Demand response, or demand-side management, improves grid stability by increasing demand flexibility: it shifts peak demand towards periods of peak renewable energy generation by providing consumers with economic incentives. This paper reviews the use of reinforcement learning, a machine learning technique, for demand response applications in the smart grid. Reinforcement learning has been used to control diverse energy systems such as electric vehicles, heating, ventilation and air conditioning (HVAC) systems, smart appliances, and batteries. The future of demand response depends greatly on its ability to prevent consumer discomfort and to integrate human feedback into the control loop. Reinforcement learning is a potentially model-free algorithm that can adapt to its environment, as well as to human preferences, by directly integrating user feedback into its control logic. Our review shows that, although many papers consider human comfort and satisfaction, most focus on single-agent systems with demand-independent electricity prices and a stationary environment. However, when electricity prices are modelled as demand-dependent variables, there is a risk of shifting the peak demand rather than shaving it. We identify a need to further explore reinforcement learning for coordinating multi-agent systems that can participate in demand response programs under demand-dependent electricity prices. Finally, we discuss directions for future research, e.g., quantifying how RL could adapt to changing urban conditions such as building refurbishment and urban or population growth.
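To make concrete the kind of model-free control the review surveys, below is a minimal sketch of tabular Q-learning applied to a toy load-shifting problem: an agent charging a small battery under a time-varying price signal. The environment, prices, and reward design here are illustrative assumptions for exposition, not taken from the paper.

```python
import random

# Toy demand-response setting (hypothetical): each hour the agent decides
# whether to charge a battery, paying the hourly price per unit charged,
# with a bonus for ending the day with a full battery.
PRICES = [0.1, 0.1, 0.3, 0.3, 0.1, 0.1]  # toy day: cheap, expensive, cheap
ACTIONS = [0, 1]                          # 0 = idle, 1 = charge one unit
CAPACITY = 2                              # battery holds two units
ALPHA, GAMMA, EPS = 0.5, 0.95, 0.1        # learning rate, discount, exploration


def train(episodes=3000, seed=0):
    """Tabular Q-learning: no model of prices or dynamics is assumed."""
    rng = random.Random(seed)
    # State: (hour, state of charge). Q-table maps state -> value per action.
    q = {(h, soc): [0.0, 0.0]
         for h in range(len(PRICES) + 1)
         for soc in range(CAPACITY + 1)}
    for _ in range(episodes):
        soc = 0
        for h in range(len(PRICES)):
            s = (h, soc)
            # Epsilon-greedy action selection.
            if rng.random() < EPS:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: q[s][x])
            charged = a if soc < CAPACITY else 0
            r = -PRICES[h] * charged          # cost of charging now
            soc += charged
            if h == len(PRICES) - 1:
                r += 1.0 if soc == CAPACITY else 0.0  # end-of-day bonus
            s2 = (h + 1, soc)
            # Standard Q-learning update.
            q[s][a] += ALPHA * (r + GAMMA * max(q[s2]) - q[s][a])
    return q


def greedy_schedule(q):
    """Roll out the greedy policy to inspect the learned charging plan."""
    soc, plan = 0, []
    for h in range(len(PRICES)):
        a = max(ACTIONS, key=lambda x: q[(h, soc)][x])
        plan.append(a if soc < CAPACITY else 0)
        soc += plan[-1]
    return plan
```

After training, the greedy schedule charges only during cheap hours, i.e., the agent learns to shift its load away from the price peak purely from reward feedback. This illustrates the single-agent, demand-independent-price case the review finds most common; the peak-shifting risk it identifies arises precisely when many such agents face prices that react to their combined demand.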

Suggested Citation

  • Vázquez-Canteli, José R. & Nagy, Zoltán, 2019. "Reinforcement learning for demand response: A review of algorithms and modeling techniques," Applied Energy, Elsevier, vol. 235(C), pages 1072-1089.
  • Handle: RePEc:eee:appene:v:235:y:2019:i:c:p:1072-1089
    DOI: 10.1016/j.apenergy.2018.11.002

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261918317082
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2018.11.002?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Park, June Young & Nagy, Zoltan, 2018. "Comprehensive analysis of the relationship between thermal comfort and building control research - A data-driven literature review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 82(P3), pages 2664-2679.
    2. Dupont, B. & Dietrich, K. & De Jonghe, C. & Ramos, A. & Belmans, R., 2014. "Impact of residential demand response on power system operation: A Belgian case study," Applied Energy, Elsevier, vol. 122(C), pages 1-10.
    3. Kazmi, H. & D’Oca, S. & Delmastro, C. & Lodeweyckx, S. & Corgnati, S.P., 2016. "Generalizable occupant-driven optimization model for domestic hot water production in NZEB," Applied Energy, Elsevier, vol. 175(C), pages 1-15.
    4. Siano, Pierluigi, 2014. "Demand response and smart grids—A survey," Renewable and Sustainable Energy Reviews, Elsevier, vol. 30(C), pages 461-478.
    5. Venkatesan, Naveen & Solanki, Jignesh & Solanki, Sarika Khushalani, 2012. "Residential Demand Response model and impact on voltage profile and losses of an electric distribution network," Applied Energy, Elsevier, vol. 96(C), pages 84-91.
    6. Kazmi, Hussain & Mehmood, Fahad & Lodeweyckx, Stefan & Driesen, Johan, 2018. "Gigawatt-hour scale savings on a budget of zero: Deep reinforcement learning based optimal control of hot water systems," Energy, Elsevier, vol. 144(C), pages 159-168.
    7. Salehizadeh, Mohammad Reza & Soltaniyan, Salman, 2016. "Application of fuzzy Q-learning for electricity market modeling by considering renewable power penetration," Renewable and Sustainable Energy Reviews, Elsevier, vol. 56(C), pages 1172-1181.
    8. Shariatzadeh, Farshid & Mandal, Paras & Srivastava, Anurag K., 2015. "Demand response for sustainable energy systems: A review, application and implementation strategy," Renewable and Sustainable Energy Reviews, Elsevier, vol. 45(C), pages 343-350.
    9. Frederik Ruelens & Sandro Iacovella & Bert J. Claessens & Ronnie Belmans, 2015. "Learning Agent for a Heat-Pump Thermostat with a Set-Back Strategy Using Model-Free Reinforcement Learning," Energies, MDPI, vol. 8(8), pages 1-19, August.
    10. Kofinas, P. & Dounis, A.I. & Vouros, G.A., 2018. "Fuzzy Q-Learning for multi-agent decentralized energy management in microgrids," Applied Energy, Elsevier, vol. 219(C), pages 53-67.
    11. Zou, Yuan & Liu, Teng & Liu, Dexing & Sun, Fengchun, 2016. "Reinforcement learning-based real-time energy management for a hybrid tracked vehicle," Applied Energy, Elsevier, vol. 171(C), pages 372-382.
    12. Herter, Karen & McAuliffe, Patrick & Rosenfeld, Arthur, 2007. "An exploratory analysis of California residential customer response to critical peak pricing of electricity," Energy, Elsevier, vol. 32(1), pages 25-34.
    13. Zeng, Bo & Wu, Geng & Wang, Jianhui & Zhang, Jianhua & Zeng, Ming, 2017. "Impact of behavior-driven demand response on supply adequacy in smart distribution systems," Applied Energy, Elsevier, vol. 202(C), pages 125-137.
    14. Xiong, Rui & Duan, Yanzhou & Cao, Jiayi & Yu, Quanqing, 2018. "Battery and ultracapacitor in-the-loop approach to validate a real-time power management method for an all-climate electric vehicle," Applied Energy, Elsevier, vol. 217(C), pages 153-165.
    15. David P. Chassin & Jason C. Fuller & Ned Djilali, 2014. "GridLAB-D: An Agent-Based Simulation Framework for Smart Grids," Journal of Applied Mathematics, Hindawi, vol. 2014, pages 1-12, June.
    16. Yang, Lei & Nagy, Zoltan & Goffin, Philippe & Schlueter, Arno, 2015. "Reinforcement learning for optimal control of low exergy buildings," Applied Energy, Elsevier, vol. 156(C), pages 577-586.
    17. Shuxian Li & Minghui Hu & Changchao Gong & Sen Zhan & Datong Qin, 2018. "Energy Management Strategy for Hybrid Electric Vehicle Based on Driving Condition Identification Using KGA-Means," Energies, MDPI, vol. 11(6), pages 1-16, June.
    18. Jiang, C.X. & Jing, Z.X. & Cui, X.R. & Ji, T.Y. & Wu, Q.H., 2018. "Multiple agents and reinforcement learning for modelling charging loads of electric taxis," Applied Energy, Elsevier, vol. 222(C), pages 158-168.
    19. Wu, Jingda & He, Hongwen & Peng, Jiankun & Li, Yuecheng & Li, Zhanjiang, 2018. "Continuous reinforcement learning of energy management with deep Q network for a power split hybrid electric bus," Applied Energy, Elsevier, vol. 222(C), pages 799-811.
    20. Xiong, Rui & Cao, Jiayi & Yu, Quanqing, 2018. "Reinforcement learning-based real-time power management for hybrid energy storage system in the plug-in hybrid electric vehicle," Applied Energy, Elsevier, vol. 211(C), pages 538-548.
    21. Aghaei, Jamshid & Alizadeh, Mohammad-Iman, 2013. "Demand response in smart electricity grids equipped with renewable energy sources: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 18(C), pages 64-72.
    22. Wang, Jianxiao & Zhong, Haiwang & Ma, Ziming & Xia, Qing & Kang, Chongqing, 2017. "Review and prospect of integrated demand response in the multi-energy system," Applied Energy, Elsevier, vol. 202(C), pages 772-782.
    23. Brida V. Mbuwir & Frederik Ruelens & Fred Spiessens & Geert Deconinck, 2017. "Battery Energy Management in a Microgrid Using Batch Reinforcement Learning," Energies, MDPI, vol. 10(11), pages 1-19, November.
    24. Zhang, Xiaoshun & Bao, Tao & Yu, Tao & Yang, Bo & Han, Chuanjia, 2017. "Deep transfer Q-learning with virtual leader-follower for supply-demand Stackelberg game of smart grid," Energy, Elsevier, vol. 133(C), pages 348-365.
    25. Dupont, B. & De Jonghe, C. & Olmos, L. & Belmans, R., 2014. "Demand response with locational dynamic pricing to support the integration of renewables," Energy Policy, Elsevier, vol. 67(C), pages 344-354.
    26. Liu, Teng & Wang, Bo & Yang, Chenglang, 2018. "Online Markov Chain-based energy management for a hybrid tracked vehicle with speedy Q-learning," Energy, Elsevier, vol. 160(C), pages 544-555.
    27. Shen, Peihong & Zhao, Zhiguo & Zhan, Xiaowen & Li, Jingwei & Guo, Qiuyi, 2018. "Optimal energy management strategy for a plug-in hybrid electric commercial vehicle based on velocity prediction," Energy, Elsevier, vol. 155(C), pages 838-852.
    28. Nejat, Payam & Jomehzadeh, Fatemeh & Taheri, Mohammad Mahdi & Gohari, Mohammad & Abd. Majid, Muhd Zaimi, 2015. "A global review of energy consumption, CO2 emissions and policy in the residential sector (with an overview of the top ten CO2 emitting countries)," Renewable and Sustainable Energy Reviews, Elsevier, vol. 43(C), pages 843-862.
    29. Leibowicz, Benjamin D. & Lanham, Christopher M. & Brozynski, Max T. & Vázquez-Canteli, José R. & Castejón, Nicolás Castillo & Nagy, Zoltan, 2018. "Optimal decarbonization pathways for urban residential building energy services," Applied Energy, Elsevier, vol. 230(C), pages 1311-1325.
    30. Zehui Kong & Yuan Zou & Teng Liu, 2017. "Implementation of real-time energy management strategy based on reinforcement learning for hybrid electric vehicles and simulation validation," PLOS ONE, Public Library of Science, vol. 12(7), pages 1-16, July.
    31. Dusparic, Ivana & Taylor, Adam & Marinescu, Andrei & Golpayegani, Fatemeh & Clarke, Siobhan, 2017. "Residential demand response: Experimental evaluation and comparison of self-organizing techniques," Renewable and Sustainable Energy Reviews, Elsevier, vol. 80(C), pages 1528-1536.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
    2. Wang, Zhe & Hong, Tianzhen, 2020. "Reinforcement learning for building controls: The opportunities and challenges," Applied Energy, Elsevier, vol. 269(C).
    3. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    4. Wu, Yuankai & Tan, Huachun & Peng, Jiankun & Zhang, Hailong & He, Hongwen, 2019. "Deep reinforcement learning of energy management with continuous control strategy and traffic information for a series-parallel plug-in hybrid electric bus," Applied Energy, Elsevier, vol. 247(C), pages 454-466.
    5. Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    6. Ramya Kuppusamy & Srete Nikolovski & Yuvaraja Teekaraman, 2023. "Review of Machine Learning Techniques for Power Quality Performance Evaluation in Grid-Connected Systems," Sustainability, MDPI, vol. 15(20), pages 1-29, October.
    7. McPherson, Madeleine & Stoll, Brady, 2020. "Demand response for variable renewable energy integration: A proposed approach and its impacts," Energy, Elsevier, vol. 197(C).
    8. Yang, Ningkang & Han, Lijin & Xiang, Changle & Liu, Hui & Li, Xunmin, 2021. "An indirect reinforcement learning based real-time energy management strategy via high-order Markov Chain model for a hybrid electric vehicle," Energy, Elsevier, vol. 236(C).
    9. Pinto, Giuseppe & Piscitelli, Marco Savino & Vázquez-Canteli, José Ramón & Nagy, Zoltán & Capozzoli, Alfonso, 2021. "Coordinated energy management for a cluster of buildings through deep reinforcement learning," Energy, Elsevier, vol. 229(C).
    10. Shi, Wenzhuo & Huangfu, Yigeng & Xu, Liangcai & Pang, Shengzhao, 2022. "Online energy management strategy considering fuel cell fault for multi-stack fuel cell hybrid vehicle based on multi-agent reinforcement learning," Applied Energy, Elsevier, vol. 328(C).
    11. Talari, Saber & Shafie-khah, Miadreza & Osório, Gerardo J. & Aghaei, Jamshid & Catalão, João P.S., 2018. "Stochastic modelling of renewable energy sources from operators' point-of-view: A survey," Renewable and Sustainable Energy Reviews, Elsevier, vol. 81(P2), pages 1953-1965.
    12. Yang, Changhui & Meng, Chen & Zhou, Kaile, 2018. "Residential electricity pricing in China: The context of price-based demand response," Renewable and Sustainable Energy Reviews, Elsevier, vol. 81(P2), pages 2870-2878.
    13. Guo, Peiyang & Li, Victor O.K. & Lam, Jacqueline C.K., 2017. "Smart demand response in China: Challenges and drivers," Energy Policy, Elsevier, vol. 107(C), pages 1-10.
    14. Shen, Rendong & Zhong, Shengyuan & Wen, Xin & An, Qingsong & Zheng, Ruifan & Li, Yang & Zhao, Jun, 2022. "Multi-agent deep reinforcement learning optimization framework for building energy system with renewable energy," Applied Energy, Elsevier, vol. 312(C).
    15. Wu, Peng & Partridge, Julius & Bucknall, Richard, 2020. "Cost-effective reinforcement learning energy management for plug-in hybrid fuel cell and battery ships," Applied Energy, Elsevier, vol. 275(C).
    16. Stadler, Michael & Cardoso, Gonçalo & Mashayekh, Salman & Forget, Thibault & DeForest, Nicholas & Agarwal, Ankit & Schönbein, Anna, 2016. "Value streams in microgrids: A literature review," Applied Energy, Elsevier, vol. 162(C), pages 980-989.
    17. Haji Hosseinloo, Ashkan & Ryzhov, Alexander & Bischi, Aldo & Ouerdane, Henni & Turitsyn, Konstantin & Dahleh, Munther A., 2020. "Data-driven control of micro-climate in buildings: An event-triggered reinforcement learning approach," Applied Energy, Elsevier, vol. 277(C).
    18. Liu, Teng & Tan, Wenhao & Tang, Xiaolin & Zhang, Jinwei & Xing, Yang & Cao, Dongpu, 2021. "Driving conditions-driven energy management strategies for hybrid electric vehicles: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 151(C).
    19. Li, Weihan & Cui, Han & Nemeth, Thomas & Jansen, Jonathan & Ünlübayir, Cem & Wei, Zhongbao & Feng, Xuning & Han, Xuebing & Ouyang, Minggao & Dai, Haifeng & Wei, Xuezhe & Sauer, Dirk Uwe, 2021. "Cloud-based health-conscious energy management of hybrid battery systems in electric vehicles with deep reinforcement learning," Applied Energy, Elsevier, vol. 293(C).
    20. Nyong-Bassey, Bassey Etim & Giaouris, Damian & Patsios, Charalampos & Papadopoulou, Simira & Papadopoulos, Athanasios I. & Walker, Sara & Voutetakis, Spyros & Seferlis, Panos & Gadoue, Shady, 2020. "Reinforcement learning based adaptive power pinch analysis for energy management of stand-alone hybrid energy storage systems considering uncertainty," Energy, Elsevier, vol. 193(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:235:y:2019:i:c:p:1072-1089. See general information about how to correct material in RePEc.


    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description .


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.