
Application-oriented assessment of grid-connected PV-battery system with deep reinforcement learning in buildings considering electricity price dynamics

Author

Listed:
  • Chen, Qi
  • Kuang, Zhonghong
  • Liu, Xiaohua
  • Zhang, Tao

Abstract

Deep reinforcement learning (DRL) plays a decisive role in addressing the uncertainties of intelligent grid-building interactions. Using DRL algorithms, this research optimizes the operational strategy of a building's grid-connected photovoltaic-battery (PV-battery) system and examines the economic impact of battery capacity, rooftop PV penetration, and electricity price volatility. Three algorithms are employed, each clearly outperforming rule-based control. Without rooftop PV, rule-based control achieves a battery cost saving of 0.07 RMB/(d·kWh) with a capacity equal to the average building load, whereas the three algorithms reach 0.17–0.19 RMB/(d·kWh). Adding PV makes the DRL training process considerably more complex. Incorporating PV radiation information into the state space markedly improves the battery's ability to absorb surplus PV, enhancing the economic benefit of the DRL strategy; as a result, the battery attains cost savings of approximately 0.46 RMB/(d·kWh) under 50% PV penetration. Finally, the study shows that the advantage of DRL grows as electricity price volatility intensifies: as grid renewable penetration rises from 24% to 50%, the cost-saving advantage of DRL over rule-based control increases from 0.11 to 0.17 RMB/(d·kWh).
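To make the control task concrete, below is a minimal sketch, assuming a heavily simplified single-building setting, of the kind of grid-connected PV-battery dispatch problem the abstract describes: the state exposes the battery state of charge, the current electricity price, the building load, and the PV output, and the reward is the negative electricity cost. The environment class, parameter values, charging model, and price-threshold baseline are illustrative assumptions, not the authors' implementation; in the paper, a DRL agent learns a policy that replaces the rule-based baseline shown here.

import numpy as np

class PVBatteryEnv:
    """Minimal battery-dispatch environment (hypothetical simplification)."""

    def __init__(self, prices, load, pv, capacity_kwh, power_kw, eta=0.95, dt=1.0):
        self.prices = np.asarray(prices, dtype=float)  # electricity price, RMB/kWh
        self.load = np.asarray(load, dtype=float)      # building load, kW
        self.pv = np.asarray(pv, dtype=float)          # rooftop PV output, kW
        self.cap = capacity_kwh                        # battery capacity, kWh
        self.p_max = power_kw                          # max charge/discharge power, kW
        self.eta = eta                                 # one-way efficiency
        self.dt = dt                                   # time step, h
        self.reset()

    def reset(self):
        self.t = 0
        self.soc = 0.5                                 # start half charged
        return self._state()

    def _state(self):
        # PV output is part of the state; the paper reports that exposing PV
        # radiation information to the agent markedly improves surplus-PV use.
        return np.array([self.soc, self.prices[self.t],
                         self.load[self.t], self.pv[self.t]], dtype=np.float32)

    def step(self, action):
        # action in [-1, 1]: positive = charge, negative = discharge (building side)
        p = float(np.clip(action, -1.0, 1.0)) * self.p_max
        if p >= 0:   # charging: limited by remaining headroom
            p = min(p, (1.0 - self.soc) * self.cap / (self.eta * self.dt))
            self.soc += p * self.eta * self.dt / self.cap
        else:        # discharging: limited by stored energy
            p = max(p, -self.soc * self.cap * self.eta / self.dt)
            self.soc += p * self.dt / (self.eta * self.cap)
        grid = self.load[self.t] - self.pv[self.t] + p          # net grid import, kW
        cost = max(grid, 0.0) * self.prices[self.t] * self.dt   # surplus export unpaid (assumption)
        self.t += 1
        done = self.t >= len(self.prices)
        return (None if done else self._state()), -cost, done


def rule_based(state, low=0.4, high=0.8):
    """Price-threshold baseline: charge when cheap, discharge when expensive."""
    _, price, _, _ = state
    return 1.0 if price <= low else (-1.0 if price >= high else 0.0)


# Illustrative run on one synthetic day of hourly data
rng = np.random.default_rng(0)
prices = 0.3 + 0.7 * (np.sin(np.linspace(0, 2 * np.pi, 24)) + 1) / 2   # RMB/kWh
load = 50 + 20 * rng.random(24)                                        # kW
pv = np.clip(80 * np.sin(np.linspace(0, np.pi, 24)), 0, None)          # kW

env = PVBatteryEnv(prices, load, pv, capacity_kwh=60, power_kw=30)
state, total_cost, done = env.reset(), 0.0, False
while not done:
    state, reward, done = env.step(rule_based(state))
    total_cost -= reward
print(f"Rule-based daily electricity cost: {total_cost:.1f} RMB")

Under these assumptions, the paper's cost-saving metric in RMB/(d·kWh) would correspond to the reduction in daily electricity cost relative to operation without the battery, divided by the battery capacity in kWh; comparing the DRL policy against the rule-based baseline on the same data gives the reported advantage figures.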

Suggested Citation

  • Chen, Qi & Kuang, Zhonghong & Liu, Xiaohua & Zhang, Tao, 2024. "Application-oriented assessment of grid-connected PV-battery system with deep reinforcement learning in buildings considering electricity price dynamics," Applied Energy, Elsevier, vol. 364(C).
  • Handle: RePEc:eee:appene:v:364:y:2024:i:c:s0306261924005464
    DOI: 10.1016/j.apenergy.2024.123163

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261924005464
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2024.123163?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Joanna Clarke & Justin Searle, 2021. "Active Building demonstrators for a low-carbon future," Nature Energy, Nature, vol. 6(12), pages 1087-1089, December.
    2. Kim, Donghun & Wang, Zhe & Brugger, James & Blum, David & Wetter, Michael & Hong, Tianzhen & Piette, Mary Ann, 2022. "Site demonstration and performance evaluation of MPC for a large chiller plant with TES for renewable energy integration and grid decarbonization," Applied Energy, Elsevier, vol. 321(C).
    3. Raviv, Eran & Bouwman, Kees E. & van Dijk, Dick, 2015. "Forecasting day-ahead electricity prices: Utilizing hourly prices," Energy Economics, Elsevier, vol. 50(C), pages 227-239.
    4. He, Gang & Kammen, Daniel M., 2016. "Where, when and how much solar is available? A provincial-scale solar resource assessment for China," Renewable Energy, Elsevier, vol. 85(C), pages 74-82.
    5. Edward A. Byers & Gemma Coxon & Jim Freer & Jim W. Hall, 2020. "Drought and climate change impacts on cooling water shortages and electricity prices in Great Britain," Nature Communications, Nature, vol. 11(1), pages 1-12, December.
    6. Kang, Hyuna & Jung, Seunghoon & Kim, Hakpyeong & Jeoung, Jaewon & Hong, Taehoon, 2024. "Reinforcement learning-based optimal scheduling model of battery energy storage system at the building level," Renewable and Sustainable Energy Reviews, Elsevier, vol. 190(PA).
    7. Volodymyr Mnih & Koray Kavukcuoglu & David Silver & Andrei A. Rusu & Joel Veness & Marc G. Bellemare & Alex Graves & Martin Riedmiller & Andreas K. Fidjeland & Georg Ostrovski & Stig Petersen & Charle, 2015. "Human-level control through deep reinforcement learning," Nature, Nature, vol. 518(7540), pages 529-533, February.
    8. Kuang, Zhonghong & Chen, Qi & Yu, Yang, 2022. "Assessing the CO2-emission risk due to wind-energy uncertainty," Applied Energy, Elsevier, vol. 310(C).
    9. Iain Staffell & Stefan Pfenninger & Nathan Johnson, 2023. "A global model of hourly space heating and cooling demand at multiple spatial scales," Nature Energy, Nature, vol. 8(12), pages 1328-1344, December.
    10. Chen, Qi & Kuang, Zhonghong & Liu, Xiaohua & Zhang, Tao, 2024. "Optimal sizing and techno-economic analysis of the hybrid PV-battery-cooling storage system for commercial buildings in China," Applied Energy, Elsevier, vol. 355(C).
    11. Yang, Hongxing & Wei, Zhou & Chengzhi, Lou, 2009. "Optimal design and techno-economic analysis of a hybrid solar-wind power generation system," Applied Energy, Elsevier, vol. 86(2), pages 163-169, February.
    12. Park, Jong-Whi & Ju, Young-Min & Kim, You-Gwon & Kim, Hak-Sung, 2023. "50% reduction in energy consumption in an actual cold storage facility using a deep reinforcement learning-based control algorithm," Applied Energy, Elsevier, vol. 352(C).
    13. Gao, Yuan & Matsunami, Yuki & Miyata, Shohei & Akashi, Yasunori, 2022. "Operational optimization for off-grid renewable building energy system using deep reinforcement learning," Applied Energy, Elsevier, vol. 325(C).
    14. Yu Qian Ang & Zachary Michael Berzolla & Samuel Letellier-Duchesne & Christoph F. Reinhart, 2023. "Carbon reduction technology pathways for existing buildings in eight cities," Nature Communications, Nature, vol. 14(1), pages 1-16, December.
    15. Yin, WanJun & Wen, Tao & Zhang, Chao, 2023. "Cooperative optimal scheduling strategy of electric vehicles based on dynamic electricity price mechanism," Energy, Elsevier, vol. 263(PA).
    16. Shi, Tao & Xu, Chang & Dong, Wenhao & Zhou, Hangyu & Bokhari, Awais & Klemeš, Jiří Jaromír & Han, Ning, 2023. "Research on energy management of hydrogen electric coupling system based on deep reinforcement learning," Energy, Elsevier, vol. 282(C).
    17. Salpakari, Jyri & Lund, Peter, 2016. "Optimal and rule-based control strategies for energy flexibility in buildings with PV," Applied Energy, Elsevier, vol. 161(C), pages 425-436.
    18. Kang, Dongju & Kang, Doeun & Hwangbo, Sumin & Niaz, Haider & Lee, Won Bo & Liu, J. Jay & Na, Jonggeol, 2023. "Optimal planning of hybrid energy storage systems using curtailed renewable energy through deep reinforcement learning," Energy, Elsevier, vol. 284(C).
    19. Chen, Qi & Kuang, Zhonghong & Liu, Xiaohua & Zhang, Tao, 2022. "Energy storage to solve the diurnal, weekly, and seasonal mismatch and achieve zero-carbon electricity consumption in buildings," Applied Energy, Elsevier, vol. 312(C).
    20. Ren, Kezheng & Liu, Jun & Wu, Zeyang & Liu, Xinglei & Nie, Yongxin & Xu, Haitao, 2024. "A data-driven DRL-based home energy management system optimization framework considering uncertain household parameters," Applied Energy, Elsevier, vol. 355(C).
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Krystian Janusz Cieślak, 2024. "Profitability Analysis of a Prosumer Photovoltaic Installation in Light of Changing Electricity Billing Regulations in Poland," Energies, MDPI, vol. 17(15), pages 1-16, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Chen, Qi & Kuang, Zhonghong & Liu, Xiaohua & Zhang, Tao, 2024. "Optimal sizing and techno-economic analysis of the hybrid PV-battery-cooling storage system for commercial buildings in China," Applied Energy, Elsevier, vol. 355(C).
    2. Lu, Yu & Xiang, Yue & Huang, Yuan & Yu, Bin & Weng, Liguo & Liu, Junyong, 2023. "Deep reinforcement learning based optimal scheduling of active distribution system considering distributed generation, energy storage and flexible load," Energy, Elsevier, vol. 271(C).
    3. Zhang, Tianhao & Dong, Zhe & Huang, Xiaojin, 2024. "Multi-objective optimization of thermal power and outlet steam temperature for a nuclear steam supply system with deep reinforcement learning," Energy, Elsevier, vol. 286(C).
    4. Li, Yanxue & Wang, Zixuan & Xu, Wenya & Gao, Weijun & Xu, Yang & Xiao, Fu, 2023. "Modeling and energy dynamic control for a ZEH via hybrid model-based deep reinforcement learning," Energy, Elsevier, vol. 277(C).
    5. Wang, Hao & Chen, Xiwen & Vital, Natan & Duffy, Edward & Razi, Abolfazl, 2024. "Energy optimization for HVAC systems in multi-VAV open offices: A deep reinforcement learning approach," Applied Energy, Elsevier, vol. 356(C).
    6. Wenya Xu & Yanxue Li & Guanjie He & Yang Xu & Weijun Gao, 2023. "Performance Assessment and Comparative Analysis of Photovoltaic-Battery System Scheduling in an Existing Zero-Energy House Based on Reinforcement Learning Control," Energies, MDPI, vol. 16(13), pages 1-19, June.
    7. Han, Gwangwoo & Joo, Hong-Jin & Lim, Hee-Won & An, Young-Sub & Lee, Wang-Je & Lee, Kyoung-Ho, 2023. "Data-driven heat pump operation strategy using rainbow deep reinforcement learning for significant reduction of electricity cost," Energy, Elsevier, vol. 270(C).
    8. Gao, Yuan & Matsunami, Yuki & Miyata, Shohei & Akashi, Yasunori, 2022. "Multi-agent reinforcement learning dealing with hybrid action spaces: A case study for off-grid oriented renewable building energy system," Applied Energy, Elsevier, vol. 326(C).
    9. Yan Yang & Qingyu Wei & Shanke Liu & Liang Zhao, 2022. "Distribution Strategy Optimization of Standalone Hybrid WT/PV System Based on Different Solar and Wind Resources for Rural Applications," Energies, MDPI, vol. 15(14), pages 1-21, July.
    10. Tulika Saha & Sriparna Saha & Pushpak Bhattacharyya, 2020. "Towards sentiment aided dialogue policy learning for multi-intent conversations using hierarchical reinforcement learning," PLOS ONE, Public Library of Science, vol. 15(7), pages 1-28, July.
    11. Singh, Bharat & Kumar, Ashwani, 2023. "Optimal energy management and feasibility analysis of hybrid renewable energy sources with BESS and impact of electric vehicle load with demand response program," Energy, Elsevier, vol. 278(PA).
    12. Marchetti, Isabella & Rego, Erik Eduardo, 2022. "The impact of hourly pricing for renewable generation projects in Brazil," Renewable Energy, Elsevier, vol. 189(C), pages 601-617.
    13. Zhang, Haoran & Li, Ruixiong & Cai, Xingrui & Zheng, Chaoyue & Liu, Laibao & Liu, Maodian & Zhang, Qianru & Lin, Huiming & Chen, Long & Wang, Xuejun, 2022. "Do electricity flows hamper regional economic–environmental equity?," Applied Energy, Elsevier, vol. 326(C).
    14. Mahmoud Mahfouz & Angelos Filos & Cyrine Chtourou & Joshua Lockhart & Samuel Assefa & Manuela Veloso & Danilo Mandic & Tucker Balch, 2019. "On the Importance of Opponent Modeling in Auction Markets," Papers 1911.12816, arXiv.org.
    15. Imen Azzouz & Wiem Fekih Hassen, 2023. "Optimization of Electric Vehicles Charging Scheduling Based on Deep Reinforcement Learning: A Decentralized Approach," Energies, MDPI, vol. 16(24), pages 1-18, December.
    16. Sinha, Pankaj & Mathur, Kritika, 2016. "Empirical Analysis of Developments in the Day Ahead Electricity Markets in India," MPRA Paper 72969, University Library of Munich, Germany.
    17. Jun Maekawa & Koji Shimada, 2019. "A Speculative Trading Model for the Electricity Market: Based on Japan Electric Power Exchange," Energies, MDPI, vol. 12(15), pages 1-15, July.
    18. Jacob W. Crandall & Mayada Oudah & Tennom & Fatimah Ishowo-Oloko & Sherief Abdallah & Jean-François Bonnefon & Manuel Cebrian & Azim Shariff & Michael A. Goodrich & Iyad Rahwan, 2018. "Cooperating with machines," Nature Communications, Nature, vol. 9(1), pages 1-12, December.
      • Abdallah, Sherief & Bonnefon, Jean-François & Cebrian, Manuel & Crandall, Jacob W. & Ishowo-Oloko, Fatimah & Oudah, Mayada & Rahwan, Iyad & Shariff, Azim & Tennom,, 2017. "Cooperating with Machines," TSE Working Papers 17-806, Toulouse School of Economics (TSE).
      • Abdallah, Sherief & Bonnefon, Jean-François & Cebrian, Manuel & Crandall, Jacob W. & Ishowo-Oloko, Fatimah & Oudah, Mayada & Rahwan, Iyad & Shariff, Azim & Tennom,, 2017. "Cooperating with Machines," IAST Working Papers 17-68, Institute for Advanced Study in Toulouse (IAST).
      • Jacob Crandall & Mayada Oudah & Fatimah Ishowo-Oloko Tennom & Fatimah Ishowo-Oloko & Sherief Abdallah & Jean-François Bonnefon & Manuel Cebrian & Azim Shariff & Michael Goodrich & Iyad Rahwan, 2018. "Cooperating with machines," Post-Print hal-01897802, HAL.
    19. Sun, Alexander Y., 2020. "Optimal carbon storage reservoir management through deep reinforcement learning," Applied Energy, Elsevier, vol. 278(C).
    20. Liu, Hailiang & Andresen, Gorm Bruun & Greiner, Martin, 2018. "Cost-optimal design of a simplified highly renewable Chinese electricity network," Energy, Elsevier, vol. 147(C), pages 534-546.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:364:y:2024:i:c:s0306261924005464. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.