Application-oriented assessment of grid-connected PV-battery system with deep reinforcement learning in buildings considering electricity price dynamics
Suggested Citation
DOI: 10.1016/j.apenergy.2024.123163
References listed on IDEAS
- Yang, Hongxing & Wei, Zhou & Chengzhi, Lou, 2009. "Optimal design and techno-economic analysis of a hybrid solar-wind power generation system," Applied Energy, Elsevier, vol. 86(2), pages 163-169, February.
- Raviv, Eran & Bouwman, Kees E. & van Dijk, Dick, 2015. "Forecasting day-ahead electricity prices: Utilizing hourly prices," Energy Economics, Elsevier, vol. 50(C), pages 227-239.
- Eran Raviv & Kees E. Bouwman & Dick van Dijk, 2013. "Forecasting Day-Ahead Electricity Prices: Utilizing Hourly Prices," Tinbergen Institute Discussion Papers 13-068/III, Tinbergen Institute.
- Park, Jong-Whi & Ju, Young-Min & Kim, You-Gwon & Kim, Hak-Sung, 2023. "50% reduction in energy consumption in an actual cold storage facility using a deep reinforcement learning-based control algorithm," Applied Energy, Elsevier, vol. 352(C).
- Gao, Yuan & Matsunami, Yuki & Miyata, Shohei & Akashi, Yasunori, 2022. "Operational optimization for off-grid renewable building energy system using deep reinforcement learning," Applied Energy, Elsevier, vol. 325(C).
- Joanna Clarke & Justin Searle, 2021. "Active Building demonstrators for a low-carbon future," Nature Energy, Nature, vol. 6(12), pages 1087-1089, December.
- Kim, Donghun & Wang, Zhe & Brugger, James & Blum, David & Wetter, Michael & Hong, Tianzhen & Piette, Mary Ann, 2022. "Site demonstration and performance evaluation of MPC for a large chiller plant with TES for renewable energy integration and grid decarbonization," Applied Energy, Elsevier, vol. 321(C).
- Yu Qian Ang & Zachary Michael Berzolla & Samuel Letellier-Duchesne & Christoph F. Reinhart, 2023. "Carbon reduction technology pathways for existing buildings in eight cities," Nature Communications, Nature, vol. 14(1), pages 1-16, December.
- Yin, WanJun & Wen, Tao & Zhang, Chao, 2023. "Cooperative optimal scheduling strategy of electric vehicles based on dynamic electricity price mechanism," Energy, Elsevier, vol. 263(PA).
- Shi, Tao & Xu, Chang & Dong, Wenhao & Zhou, Hangyu & Bokhari, Awais & Klemeš, Jiří Jaromír & Han, Ning, 2023. "Research on energy management of hydrogen electric coupling system based on deep reinforcement learning," Energy, Elsevier, vol. 282(C).
- He, Gang & Kammen, Daniel M., 2016. "Where, when and how much solar is available? A provincial-scale solar resource assessment for China," Renewable Energy, Elsevier, vol. 85(C), pages 74-82.
- Salpakari, Jyri & Lund, Peter, 2016. "Optimal and rule-based control strategies for energy flexibility in buildings with PV," Applied Energy, Elsevier, vol. 161(C), pages 425-436.
- Edward A. Byers & Gemma Coxon & Jim Freer & Jim W. Hall, 2020. "Drought and climate change impacts on cooling water shortages and electricity prices in Great Britain," Nature Communications, Nature, vol. 11(1), pages 1-12, December.
- Kang, Hyuna & Jung, Seunghoon & Kim, Hakpyeong & Jeoung, Jaewon & Hong, Taehoon, 2024. "Reinforcement learning-based optimal scheduling model of battery energy storage system at the building level," Renewable and Sustainable Energy Reviews, Elsevier, vol. 190(PA).
- Kang, Dongju & Kang, Doeun & Hwangbo, Sumin & Niaz, Haider & Lee, Won Bo & Liu, J. Jay & Na, Jonggeol, 2023. "Optimal planning of hybrid energy storage systems using curtailed renewable energy through deep reinforcement learning," Energy, Elsevier, vol. 284(C).
- Volodymyr Mnih & Koray Kavukcuoglu & David Silver & Andrei A. Rusu & Joel Veness & Marc G. Bellemare & Alex Graves & Martin Riedmiller & Andreas K. Fidjeland & Georg Ostrovski & Stig Petersen & Charles Beattie, et al., 2015. "Human-level control through deep reinforcement learning," Nature, Nature, vol. 518(7540), pages 529-533, February.
- Kuang, Zhonghong & Chen, Qi & Yu, Yang, 2022. "Assessing the CO2-emission risk due to wind-energy uncertainty," Applied Energy, Elsevier, vol. 310(C).
- Iain Staffell & Stefan Pfenninger & Nathan Johnson, 2023. "A global model of hourly space heating and cooling demand at multiple spatial scales," Nature Energy, Nature, vol. 8(12), pages 1328-1344, December.
- Chen, Qi & Kuang, Zhonghong & Liu, Xiaohua & Zhang, Tao, 2022. "Energy storage to solve the diurnal, weekly, and seasonal mismatch and achieve zero-carbon electricity consumption in buildings," Applied Energy, Elsevier, vol. 312(C).
- Ren, Kezheng & Liu, Jun & Wu, Zeyang & Liu, Xinglei & Nie, Yongxin & Xu, Haitao, 2024. "A data-driven DRL-based home energy management system optimization framework considering uncertain household parameters," Applied Energy, Elsevier, vol. 355(C).
- Chen, Qi & Kuang, Zhonghong & Liu, Xiaohua & Zhang, Tao, 2024. "Optimal sizing and techno-economic analysis of the hybrid PV-battery-cooling storage system for commercial buildings in China," Applied Energy, Elsevier, vol. 355(C).
Citations
Citations are extracted by the CitEc Project.
Cited by:
- Krystian Janusz Cieślak, 2024. "Profitability Analysis of a Prosumer Photovoltaic Installation in Light of Changing Electricity Billing Regulations in Poland," Energies, MDPI, vol. 17(15), pages 1-16, July.
Most related items
These are the items that most often cite the same works as this one and are cited by the same works as this one.
- Chen, Qi & Kuang, Zhonghong & Liu, Xiaohua & Zhang, Tao, 2024. "Optimal sizing and techno-economic analysis of the hybrid PV-battery-cooling storage system for commercial buildings in China," Applied Energy, Elsevier, vol. 355(C).
- Lu, Yu & Xiang, Yue & Huang, Yuan & Yu, Bin & Weng, Liguo & Liu, Junyong, 2023. "Deep reinforcement learning based optimal scheduling of active distribution system considering distributed generation, energy storage and flexible load," Energy, Elsevier, vol. 271(C).
- Zhang, Tianhao & Dong, Zhe & Huang, Xiaojin, 2024. "Multi-objective optimization of thermal power and outlet steam temperature for a nuclear steam supply system with deep reinforcement learning," Energy, Elsevier, vol. 286(C).
- Li, Yanxue & Wang, Zixuan & Xu, Wenya & Gao, Weijun & Xu, Yang & Xiao, Fu, 2023. "Modeling and energy dynamic control for a ZEH via hybrid model-based deep reinforcement learning," Energy, Elsevier, vol. 277(C).
- Wang, Hao & Chen, Xiwen & Vital, Natan & Duffy, Edward & Razi, Abolfazl, 2024. "Energy optimization for HVAC systems in multi-VAV open offices: A deep reinforcement learning approach," Applied Energy, Elsevier, vol. 356(C).
- Yan Yang & Qingyu Wei & Shanke Liu & Liang Zhao, 2022. "Distribution Strategy Optimization of Standalone Hybrid WT/PV System Based on Different Solar and Wind Resources for Rural Applications," Energies, MDPI, vol. 15(14), pages 1-21, July.
- Wenya Xu & Yanxue Li & Guanjie He & Yang Xu & Weijun Gao, 2023. "Performance Assessment and Comparative Analysis of Photovoltaic-Battery System Scheduling in an Existing Zero-Energy House Based on Reinforcement Learning Control," Energies, MDPI, vol. 16(13), pages 1-19, June.
- Han, Gwangwoo & Joo, Hong-Jin & Lim, Hee-Won & An, Young-Sub & Lee, Wang-Je & Lee, Kyoung-Ho, 2023. "Data-driven heat pump operation strategy using rainbow deep reinforcement learning for significant reduction of electricity cost," Energy, Elsevier, vol. 270(C).
- Gao, Yuan & Matsunami, Yuki & Miyata, Shohei & Akashi, Yasunori, 2022. "Multi-agent reinforcement learning dealing with hybrid action spaces: A case study for off-grid oriented renewable building energy system," Applied Energy, Elsevier, vol. 326(C).
- Tulika Saha & Sriparna Saha & Pushpak Bhattacharyya, 2020. "Towards sentiment aided dialogue policy learning for multi-intent conversations using hierarchical reinforcement learning," PLOS ONE, Public Library of Science, vol. 15(7), pages 1-28, July.
- Mahmoud Mahfouz & Angelos Filos & Cyrine Chtourou & Joshua Lockhart & Samuel Assefa & Manuela Veloso & Danilo Mandic & Tucker Balch, 2019. "On the Importance of Opponent Modeling in Auction Markets," Papers 1911.12816, arXiv.org.
- Jun Maekawa & Koji Shimada, 2019. "A Speculative Trading Model for the Electricity Market: Based on Japan Electric Power Exchange," Energies, MDPI, vol. 12(15), pages 1-15, July.
- Liu, Hailiang & Andresen, Gorm Bruun & Greiner, Martin, 2018. "Cost-optimal design of a simplified highly renewable Chinese electricity network," Energy, Elsevier, vol. 147(C), pages 534-546.
- Woo Jae Byun & Bumkyu Choi & Seongmin Kim & Joohyun Jo, 2023. "Practical Application of Deep Reinforcement Learning to Optimal Trade Execution," FinTech, MDPI, vol. 2(3), pages 1-16, June.
- Yuhong Wang & Lei Chen & Hong Zhou & Xu Zhou & Zongsheng Zheng & Qi Zeng & Li Jiang & Liang Lu, 2021. "Flexible Transmission Network Expansion Planning Based on DQN Algorithm," Energies, MDPI, vol. 14(7), pages 1-21, April.
- Jing-Li Fan & Zezheng Li & Xi Huang & Kai Li & Xian Zhang & Xi Lu & Jianzhong Wu & Klaus Hubacek & Bo Shen, 2023. "A net-zero emissions strategy for China’s power sector using carbon-capture utilization and storage," Nature Communications, Nature, vol. 14(1), pages 1-16, December.
- Shangfeng Han & Baosheng Zhang & Xiaoyang Sun & Song Han & Mikael Höök, 2017. "China’s Energy Transition in the Power and Transport Sectors from a Substitution Perspective," Energies, MDPI, vol. 10(5), pages 1-25, April.
- Michelle M. LaMar, 2018. "Markov Decision Process Measurement Model," Psychometrika, Springer;The Psychometric Society, vol. 83(1), pages 67-88, March.
- Yang, Ting & Zhao, Liyuan & Li, Wei & Zomaya, Albert Y., 2021. "Dynamic energy dispatch strategy for integrated energy system based on improved deep reinforcement learning," Energy, Elsevier, vol. 235(C).
- Wang, Yi & Qiu, Dawei & Sun, Mingyang & Strbac, Goran & Gao, Zhiwei, 2023. "Secure energy management of multi-energy microgrid: A physical-informed safe reinforcement learning approach," Applied Energy, Elsevier, vol. 335(C).
More about this item
Keywords
Deep reinforcement learning; PV-battery system; Cost savings; Factor analysis; Application potential
Statistics
Access and download statistics
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:364:y:2024:i:c:s0306261924005464. See general information about how to correct material in RePEc.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.