Printed from https://ideas.repec.org/a/eee/appene/v301y2021ics0306261921008874.html

Deep reinforcement learning control of electric vehicle charging in the presence of photovoltaic generation

Author

Listed:
  • Dorokhova, Marina
  • Martinson, Yann
  • Ballif, Christophe
  • Wyrsch, Nicolas

Abstract

In recent years, the importance of electric mobility has increased in response to climate change. The fast-growing deployment of electric vehicles (EVs) worldwide is expected to decrease transportation-related CO2 emissions, facilitate the integration of renewables, and support the grid through demand–response services. Simultaneously, inadequate EV charging patterns can lead to undesirable effects in grid operation, such as high peak-loads or low self-consumption of solar electricity, thus calling for novel methods of control. This work focuses on applying deep reinforcement learning (RL) to the EV charging control problem with the objectives of increasing photovoltaic self-consumption and the EV state of charge at departure. In particular, we propose mathematical formulations of environments with discrete, continuous, and parametrized action spaces, along with the respective deep RL algorithms to resolve them. Benchmarking the deep RL control against naive, rule-based, deterministic optimization, and model-predictive control demonstrates that the suggested methodology can produce consistent and employable EV charging strategies, while its performance holds great promise for real-time implementations.
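The control problem described in the abstract can be made concrete with a toy RL environment. The sketch below is illustrative only: the dynamics, constants, observation layout, and reward shape are assumptions for demonstration, not the authors' formulation. It uses a discrete action space (idle, half power, full power), rewards charging from photovoltaic (PV) generation, and applies a terminal penalty when the departure state-of-charge (SOC) target is missed.

```python
# Illustrative sketch (not the paper's implementation): a minimal EV-charging
# environment with a discrete action space. All names, constants, and the
# reward shape are assumptions for demonstration only.

class EVChargingEnv:
    """Toy episodic environment: charge an EV over len(pv_profile) timesteps.
    Reward favours PV self-consumption, penalises grid imports, and adds a
    terminal penalty for an SOC shortfall at departure."""

    def __init__(self, pv_profile, battery_capacity=40.0, max_charge_power=7.0,
                 target_soc=0.8, dt=1.0):
        self.pv = pv_profile              # kW available from PV at each step
        self.capacity = battery_capacity  # kWh
        self.p_max = max_charge_power     # kW
        self.target_soc = target_soc      # desired state of charge at departure
        self.dt = dt                      # hours per step
        self.reset()

    def reset(self, soc=0.2):
        self.t = 0
        self.soc = soc
        return (self.t, self.soc, self.pv[0])  # observation: (time, SOC, PV power)

    def step(self, action):
        """action in {0, 1, 2}: idle, charge at half power, charge at full power."""
        power = {0: 0.0, 1: 0.5 * self.p_max, 2: self.p_max}[action]
        energy = power * self.dt                         # kWh drawn this step
        pv_used = min(energy, self.pv[self.t] * self.dt) # covered by PV
        grid_used = energy - pv_used                     # imported from the grid
        self.soc = min(1.0, self.soc + energy / self.capacity)
        # Reward PV self-consumption, penalise grid imports.
        reward = pv_used - 0.5 * grid_used
        self.t += 1
        done = self.t >= len(self.pv)
        if done:
            # Terminal penalty for missing the departure SOC target.
            reward -= 10.0 * max(0.0, self.target_soc - self.soc)
        obs = (self.t, self.soc, self.pv[self.t] if not done else 0.0)
        return obs, reward, done


# A greedy rule-based rollout, in the spirit of the rule-based baseline the
# abstract benchmarks against: charge at full power whenever PV is available.
env = EVChargingEnv(pv_profile=[0.0, 2.0, 5.0, 6.0, 4.0, 1.0, 0.0])
obs, total, done = env.reset(), 0.0, False
while not done:
    action = 2 if obs[2] > 0 else 0
    obs, reward, done = env.step(action)
    total += reward
```

An RL agent (e.g. DQN for this discrete case) would replace the hand-written rule, learning when grid imports are worth the terminal SOC reward; the paper's continuous and parametrized action-space variants would instead select the charging power directly.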

Suggested Citation

  • Dorokhova, Marina & Martinson, Yann & Ballif, Christophe & Wyrsch, Nicolas, 2021. "Deep reinforcement learning control of electric vehicle charging in the presence of photovoltaic generation," Applied Energy, Elsevier, vol. 301(C).
  • Handle: RePEc:eee:appene:v:301:y:2021:i:c:s0306261921008874
    DOI: 10.1016/j.apenergy.2021.117504

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261921008874
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2021.117504?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Sunyong Kim & Hyuk Lim, 2018. "Reinforcement Learning Based Energy Management Algorithm for Smart Energy Buildings," Energies, MDPI, vol. 11(8), pages 1-19, August.
    2. Kathirgamanathan, Anjukan & De Rosa, Mattia & Mangina, Eleni & Finn, Donal P., 2021. "Data-driven predictive control for unlocking building energy flexibility: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 135(C).
    3. Xiaohan Fang & Jinkuan Wang & Guanru Song & Yinghua Han & Qiang Zhao & Zhiao Cao, 2019. "Multi-Agent Reinforcement Learning Approach for Residential Microgrid Energy Scheduling," Energies, MDPI, vol. 13(1), pages 1-26, December.
    4. Luthander, Rasmus & Widén, Joakim & Nilsson, Daniel & Palm, Jenny, 2015. "Photovoltaic self-consumption in buildings: A review," Applied Energy, Elsevier, vol. 142(C), pages 80-94.
    5. Vázquez-Canteli, José R. & Nagy, Zoltán, 2019. "Reinforcement learning for demand response: A review of algorithms and modeling techniques," Applied Energy, Elsevier, vol. 235(C), pages 1072-1089.
    6. Jaehyun Lee & Eunjung Lee & Jinho Kim, 2020. "Electric Vehicle Charging and Discharging Algorithm Based on Reinforcement Learning with Data-Driven Approach in Dynamic Pricing Scheme," Energies, MDPI, vol. 13(8), pages 1-18, April.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Liu, Xiaochen & Fu, Zhi & Qiu, Siyuan & Zhang, Tao & Li, Shaojie & Yang, Zhi & Liu, Xiaohua & Jiang, Yi, 2023. "Charging private electric vehicles solely by photovoltaics: A battery-free direct-current microgrid with distributed charging strategy," Applied Energy, Elsevier, vol. 341(C).
    2. Sun, Chuyu & Zhao, Xiaoli & Qi, Binbin & Xiao, Weihao & Zhang, Hongjun, 2022. "Economic and environmental analysis of coupled PV-energy storage-charging station considering location and scale," Applied Energy, Elsevier, vol. 328(C).
    3. Xiaohan Fang & Moran Xu & Yuan Fan, 2024. "SOC - SOH Estimation and Balance Control Based on Event-Triggered Distributed Optimal Kalman Consensus Filter," Energies, MDPI, vol. 17(3), pages 1-19, January.
    4. Liu, Junling & Li, Mengyue & Xue, Liya & Kobashi, Takuro, 2022. "A framework to evaluate the energy-environment-economic impacts of developing rooftop photovoltaics integrated with electric vehicles at city level," Renewable Energy, Elsevier, vol. 200(C), pages 647-657.
    5. Fachrizal, Reza & Shepero, Mahmoud & Åberg, Magnus & Munkhammar, Joakim, 2022. "Optimal PV-EV sizing at solar powered workplace charging stations with smart charging schemes considering self-consumption and self-sufficiency balance," Applied Energy, Elsevier, vol. 307(C).
    6. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    7. Liu, Xiaochen & Fu, Zhi & Qiu, Siyuan & Li, Shaojie & Zhang, Tao & Liu, Xiaohua & Jiang, Yi, 2023. "Building-centric investigation into electric vehicle behavior: A survey-based simulation method for charging system design," Energy, Elsevier, vol. 271(C).
    8. Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
    9. Demir, Sumeyra & Stappers, Bart & Kok, Koen & Paterakis, Nikolaos G., 2022. "Statistical arbitrage trading on the intraday market using the asynchronous advantage actor–critic method," Applied Energy, Elsevier, vol. 314(C).
    10. Qiu, Dawei & Dong, Zihang & Zhang, Xi & Wang, Yi & Strbac, Goran, 2022. "Safe reinforcement learning for real-time automatic control in a smart energy-hub," Applied Energy, Elsevier, vol. 309(C).
    11. Zhu, Rui & Kondor, Dániel & Cheng, Cheng & Zhang, Xiaohu & Santi, Paolo & Wong, Man Sing & Ratti, Carlo, 2022. "Solar photovoltaic generation for charging shared electric scooters," Applied Energy, Elsevier, vol. 313(C).
    12. Ming, Fangzhu & Gao, Feng & Liu, Kun & Li, Xingqi, 2023. "A constrained DRL-based bi-level coordinated method for large-scale EVs charging," Applied Energy, Elsevier, vol. 331(C).
    13. Marina Dorokhova & Jérémie Vianin & Jean-Marie Alder & Christophe Ballif & Nicolas Wyrsch & David Wannier, 2021. "A Blockchain-Supported Framework for Charging Management of Electric Vehicles," Energies, MDPI, vol. 14(21), pages 1-32, November.
    14. Homod, Raad Z. & Togun, Hussein & Kadhim Hussein, Ahmed & Noraldeen Al-Mousawi, Fadhel & Yaseen, Zaher Mundher & Al-Kouz, Wael & Abd, Haider J. & Alawi, Omer A. & Goodarzi, Marjan & Hussein, Omar A., 2022. "Dynamics analysis of a novel hybrid deep clustering for unsupervised learning by reinforcement of multi-agent to energy saving in intelligent buildings," Applied Energy, Elsevier, vol. 313(C).
    15. Dimitrios Vamvakas & Panagiotis Michailidis & Christos Korkas & Elias Kosmatopoulos, 2023. "Review and Evaluation of Reinforcement Learning Frameworks on Smart Grid Applications," Energies, MDPI, vol. 16(14), pages 1-38, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    2. Gokhale, Gargya & Claessens, Bert & Develder, Chris, 2022. "Physics informed neural networks for control oriented thermal modeling of buildings," Applied Energy, Elsevier, vol. 314(C).
    3. Vašak, Mario & Banjac, Anita & Hure, Nikola & Novak, Hrvoje & Kovačević, Marko, 2023. "Predictive control based assessment of building demand flexibility in fixed time windows," Applied Energy, Elsevier, vol. 329(C).
    4. Hernandez-Matheus, Alejandro & Löschenbrand, Markus & Berg, Kjersti & Fuchs, Ida & Aragüés-Peñalba, Mònica & Bullich-Massagué, Eduard & Sumper, Andreas, 2022. "A systematic review of machine learning techniques related to local energy communities," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    5. Langer, Lissy & Volling, Thomas, 2020. "An optimal home energy management system for modulating heat pumps and photovoltaic systems," Applied Energy, Elsevier, vol. 278(C).
    6. Lilia Tightiz & Joon Yoo, 2022. "A Review on a Data-Driven Microgrid Management System Integrating an Active Distribution Network: Challenges, Issues, and New Trends," Energies, MDPI, vol. 15(22), pages 1-24, November.
    7. Svetozarevic, B. & Baumann, C. & Muntwiler, S. & Di Natale, L. & Zeilinger, M.N. & Heer, P., 2022. "Data-driven control of room temperature and bidirectional EV charging using deep reinforcement learning: Simulations and experiments," Applied Energy, Elsevier, vol. 307(C).
    8. Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
    9. Van-Hai Bui & Akhtar Hussain & Hak-Man Kim, 2019. "Q-Learning-Based Operation Strategy for Community Battery Energy Storage System (CBESS) in Microgrid System," Energies, MDPI, vol. 12(9), pages 1-17, May.
    10. Jin, Ruiyang & Zhou, Yuke & Lu, Chao & Song, Jie, 2022. "Deep reinforcement learning-based strategy for charging station participating in demand response," Applied Energy, Elsevier, vol. 328(C).
    11. Federica Cucchiella & Idiano D’Adamo & Paolo Rosa, 2015. "Industrial Photovoltaic Systems: An Economic Analysis in Non-Subsidized Electricity Markets," Energies, MDPI, vol. 8(11), pages 1-16, November.
    12. Byungsung Lee & Haesung Lee & Hyun Ahn, 2020. "Improving Load Forecasting of Electric Vehicle Charging Stations Through Missing Data Imputation," Energies, MDPI, vol. 13(18), pages 1-15, September.
    13. Harri Aaltonen & Seppo Sierla & Rakshith Subramanya & Valeriy Vyatkin, 2021. "A Simulation Environment for Training a Reinforcement Learning Agent Trading a Battery Storage," Energies, MDPI, vol. 14(17), pages 1-20, September.
    14. Imen Azzouz & Wiem Fekih Hassen, 2023. "Optimization of Electric Vehicles Charging Scheduling Based on Deep Reinforcement Learning: A Decentralized Approach," Energies, MDPI, vol. 16(24), pages 1-18, December.
    15. Martín Pensado-Mariño & Lara Febrero-Garrido & Pablo Eguía-Oller & Enrique Granada-Álvarez, 2021. "Feasibility of Different Weather Data Sources Applied to Building Indoor Temperature Estimation Using LSTM Neural Networks," Sustainability, MDPI, vol. 13(24), pages 1-15, December.
    16. Shafqat Jawad & Junyong Liu, 2020. "Electrical Vehicle Charging Services Planning and Operation with Interdependent Power Networks and Transportation Networks: A Review of the Current Scenario and Future Trends," Energies, MDPI, vol. 13(13), pages 1-24, July.
    17. Alqahtani, Mohammed & Hu, Mengqi, 2022. "Dynamic energy scheduling and routing of multiple electric vehicles using deep reinforcement learning," Energy, Elsevier, vol. 244(PA).
    18. Reza Fachrizal & Joakim Munkhammar, 2020. "Improved Photovoltaic Self-Consumption in Residential Buildings with Distributed and Centralized Smart Charging of Electric Vehicles," Energies, MDPI, vol. 13(5), pages 1-19, March.
    19. Klein, Martin & Deissenroth, Marc, 2017. "When do households invest in solar photovoltaics? An application of prospect theory," Energy Policy, Elsevier, vol. 109(C), pages 270-278.
    20. Bernadette Fina & Hans Auer, 2020. "Economic Viability of Renewable Energy Communities under the Framework of the Renewable Energy Directive Transposed to Austrian Law," Energies, MDPI, vol. 13(21), pages 1-31, November.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:301:y:2021:i:c:s0306261921008874. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.