
A deep reinforcement learning approach for power management of battery-assisted fast-charging EV hubs participating in day-ahead and real-time electricity markets

Author

Listed:
  • Paudel, Diwas
  • Das, Tapas K.

Abstract

Publicly available electric vehicle charging hubs are expected to grow to meet the increasing charging demand of EVs. A dominant class of these will be fast-charging hubs, where EVs arrive for charging at all hours of the day, get the requested charge, and leave promptly. The profitability of these fast-charging hubs will be highly dependent on the variation of day-ahead electricity prices, the volatility of the real-time power market, and the randomness of EV charging demand. The hubs can hedge against these uncertainties by committing power purchases in the day-ahead electricity market and by adopting dynamic real-time power management strategies. We develop a novel two-step methodology. The first step entails a mixed integer linear program (MILP) that assists the hubs in their day-ahead power commitment. The second step employs a Markov decision process (MDP) model that derives the real-time power management control actions. The MILP is solved using a commercial solver and the MDP is solved using a deep reinforcement learning algorithm. We demonstrate the effectiveness of our methodology for a fast-charging hub, housing 150 charging stations and a battery storage system, that operates in the Pennsylvania-New Jersey-Maryland interconnection (PJM) power grid.
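The two-step structure described in the abstract can be illustrated with a deliberately simplified sketch. This is not the authors' model: the paper's MILP is replaced here by a greedy heuristic that pre-buys battery energy in the cheapest hours, and the deep-reinforcement-learning policy is replaced by a threshold rule. All function names, the 4-hour charging window, and the numbers are illustrative assumptions.

```python
def day_ahead_commitment(prices, forecast_demand, battery_capacity):
    """Step 1 stand-in: instead of the paper's MILP, a greedy heuristic
    that shifts battery charging into the 4 cheapest hours of the day."""
    cheapest = sorted(range(24), key=lambda h: prices[h])[:4]
    commitment = list(forecast_demand)
    per_hour = battery_capacity / 4
    for h in cheapest:
        commitment[h] += per_hour  # pre-buy extra energy to store in the battery
    return commitment


def real_time_action(committed, actual_demand, soc, battery_capacity):
    """Step 2 stand-in: a threshold rule in place of the DRL policy.
    Positive return = discharge the battery, negative = charge it."""
    gap = actual_demand - committed           # shortfall vs. day-ahead purchase
    if gap > 0:                               # demand exceeds the commitment
        return min(gap, soc)                  # cover the shortfall from the battery
    return max(gap, soc - battery_capacity)   # store the surplus, capacity permitting
```

In the paper, the first step would instead solve an optimization over price and demand forecasts, and the second would be a learned policy reacting to real-time prices and hub state; the sketch only shows how the two decisions layer on each other.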

Suggested Citation

  • Paudel, Diwas & Das, Tapas K., 2023. "A deep reinforcement learning approach for power management of battery-assisted fast-charging EV hubs participating in day-ahead and real-time electricity markets," Energy, Elsevier, vol. 283(C).
  • Handle: RePEc:eee:energy:v:283:y:2023:i:c:s036054422302491x
    DOI: 10.1016/j.energy.2023.129097

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S036054422302491X
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.energy.2023.129097?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Zhao, Zhonghao & Lee, Carman K.M. & Huo, Jiage, 2023. "EV charging station deployment on coupled transportation and power distribution networks via reinforcement learning," Energy, Elsevier, vol. 267(C).
    2. Subramanian, Vignesh & Das, Tapas K., 2019. "A two-layer model for dynamic pricing of electricity and optimal charging of electric vehicles under price spikes," Energy, Elsevier, vol. 167(C), pages 1266-1277.
    3. Rehman, Waqas ur & Bo, Rui & Mehdipourpicha, Hossein & Kimball, Jonathan W., 2022. "Sizing battery energy storage and PV system in an extreme fast charging station considering uncertainties and battery degradation," Applied Energy, Elsevier, vol. 313(C).
    4. Melendez, Kevin A. & Das, Tapas K. & Kwon, Changhyun, 2020. "Optimal operation of a system of charging hubs and a fleet of shared autonomous electric vehicles," Applied Energy, Elsevier, vol. 279(C).
    5. Touzani, Samir & Prakash, Anand Krishnan & Wang, Zhe & Agarwal, Shreya & Pritoni, Marco & Kiran, Mariam & Brown, Richard & Granderson, Jessica, 2021. "Controlling distributed energy resources via deep reinforcement learning for load flexibility and energy efficiency," Applied Energy, Elsevier, vol. 304(C).
    6. Harrold, Daniel J.B. & Cao, Jun & Fan, Zhong, 2022. "Renewable energy integration and microgrid energy trading using multi-agent deep reinforcement learning," Applied Energy, Elsevier, vol. 318(C).
    7. Lai, Chun Sing & Chen, Dashen & Zhang, Jinning & Zhang, Xin & Xu, Xu & Taylor, Gareth A. & Lai, Loi Lei, 2022. "Profit maximization for large-scale energy storage systems to enable fast EV charging infrastructure in distribution networks," Energy, Elsevier, vol. 259(C).
    8. Melendez, Kevin A. & Subramanian, Vignesh & Das, Tapas K. & Kwon, Changhyun, 2019. "Empowering end-use consumers of electricity to aggregate for demand-side participation," Applied Energy, Elsevier, vol. 248(C), pages 372-382.
    9. Zareipour, Hamidreza & Bhattacharya, Kankar & Canizares, Claudio A., 2007. "Electricity market price volatility: The case of Ontario," Energy Policy, Elsevier, vol. 35(9), pages 4739-4748, September.
    10. Zheng, Yanchong & Yu, Hang & Shao, Ziyun & Jian, Linni, 2020. "Day-ahead bidding strategy for electric vehicle aggregator enabling multiple agent modes in uncertain electricity markets," Applied Energy, Elsevier, vol. 280(C).
    11. Tuchnitz, Felix & Ebell, Niklas & Schlund, Jonas & Pruckner, Marco, 2021. "Development and Evaluation of a Smart Charging Strategy for an Electric Vehicle Fleet Based on Reinforcement Learning," Applied Energy, Elsevier, vol. 285(C).
    12. Alqahtani, Mohammed & Hu, Mengqi, 2022. "Dynamic energy scheduling and routing of multiple electric vehicles using deep reinforcement learning," Energy, Elsevier, vol. 244(PA).
    13. Lee, Sangyoon & Choi, Dae-Hyun, 2021. "Dynamic pricing and energy management for profit maximization in multiple smart electric vehicle charging stations: A privacy-preserving deep reinforcement learning approach," Applied Energy, Elsevier, vol. 304(C).
    14. Elma, Onur, 2020. "A dynamic charging strategy with hybrid fast charging station for electric vehicles," Energy, Elsevier, vol. 202(C).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zhao, Zhonghao & Lee, Carman K.M. & Huo, Jiage, 2023. "EV charging station deployment on coupled transportation and power distribution networks via reinforcement learning," Energy, Elsevier, vol. 267(C).
    2. Subramanian, Vignesh & Feijoo, Felipe & Sankaranarayanan, Sriram & Melendez, Kevin & Das, Tapas K., 2022. "A bilevel conic optimization model for routing and charging of EV fleets serving long distance delivery networks," Energy, Elsevier, vol. 251(C).
    3. Zhao, Zhonghao & Lee, Carman K.M. & Ren, Jingzheng, 2024. "A two-level charging scheduling method for public electric vehicle charging stations considering heterogeneous demand and nonlinear charging profile," Applied Energy, Elsevier, vol. 355(C).
    4. Pegah Alaee & Julius Bems & Amjad Anvari-Moghaddam, 2023. "A Review of the Latest Trends in Technical and Economic Aspects of EV Charging Management," Energies, MDPI, vol. 16(9), pages 1-28, April.
    5. Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
    6. Jin, Ruiyang & Zhou, Yuke & Lu, Chao & Song, Jie, 2022. "Deep reinforcement learning-based strategy for charging station participating in demand response," Applied Energy, Elsevier, vol. 328(C).
    7. Melendez, Kevin A. & Das, Tapas K. & Kwon, Changhyun, 2020. "Optimal operation of a system of charging hubs and a fleet of shared autonomous electric vehicles," Applied Energy, Elsevier, vol. 279(C).
    8. Anis ur Rehman & Muhammad Ali & Sheeraz Iqbal & Aqib Shafiq & Nasim Ullah & Sattam Al Otaibi, 2022. "Artificial Intelligence-Based Control and Coordination of Multiple PV Inverters for Reactive Power/Voltage Control of Power Distribution Networks," Energies, MDPI, vol. 15(17), pages 1-13, August.
    9. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    10. Homod, Raad Z. & Togun, Hussein & Kadhim Hussein, Ahmed & Noraldeen Al-Mousawi, Fadhel & Yaseen, Zaher Mundher & Al-Kouz, Wael & Abd, Haider J. & Alawi, Omer A. & Goodarzi, Marjan & Hussein, Omar A., 2022. "Dynamics analysis of a novel hybrid deep clustering for unsupervised learning by reinforcement of multi-agent to energy saving in intelligent buildings," Applied Energy, Elsevier, vol. 313(C).
    11. Zhang, Shulei & Jia, Runda & Pan, Hengxin & Cao, Yankai, 2023. "A safe reinforcement learning-based charging strategy for electric vehicles in residential microgrid," Applied Energy, Elsevier, vol. 348(C).
    12. Tepe, Benedikt & Figgener, Jan & Englberger, Stefan & Sauer, Dirk Uwe & Jossen, Andreas & Hesse, Holger, 2022. "Optimal pool composition of commercial electric vehicles in V2G fleet operation of various electricity markets," Applied Energy, Elsevier, vol. 308(C).
    13. Yin, Rumeng & He, Jiang, 2023. "Design of a photovoltaic electric bike battery-sharing system in public transit stations," Applied Energy, Elsevier, vol. 332(C).
    14. Samuel M. Muhindo & Roland P. Malhamé & Geza Joos, 2021. "A Novel Mean Field Game-Based Strategy for Charging Electric Vehicles in Solar Powered Parking Lots," Energies, MDPI, vol. 14(24), pages 1-21, December.
    15. Li, Yanxue & Wang, Zixuan & Xu, Wenya & Gao, Weijun & Xu, Yang & Xiao, Fu, 2023. "Modeling and energy dynamic control for a ZEH via hybrid model-based deep reinforcement learning," Energy, Elsevier, vol. 277(C).
    16. Zhang, Bin & Hu, Weihao & Ghias, Amer M.Y.M. & Xu, Xiao & Chen, Zhe, 2022. "Multi-agent deep reinforcement learning-based coordination control for grid-aware multi-buildings," Applied Energy, Elsevier, vol. 328(C).
    17. Daniel Manfre Jaimes & Manuel Zamudio López & Hamidreza Zareipour & Mike Quashie, 2023. "A Hybrid Model for Multi-Day-Ahead Electricity Price Forecasting considering Price Spikes," Forecasting, MDPI, vol. 5(3), pages 1-23, July.
    18. Vinyals, Meritxell, 2021. "Scalable multi-agent local energy trading — Meeting regulatory compliance and validation in the Cardiff grid," Applied Energy, Elsevier, vol. 298(C).
    19. Seongwoo Lee & Joonho Seon & Byungsun Hwang & Soohyun Kim & Youngghyu Sun & Jinyoung Kim, 2024. "Recent Trends and Issues of Energy Management Systems Using Machine Learning," Energies, MDPI, vol. 17(3), pages 1-24, January.
    20. Sohani, Ali & Cornaro, Cristina & Shahverdian, Mohammad Hassan & Moser, David & Pierro, Marco & Olabi, Abdul Ghani & Karimi, Nader & Nižetić, Sandro & Li, Larry K.B. & Doranehgard, Mohammad Hossein, 2023. "Techno-economic evaluation of a hybrid photovoltaic system with hot/cold water storage for poly-generation in a residential building," Applied Energy, Elsevier, vol. 331(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:energy:v:283:y:2023:i:c:s036054422302491x. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/energy.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.