Printed from https://ideas.repec.org/a/gam/jeners/v14y2021i17p5587-d630250.html

A Simulation Environment for Training a Reinforcement Learning Agent Trading a Battery Storage

Author

Listed:
  • Harri Aaltonen

    (Department of Electrical Engineering and Automation, School of Electrical Engineering, Aalto University, FI-00076 Espoo, Finland)

  • Seppo Sierla

    (Department of Electrical Engineering and Automation, School of Electrical Engineering, Aalto University, FI-00076 Espoo, Finland)

  • Rakshith Subramanya

    (Department of Electrical Engineering and Automation, School of Electrical Engineering, Aalto University, FI-00076 Espoo, Finland)

  • Valeriy Vyatkin

    (Department of Electrical Engineering and Automation, School of Electrical Engineering, Aalto University, FI-00076 Espoo, Finland
    Department of Computer Science, Electrical and Space Engineering, Luleå University of Technology, 97187 Luleå, Sweden
    International Research Laboratory of Computer Technologies, ITMO University, 197101 St. Petersburg, Russia)

Abstract

Battery storages are an essential element of the emerging smart grid. Compared to other distributed intelligent energy resources, batteries have the advantage of being able to rapidly react to events such as renewable generation fluctuations or grid disturbances. There is a lack of research on ways to profitably exploit this ability. Any solution needs to consider rapid electrical phenomena as well as the much slower dynamics of relevant electricity markets. Reinforcement learning is a branch of artificial intelligence that has shown promise in optimizing complex problems involving uncertainty. This article applies reinforcement learning to the problem of trading battery storage capacity. The problem involves two timescales, both of which are important for profitability. Firstly, trading the battery capacity must occur on the timescale of the chosen electricity markets. Secondly, the real-time operation of the battery must ensure that no financial penalties are incurred from failing to meet the technical specification. The trading-related decisions must be made under uncertainty, such as unknown future market prices and unpredictable power grid disturbances. In this article, a simulation model of a battery system is proposed as the environment for training a reinforcement learning agent to make such decisions. The system is demonstrated with an application of the battery to Finnish primary frequency reserve markets.
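The two-timescale structure described in the abstract can be illustrated with a minimal, self-contained environment sketch. This is not the authors' simulation model; all class names, prices, penalty values, and state-of-charge dynamics below are illustrative assumptions chosen only to show how an hourly market decision (the bid) can wrap a faster inner loop of real-time battery operation with penalties for violating the technical specification:

```python
import random

class BatteryReserveEnv:
    """Toy two-timescale environment sketch (illustrative, not the paper's model).

    Outer timescale: hourly bids of battery capacity to a frequency reserve market.
    Inner timescale: per-minute state-of-charge (SoC) updates while delivering
    the reserve; leaving the allowed SoC band incurs a financial penalty.
    """

    def __init__(self, capacity_kwh=100.0, seed=0):
        self.capacity = capacity_kwh
        self.rng = random.Random(seed)
        self.reset()

    def reset(self):
        self.soc = 0.5  # state of charge as a fraction of capacity
        return self.soc

    def step(self, bid_kw):
        """One market period: earn reserve fees, then simulate fast SoC dynamics."""
        price = self.rng.uniform(10.0, 30.0)  # assumed EUR/MW/h reserve price
        reward = bid_kw / 1000.0 * price      # revenue for the committed capacity
        # Inner loop: 60 one-minute grid-frequency deviations drive charge flow.
        for _ in range(60):
            activation = self.rng.uniform(-1.0, 1.0)  # fraction of bid activated
            self.soc += activation * bid_kw / 60.0 / self.capacity
            if not 0.1 <= self.soc <= 0.9:  # outside the assumed safe SoC band
                reward -= 5.0               # assumed penalty for non-delivery
                self.soc = min(max(self.soc, 0.1), 0.9)
        return self.soc, reward

# Usage: a fixed-bid policy over one simulated day; an RL agent would instead
# choose bid_kw from the observed SoC and market state.
env = BatteryReserveEnv()
soc = env.reset()
daily_revenue = sum(env.step(bid_kw=20.0)[1] for _ in range(24))
```

The key design point the abstract raises survives even in this toy: the reward seen by the agent mixes slow market revenue with fast operational penalties, so a profitable bidding policy must account for both timescales at once.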

Suggested Citation

  • Harri Aaltonen & Seppo Sierla & Rakshith Subramanya & Valeriy Vyatkin, 2021. "A Simulation Environment for Training a Reinforcement Learning Agent Trading a Battery Storage," Energies, MDPI, vol. 14(17), pages 1-20, September.
  • Handle: RePEc:gam:jeners:v:14:y:2021:i:17:p:5587-:d:630250

    Download full text from publisher

    File URL: https://www.mdpi.com/1996-1073/14/17/5587/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/1996-1073/14/17/5587/
    Download Restriction: no

    References listed on IDEAS

    1. Tsianikas, Stamatis & Yousefi, Nooshin & Zhou, Jian & Rodgers, Mark D. & Coit, David, 2021. "A storage expansion planning framework using reinforcement learning and simulation-based optimization," Applied Energy, Elsevier, vol. 290(C).
    2. Sunyong Kim & Hyuk Lim, 2018. "Reinforcement Learning Based Energy Management Algorithm for Smart Energy Buildings," Energies, MDPI, vol. 11(8), pages 1-19, August.
    3. Herre, Lars & Tomasini, Federica & Paridari, Kaveh & Söder, Lennart & Nordström, Lars, 2020. "Simplified model of integrated paper mill for optimal bidding in energy and reserve markets," Applied Energy, Elsevier, vol. 279(C).
    4. Bialek, Janusz, 2020. "What does the GB power outage on 9 August 2019 tell us about the current state of decarbonised power systems?," Energy Policy, Elsevier, vol. 146(C).
    5. Rakshith Subramanya & Matti Yli-Ojanperä & Seppo Sierla & Taneli Hölttä & Jori Valtakari & Valeriy Vyatkin, 2021. "A Virtual Power Plant Solution for Aggregating Photovoltaic Systems and Other Distributed Energy Resources for Northern European Primary Frequency Reserves," Energies, MDPI, vol. 14(5), pages 1-23, February.
    6. Grace Muriithi & Sunetra Chowdhury, 2021. "Optimal Energy Management of a Grid-Tied Solar PV-Battery Microgrid: A Reinforcement Learning Approach," Energies, MDPI, vol. 14(9), pages 1-24, May.
    7. Yu Sui & Shiming Song, 2020. "A Multi-Agent Reinforcement Learning Framework for Lithium-ion Battery Scheduling Problems," Energies, MDPI, vol. 13(8), pages 1-13, April.
    8. Evgeny Nefedov & Seppo Sierla & Valeriy Vyatkin, 2018. "Internet of Energy Approach for Sustainable Use of Electric Vehicles as Energy Storage of Prosumer Buildings," Energies, MDPI, vol. 11(8), pages 1-18, August.
    9. Jin-Gyeom Kim & Bowon Lee, 2020. "Automatic P2P Energy Trading Model Based on Reinforcement Learning Using Long Short-Term Delayed Reward," Energies, MDPI, vol. 13(20), pages 1-27, October.
    10. Malik, Anam & Ravishankar, Jayashri, 2018. "A hybrid control approach for regulating frequency through demand response," Applied Energy, Elsevier, vol. 210(C), pages 1347-1362.
    11. Loukatou, Angeliki & Johnson, Paul & Howell, Sydney & Duck, Peter, 2021. "Optimal valuation of wind energy projects co-located with battery storage," Applied Energy, Elsevier, vol. 283(C).
    12. Bialek, J., 2020. "What does the power outage on 9 August 2019 tell us about GB power system," Cambridge Working Papers in Economics 2018, Faculty of Economics, University of Cambridge.
    13. Sepúlveda-Mora, Sergio B. & Hegedus, Steven, 2021. "Making the case for time-of-use electric rates to boost the value of battery storage in commercial buildings with grid connected PV systems," Energy, Elsevier, vol. 218(C).
    14. Denis Sidorov & Daniil Panasetsky & Nikita Tomin & Dmitriy Karamov & Aleksei Zhukov & Ildar Muftahov & Aliona Dreglea & Fang Liu & Yong Li, 2020. "Toward Zero-Emission Hybrid AC/DC Power Systems with Renewable Energy Sources and Storages: A Case Study from Lake Baikal Region," Energies, MDPI, vol. 13(5), pages 1-18, March.
    15. Killer, Marvin & Farrokhseresht, Mana & Paterakis, Nikolaos G., 2020. "Implementation of large-scale Li-ion battery energy storage systems within the EMEA region," Applied Energy, Elsevier, vol. 260(C).
    16. Brida V. Mbuwir & Frederik Ruelens & Fred Spiessens & Geert Deconinck, 2017. "Battery Energy Management in a Microgrid Using Batch Reinforcement Learning," Energies, MDPI, vol. 10(11), pages 1-19, November.
    17. Pavić, Ivan & Capuder, Tomislav & Kuzle, Igor, 2016. "Low carbon technologies as providers of operational flexibility in future power systems," Applied Energy, Elsevier, vol. 168(C), pages 724-738.
    18. Hyukjoon Lee & Dongjin Ji & Dong-Ho Cho, 2019. "Optimal Design of Wireless Charging Electric Bus System Based on Reinforcement Learning," Energies, MDPI, vol. 12(7), pages 1-20, March.
    19. Ning Wang & Weisheng Xu & Weihui Shao & Zhiyu Xu, 2019. "A Q-Cube Framework of Reinforcement Learning Algorithm for Continuous Double Auction among Microgrids," Energies, MDPI, vol. 12(15), pages 1-26, July.
    20. Christian Giovanelli & Seppo Sierla & Ryutaro Ichise & Valeriy Vyatkin, 2018. "Exploiting Artificial Neural Networks for the Prediction of Ancillary Energy Market Prices," Energies, MDPI, vol. 11(7), pages 1-22, July.
    21. Xu, Bin & Shi, Junzhe & Li, Sixu & Li, Huayi & Wang, Zhe, 2021. "Energy consumption and battery aging minimization using a Q-learning strategy for a battery/ultracapacitor electric vehicle," Energy, Elsevier, vol. 229(C).
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Harri Aaltonen & Seppo Sierla & Ville Kyrki & Mahdi Pourakbari-Kasmaei & Valeriy Vyatkin, 2022. "Bidding a Battery on Electricity Markets and Minimizing Battery Aging Costs: A Reinforcement Learning Approach," Energies, MDPI, vol. 15(14), pages 1-19, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Harri Aaltonen & Seppo Sierla & Ville Kyrki & Mahdi Pourakbari-Kasmaei & Valeriy Vyatkin, 2022. "Bidding a Battery on Electricity Markets and Minimizing Battery Aging Costs: A Reinforcement Learning Approach," Energies, MDPI, vol. 15(14), pages 1-19, July.
    2. Niko Karhula & Seppo Sierla & Valeriy Vyatkin, 2021. "Validating the Real-Time Performance of Distributed Energy Resources Participating on Primary Frequency Reserves," Energies, MDPI, vol. 14(21), pages 1-19, October.
    3. Ziqian Zhang & Carina Lehmal & Philipp Hackl & Robert Schuerhuber, 2022. "Transient Stability Analysis and Post-Fault Restart Strategy for Current-Limited Grid-Forming Converter," Energies, MDPI, vol. 15(10), pages 1-26, May.
    4. Zhu, Ziqing & Hu, Ze & Chan, Ka Wing & Bu, Siqi & Zhou, Bin & Xia, Shiwei, 2023. "Reinforcement learning in deregulated energy market: A comprehensive review," Applied Energy, Elsevier, vol. 329(C).
    5. Alexander N. Kozlov & Nikita V. Tomin & Denis N. Sidorov & Electo E. S. Lora & Victor G. Kurbatsky, 2020. "Optimal Operation Control of PV-Biomass Gasifier-Diesel-Hybrid Systems Using Reinforcement Learning Techniques," Energies, MDPI, vol. 13(10), pages 1-20, May.
    6. Bio Gassi, Karim & Baysal, Mustafa, 2023. "Improving real-time energy decision-making model with an actor-critic agent in modern microgrids with energy storage devices," Energy, Elsevier, vol. 263(PE).
    7. Hu, Chenxi & Zhang, Jun & Yuan, Hongxia & Gao, Tianlu & Jiang, Huaiguang & Yan, Jing & Wenzhong Gao, David & Wang, Fei-Yue, 2022. "Black swan event small-sample transfer learning (BEST-L) and its case study on electrical power prediction in COVID-19," Applied Energy, Elsevier, vol. 309(C).
    8. Bomela, Walter & Zlotnik, Anatoly & Li, Jr-Shin, 2018. "A phase model approach for thermostatically controlled load demand response," Applied Energy, Elsevier, vol. 228(C), pages 667-680.
    9. Grace Muriithi & Sunetra Chowdhury, 2021. "Optimal Energy Management of a Grid-Tied Solar PV-Battery Microgrid: A Reinforcement Learning Approach," Energies, MDPI, vol. 14(9), pages 1-24, May.
    10. Khawaja Haider Ali & Mohammad Abusara & Asif Ali Tahir & Saptarshi Das, 2023. "Dual-Layer Q-Learning Strategy for Energy Management of Battery Storage in Grid-Connected Microgrids," Energies, MDPI, vol. 16(3), pages 1-17, January.
    11. Lilia Tightiz & Joon Yoo, 2022. "A Review on a Data-Driven Microgrid Management System Integrating an Active Distribution Network: Challenges, Issues, and New Trends," Energies, MDPI, vol. 15(22), pages 1-24, November.
    12. Ying Ji & Jianhui Wang & Jiacan Xu & Xiaoke Fang & Huaguang Zhang, 2019. "Real-Time Energy Management of a Microgrid Using Deep Reinforcement Learning," Energies, MDPI, vol. 12(12), pages 1-21, June.
    13. Ritu Kandari & Neeraj Neeraj & Alexander Micallef, 2022. "Review on Recent Strategies for Integrating Energy Storage Systems in Microgrids," Energies, MDPI, vol. 16(1), pages 1-24, December.
    14. Stelios C. Dimoulias & Eleftherios O. Kontis & Grigoris K. Papagiannis, 2022. "Inertia Estimation of Synchronous Devices: Review of Available Techniques and Comparative Assessment of Conventional Measurement-Based Approaches," Energies, MDPI, vol. 15(20), pages 1-30, October.
    15. Ghafoori, Mahdi & Abdallah, Moatassem & Kim, Serena, 2023. "Electricity peak shaving for commercial buildings using machine learning and vehicle to building (V2B) system," Applied Energy, Elsevier, vol. 340(C).
    16. Khawaja Haider Ali & Marvin Sigalo & Saptarshi Das & Enrico Anderlini & Asif Ali Tahir & Mohammad Abusara, 2021. "Reinforcement Learning for Energy-Storage Systems in Grid-Connected Microgrids: An Investigation of Online vs. Offline Implementation," Energies, MDPI, vol. 14(18), pages 1-18, September.
    17. Van-Hai Bui & Akhtar Hussain & Hak-Man Kim, 2019. "Q-Learning-Based Operation Strategy for Community Battery Energy Storage System (CBESS) in Microgrid System," Energies, MDPI, vol. 12(9), pages 1-17, May.
    18. Alqahtani, Mohammed & Hu, Mengqi, 2022. "Dynamic energy scheduling and routing of multiple electric vehicles using deep reinforcement learning," Energy, Elsevier, vol. 244(PA).
    19. Jing Qian & Yakun Guo & Yidong Zou & Shige Yu, 2021. "Hamiltonian Modeling and Structure Modified Control of Diesel Engine," Energies, MDPI, vol. 14(7), pages 1-13, April.
    20. Jenkins, J.D. & Zhou, Z. & Ponciroli, R. & Vilim, R.B. & Ganda, F. & de Sisternes, F. & Botterud, A., 2018. "The benefits of nuclear flexibility in power system operations with renewable energy," Applied Energy, Elsevier, vol. 222(C), pages 872-884.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jeners:v:14:y:2021:i:17:p:5587-:d:630250. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.