
Novel data-driven energy management of a hybrid photovoltaic-reverse osmosis desalination system using deep reinforcement learning

Author

Listed:
  • Soleimanzade, Mohammad Amin
  • Kumar, Amit
  • Sadrzadeh, Mohtada

Abstract

This paper proposes a novel deep reinforcement learning-accelerated energy management system for a hybrid grid-connected photovoltaic-reverse osmosis-pressure retarded osmosis desalination plant. The energy management problem is formulated as a partially observable Markov decision process that uses historical photovoltaic (PV) power data to cope with the uncertainty of solar power generation and to provide more information about the true state of the system. The soft actor-critic (SAC) algorithm is employed as the core of the energy management system to maximize water production rate and contaminant removal efficiency while minimizing the power drawn from the external grid. We introduce 1-dimensional convolutional neural networks (1-D CNNs) into the actor, critic, and value function networks of the SAC algorithm to address the partial observability inherent in PV-powered energy systems, extract essential features from the PV power time series, and ultimately achieve substantially improved performance. Furthermore, the proposed CNN-SAC algorithm is assumed to have no access to the current output power of the PV system; this assumption is required for more practical energy management systems, and we demonstrate that the proposed method can predict the current PV power from historical data. The superiority of the CNN-SAC model is verified by comparing its learning performance and simulation results with those of four state-of-the-art deep reinforcement learning algorithms: deep deterministic policy gradient (DDPG), proximal policy optimization (PPO), twin delayed DDPG (TD3), and vanilla SAC. The results show that the CNN-SAC model outperforms the benchmark methods in effective solar energy exploitation and power scheduling, demonstrating the value of exploiting historical PV power data and 1-D CNNs. Moreover, the CNN-SAC algorithm is benchmarked against a powerful energy management system developed in our previous investigation across three scenarios, and considerable improvements in energy efficiency are obtained without using any solar power generation forecasting algorithm. Ablation studies confirm the critical contribution of the introduced 1-D CNN and highlight the importance of providing historical PV power data for substantial performance enhancement. The mean and standard deviation of evaluation scores during the final stages of training show that the 1-D CNN significantly improves the final performance and stability of the SAC algorithm. These results demonstrate that the modifications detailed in this investigation make deep reinforcement learning algorithms highly effective for the energy management of PV-powered microgrids, including PV-driven reverse osmosis desalination plants.
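To make the architecture described in the abstract concrete, the following is a minimal PyTorch sketch of a SAC-style actor whose observation includes a window of historical PV power readings encoded by a 1-D CNN, in the spirit of the CNN-SAC design. The layer sizes, window length, observation dimensions, and variable names are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch (assumed configuration): a 1-D CNN encodes historical PV power,
# and its features are concatenated with the remaining plant observations before
# the standard SAC squashed-Gaussian policy head.
import torch
import torch.nn as nn

class CNNSACActor(nn.Module):
    def __init__(self, pv_window: int = 24, plant_obs_dim: int = 6, action_dim: int = 3):
        super().__init__()
        # 1-D convolutions over the historical PV power time series
        self.pv_encoder = nn.Sequential(
            nn.Conv1d(in_channels=1, out_channels=16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis to one feature vector
            nn.Flatten(),
        )
        # Fully connected trunk producing mean and log-std of a Gaussian policy
        self.trunk = nn.Sequential(
            nn.Linear(32 + plant_obs_dim, 128),
            nn.ReLU(),
        )
        self.mean_head = nn.Linear(128, action_dim)
        self.log_std_head = nn.Linear(128, action_dim)

    def forward(self, pv_history: torch.Tensor, plant_obs: torch.Tensor):
        # pv_history: (batch, pv_window); plant_obs: (batch, plant_obs_dim)
        z = self.pv_encoder(pv_history.unsqueeze(1))          # (batch, 32)
        h = self.trunk(torch.cat([z, plant_obs], dim=-1))
        mean, log_std = self.mean_head(h), self.log_std_head(h)
        return mean, log_std.clamp(-20, 2)  # usual SAC log-std bounds

# Usage example with placeholder data: sample one tanh-squashed action
actor = CNNSACActor()
pv_hist = torch.randn(1, 24)   # last 24 PV power readings (placeholder)
plant = torch.randn(1, 6)      # remaining plant states (placeholder)
mean, log_std = actor(pv_hist, plant)
action = torch.tanh(mean + log_std.exp() * torch.randn_like(mean))
```

The same 1-D CNN encoder would, under this reading of the paper, also be prepended to the critic and value networks so that all three components share the benefit of features extracted from the PV power history.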

Suggested Citation

  • Soleimanzade, Mohammad Amin & Kumar, Amit & Sadrzadeh, Mohtada, 2022. "Novel data-driven energy management of a hybrid photovoltaic-reverse osmosis desalination system using deep reinforcement learning," Applied Energy, Elsevier, vol. 317(C).
  • Handle: RePEc:eee:appene:v:317:y:2022:i:c:s0306261922005566
    DOI: 10.1016/j.apenergy.2022.119184

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261922005566
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2022.119184?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Kim, Jong Suk & Chen, Jun & Garcia, Humberto E., 2016. "Modeling, control, and dynamic performance analysis of a reverse osmosis desalination plant integrated within hybrid energy systems," Energy, Elsevier, vol. 112(C), pages 52-66.
    2. Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
    3. Pinto, Giuseppe & Piscitelli, Marco Savino & Vázquez-Canteli, José Ramón & Nagy, Zoltán & Capozzoli, Alfonso, 2021. "Coordinated energy management for a cluster of buildings through deep reinforcement learning," Energy, Elsevier, vol. 229(C).
    4. Hua, Haochen & Qin, Yuchao & Hao, Chuantong & Cao, Junwei, 2019. "Optimal energy management strategies for energy Internet via deep reinforcement learning approach," Applied Energy, Elsevier, vol. 239(C), pages 598-609.
    5. Altaee, Ali & Millar, Graeme J. & Zaragoza, Guillermo, 2016. "Integration and optimization of pressure retarded osmosis with reverse osmosis for power generation and high efficiency desalination," Energy, Elsevier, vol. 103(C), pages 110-118.
    6. Elsied, Moataz & Oukaour, Amrane & Youssef, Tarek & Gualous, Hamid & Mohammed, Osama, 2016. "An advanced real time energy management system for microgrids," Energy, Elsevier, vol. 114(C), pages 742-752.
    7. Kim, Jungbin & Park, Kiho & Yang, Dae Ryook & Hong, Seungkwan, 2019. "A comprehensive review of energy consumption of seawater reverse osmosis desalination plants," Applied Energy, Elsevier, vol. 254(C).
    8. Zang, Haixiang & Liu, Ling & Sun, Li & Cheng, Lilin & Wei, Zhinong & Sun, Guoqiang, 2020. "Short-term global horizontal irradiance forecasting based on a hybrid CNN-LSTM model with spatiotemporal correlations," Renewable Energy, Elsevier, vol. 160(C), pages 26-41.
    9. Kofinas, P. & Dounis, A.I. & Vouros, G.A., 2018. "Fuzzy Q-Learning for multi-agent decentralized energy management in microgrids," Applied Energy, Elsevier, vol. 219(C), pages 53-67.
    10. Das, Utpal Kumar & Tey, Kok Soon & Seyedmahmoudian, Mehdi & Mekhilef, Saad & Idris, Moh Yamani Idna & Van Deventer, Willem & Horan, Ben & Stojcevski, Alex, 2018. "Forecasting of photovoltaic power generation and model optimization: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 81(P1), pages 912-928.
    11. Ying Ji & Jianhui Wang & Jiacan Xu & Xiaoke Fang & Huaguang Zhang, 2019. "Real-Time Energy Management of a Microgrid Using Deep Reinforcement Learning," Energies, MDPI, vol. 12(12), pages 1-21, June.
    12. Harrold, Daniel J.B. & Cao, Jun & Fan, Zhong, 2022. "Data-driven battery operation for energy arbitrage using rainbow deep reinforcement learning," Energy, Elsevier, vol. 238(PC).
    13. Yujian Ye & Dawei Qiu & Huiyu Wang & Yi Tang & Goran Strbac, 2021. "Real-Time Autonomous Residential Demand Response Management Based on Twin Delayed Deep Deterministic Policy Gradient Learning," Energies, MDPI, vol. 14(3), pages 1-22, January.
    14. Guo, Chenyu & Wang, Xin & Zheng, Yihui & Zhang, Feng, 2022. "Real-time optimal energy management of microgrid with uncertainties based on deep reinforcement learning," Energy, Elsevier, vol. 238(PC).
    15. Soleimanzade, Mohammad Amin & Sadrzadeh, Mohtada, 2021. "Deep learning-based energy management of a hybrid photovoltaic-reverse osmosis-pressure retarded osmosis system," Applied Energy, Elsevier, vol. 293(C).
    16. Zia, Muhammad Fahad & Elbouchikhi, Elhoussin & Benbouzid, Mohamed, 2018. "Microgrids energy management systems: A critical review on methods, solutions, and prospects," Applied Energy, Elsevier, vol. 222(C), pages 1033-1055.
    17. Jeong, Jaeik & Kim, Hongseok, 2021. "DeepComp: Deep reinforcement learning based renewable energy error compensable forecasting," Applied Energy, Elsevier, vol. 294(C).
    18. He, Wei & Wang, Yang & Shaheed, Mohammad Hasan, 2015. "Stand-alone seawater RO (reverse osmosis) desalination powered by PV (photovoltaic) and PRO (pressure retarded osmosis)," Energy, Elsevier, vol. 86(C), pages 423-435.
    19. Prante, Jeri L. & Ruskowitz, Jeffrey A. & Childress, Amy E. & Achilli, Andrea, 2014. "RO-PRO desalination: An integrated low-energy approach to seawater desalination," Applied Energy, Elsevier, vol. 120(C), pages 104-114.
    20. Roslan, M.F. & Hannan, M.A. & Ker, Pin Jern & Uddin, M.N., 2019. "Microgrid control methods toward achieving sustainable energy management," Applied Energy, Elsevier, vol. 240(C), pages 583-607.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Jiankai Gao & Yang Li & Bin Wang & Haibo Wu, 2023. "Multi-Microgrid Collaborative Optimization Scheduling Using an Improved Multi-Agent Soft Actor-Critic Algorithm," Energies, MDPI, vol. 16(7), pages 1-21, April.
    2. Elsir, Mohamed & Al-Sumaiti, Ameena Saad & El Moursi, Mohamed Shawky & Al-Awami, Ali Taleb, 2023. "Coordinating the day-ahead operation scheduling for demand response and water desalination plants in smart grid," Applied Energy, Elsevier, vol. 335(C).
    3. Xu, Jiacheng & Liang, Yingzong & Luo, Xianglong & Chen, Jianyong & Yang, Zhi & Chen, Ying, 2023. "Towards cost-effective osmotic power harnessing: Mass exchanger network synthesis for multi-stream pressure-retarded osmosis systems," Applied Energy, Elsevier, vol. 330(PA).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zhu, Ziqing & Hu, Ze & Chan, Ka Wing & Bu, Siqi & Zhou, Bin & Xia, Shiwei, 2023. "Reinforcement learning in deregulated energy market: A comprehensive review," Applied Energy, Elsevier, vol. 329(C).
    2. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    3. Bio Gassi, Karim & Baysal, Mustafa, 2023. "Improving real-time energy decision-making model with an actor-critic agent in modern microgrids with energy storage devices," Energy, Elsevier, vol. 263(PE).
    4. Dimitrios Vamvakas & Panagiotis Michailidis & Christos Korkas & Elias Kosmatopoulos, 2023. "Review and Evaluation of Reinforcement Learning Frameworks on Smart Grid Applications," Energies, MDPI, vol. 16(14), pages 1-38, July.
    5. Zhou, Yanting & Ma, Zhongjing & Zhang, Jinhui & Zou, Suli, 2022. "Data-driven stochastic energy management of multi energy system using deep reinforcement learning," Energy, Elsevier, vol. 261(PA).
    6. Wang, Yi & Qiu, Dawei & Sun, Mingyang & Strbac, Goran & Gao, Zhiwei, 2023. "Secure energy management of multi-energy microgrid: A physical-informed safe reinforcement learning approach," Applied Energy, Elsevier, vol. 335(C).
    7. Harrold, Daniel J.B. & Cao, Jun & Fan, Zhong, 2022. "Data-driven battery operation for energy arbitrage using rainbow deep reinforcement learning," Energy, Elsevier, vol. 238(PC).
    8. Lilia Tightiz & Joon Yoo, 2022. "A Review on a Data-Driven Microgrid Management System Integrating an Active Distribution Network: Challenges, Issues, and New Trends," Energies, MDPI, vol. 15(22), pages 1-24, November.
    9. Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
    10. Touati, Khaled & Tadeo, Fernando & Elfil, Hamza, 2017. "Osmotic energy recovery from Reverse Osmosis using two-stage Pressure Retarded Osmosis," Energy, Elsevier, vol. 132(C), pages 213-224.
    11. Harrold, Daniel J.B. & Cao, Jun & Fan, Zhong, 2022. "Renewable energy integration and microgrid energy trading using multi-agent deep reinforcement learning," Applied Energy, Elsevier, vol. 318(C).
    12. Soleimanzade, Mohammad Amin & Sadrzadeh, Mohtada, 2021. "Deep learning-based energy management of a hybrid photovoltaic-reverse osmosis-pressure retarded osmosis system," Applied Energy, Elsevier, vol. 293(C).
    13. Ana Cabrera-Tobar & Alessandro Massi Pavan & Giovanni Petrone & Giovanni Spagnuolo, 2022. "A Review of the Optimization and Control Techniques in the Presence of Uncertainties for the Energy Management of Microgrids," Energies, MDPI, vol. 15(23), pages 1-38, December.
    14. Àlex Alonso-Travesset & Helena Martín & Sergio Coronas & Jordi de la Hoz, 2022. "Optimization Models under Uncertainty in Distributed Generation Systems: A Review," Energies, MDPI, vol. 15(5), pages 1-40, March.
    15. Giorgia Tomassi & Pietro Romano & Gabriele Di Giacomo, 2021. "Modern Use of Water Produced by Purification of Municipal Wastewater: A Case Study," Energies, MDPI, vol. 14(22), pages 1-13, November.
    16. Gui, Yonghao & Wei, Baoze & Li, Mingshen & Guerrero, Josep M. & Vasquez, Juan C., 2018. "Passivity-based coordinated control for islanded AC microgrid," Applied Energy, Elsevier, vol. 229(C), pages 551-561.
    17. Esmaeil Ahmadi & Benjamin McLellan & Behnam Mohammadi-Ivatloo & Tetsuo Tezuka, 2020. "The Role of Renewable Energy Resources in Sustainability of Water Desalination as a Potential Fresh-Water Source: An Updated Review," Sustainability, MDPI, vol. 12(13), pages 1-31, June.
    18. Younes Zahraoui & Ibrahim Alhamrouni & Saad Mekhilef & M. Reyasudin Basir Khan & Mehdi Seyedmahmoudian & Alex Stojcevski & Ben Horan, 2021. "Energy Management System in Microgrids: A Comprehensive Review," Sustainability, MDPI, vol. 13(19), pages 1-33, September.
    19. Moosazadeh, Mohammad & Tariq, Shahzeb & Safder, Usman & Yoo, ChangKyoo, 2023. "Techno-economic feasibility and environmental impact evaluation of a hybrid solar thermal membrane-based power desalination system," Energy, Elsevier, vol. 278(PA).
    20. Pinto, Giuseppe & Deltetto, Davide & Capozzoli, Alfonso, 2021. "Data-driven district energy management with surrogate models and deep reinforcement learning," Applied Energy, Elsevier, vol. 304(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:317:y:2022:i:c:s0306261922005566. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.