
Modified deep learning and reinforcement learning for an incentive-based demand response model

Author

Listed:
  • Wen, Lulu
  • Zhou, Kaile
  • Li, Jun
  • Wang, Shanyong

Abstract

An incentive-based demand response (DR) program can induce end users (EUs) to reduce electricity demand during peak periods through rewards. In this study, an incentive-based DR program using modified deep learning and reinforcement learning is proposed. A modified deep learning model based on a recurrent neural network (MDL-RNN) was first developed to capture future environmental uncertainties by forecasting the day-ahead wholesale electricity price, photovoltaic (PV) power output, and power load. Reinforcement learning (RL) was then used to find the hourly incentive rates that maximize the profits of both energy service providers (ESPs) and EUs. The results show that the proposed modified deep learning model achieves more accurate forecasts than several benchmark methods, which supports the development of incentive-based DR programs under an uncertain environment. Meanwhile, the optimized incentive rates increase the total profits of ESPs and EUs while reducing peak electricity demand. A short-term DR program was developed for the peak-demand period, and the experimental results show that peak electricity demand can be reduced by 17%. This contributes to mitigating supply-demand imbalance and enhancing power system security.
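
To make the two-stage approach concrete, the sketch below illustrates the second stage under stated assumptions: a tabular Q-learning agent that picks a discretized incentive rate for each hour so as to maximize the combined profit of the ESP and the EUs. This is not the authors' implementation; the random forecast arrays merely stand in for the MDL-RNN outputs, and the rate grid, linear demand-reduction response, quadratic discomfort term, and all parameter values are illustrative assumptions.

    # Minimal Python sketch (illustrative only): tabular Q-learning over
    # discretized hourly incentive rates. The demand-response model and all
    # parameters below are assumptions, not the paper's formulation.
    import numpy as np

    HOURS = 24
    RATES = np.linspace(0.0, 0.10, 11)              # candidate incentive rates ($/kWh), assumed grid

    # Assumed day-ahead forecasts, standing in for the MDL-RNN outputs
    price = np.random.uniform(0.05, 0.30, HOURS)    # wholesale price forecast ($/kWh)
    pv    = np.random.uniform(0.0, 50.0, HOURS)     # PV output forecast (kW)
    load  = np.random.uniform(80.0, 200.0, HOURS)   # baseline load forecast (kW)
    retail = 0.15                                   # assumed flat retail tariff ($/kWh)
    elasticity = 2.0                                # assumed demand-reduction sensitivity

    def step_reward(h, rate):
        """Combined ESP + EU profit for hour h under one incentive rate (toy model)."""
        reduction = min(load[h], elasticity * rate * load[h])      # curtailment induced by the incentive
        net_load = max(load[h] - reduction - pv[h], 0.0)           # energy the ESP must buy wholesale
        esp_profit = retail * (load[h] - reduction) - price[h] * net_load - rate * reduction
        discomfort = 0.01 * reduction ** 2 / max(load[h], 1.0)     # assumed quadratic discomfort cost
        eu_profit = rate * reduction - discomfort
        return esp_profit + eu_profit

    # Tabular Q-learning: state = hour of day, action = index into the rate grid
    Q = np.zeros((HOURS, len(RATES)))
    alpha, gamma, eps = 0.1, 0.95, 0.1
    for episode in range(2000):
        for h in range(HOURS):
            a = np.random.randint(len(RATES)) if np.random.rand() < eps else int(Q[h].argmax())
            r = step_reward(h, RATES[a])
            nxt = Q[h + 1].max() if h + 1 < HOURS else 0.0
            Q[h, a] += alpha * (r + gamma * nxt - Q[h, a])

    optimal_rates = RATES[Q.argmax(axis=1)]
    print("Hourly incentive rates chosen by the sketch:", np.round(optimal_rates, 3))

Because hourly rewards in this toy environment do not depend on earlier actions, the loop effectively solves 24 small bandit problems; the paper's formulation couples hours through load and market dynamics, which a richer state representation would capture.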

Suggested Citation

  • Wen, Lulu & Zhou, Kaile & Li, Jun & Wang, Shanyong, 2020. "Modified deep learning and reinforcement learning for an incentive-based demand response model," Energy, Elsevier, vol. 205(C).
  • Handle: RePEc:eee:energy:v:205:y:2020:i:c:s0360544220311269
    DOI: 10.1016/j.energy.2020.118019

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0360544220311269
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.energy.2020.118019?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Yu, Mengmeng & Hong, Seung Ho, 2017. "Incentive-based demand response considering hierarchical electricity market: A Stackelberg game approach," Applied Energy, Elsevier, vol. 203(C), pages 267-279.
    2. Lian, Renzong & Peng, Jiankun & Wu, Yuankai & Tan, Huachun & Zhang, Hailong, 2020. "Rule-interposing deep reinforcement learning based energy management strategy for power-split hybrid electric vehicle," Energy, Elsevier, vol. 197(C).
    3. Monfared, Houman Jamshidi & Ghasemi, Ahmad & Loni, Abdolah & Marzband, Mousa, 2019. "A hybrid price-based demand response program for the residential micro-grid," Energy, Elsevier, vol. 185(C), pages 274-285.
    4. Lu, Renzhi & Hong, Seung Ho, 2019. "Incentive-based demand response for smart grid with reinforcement learning and deep neural network," Applied Energy, Elsevier, vol. 236(C), pages 937-949.
    5. Feng, Zhong-kai & Niu, Wen-jing & Cheng, Chun-tian & Zhou, Jian-zhong, 2017. "Peak shaving operation of hydro-thermal-nuclear plants serving multiple power grids by linear programming," Energy, Elsevier, vol. 135(C), pages 210-219.
    6. Du, Guodong & Zou, Yuan & Zhang, Xudong & Liu, Teng & Wu, Jinlong & He, Dingbo, 2020. "Deep reinforcement learning based energy management for a hybrid electric vehicle," Energy, Elsevier, vol. 201(C).
    7. Khalili, Tohid & Jafari, Amirreza & Abapour, Mehdi & Mohammadi-Ivatloo, Behnam, 2019. "Optimal battery technology selection and incentive-based demand response program utilization for reliability improvement of an insular microgrid," Energy, Elsevier, vol. 169(C), pages 92-104.
    8. Shahryari, E. & Shayeghi, H. & Mohammadi-ivatloo, B. & Moradzadeh, M., 2018. "An improved incentive-based demand response program in day-ahead and intra-day electricity markets," Energy, Elsevier, vol. 155(C), pages 205-214.
    9. Wang, Yi & Gan, Dahua & Sun, Mingyang & Zhang, Ning & Lu, Zongxiang & Kang, Chongqing, 2019. "Probabilistic individual load forecasting using pinball loss guided LSTM," Applied Energy, Elsevier, vol. 235(C), pages 10-20.
    10. Lu, Renzhi & Hong, Seung Ho & Zhang, Xiongfeng, 2018. "A Dynamic pricing demand response algorithm for smart grid: Reinforcement learning approach," Applied Energy, Elsevier, vol. 220(C), pages 220-230.
    11. Srinivasan, Dipti & Rajgarhia, Sanjana & Radhakrishnan, Bharat Menon & Sharma, Anurag & Khincha, H.P., 2017. "Game-Theory based dynamic pricing strategies for demand side management in smart grids," Energy, Elsevier, vol. 126(C), pages 132-143.
    12. Yu, Mengmeng & Hong, Seung Ho, 2016. "Supply–demand balancing for power management in smart grid: A Stackelberg game approach," Applied Energy, Elsevier, vol. 164(C), pages 702-710.
    13. Rahman, Aowabin & Srikumar, Vivek & Smith, Amanda D., 2018. "Predicting electricity consumption for commercial and residential buildings using deep recurrent neural networks," Applied Energy, Elsevier, vol. 212(C), pages 372-385.
    14. Guo, Zhifeng & Zhou, Kaile & Zhang, Xiaoling & Yang, Shanlin, 2018. "A deep learning model for short-term power load and probability density forecasting," Energy, Elsevier, vol. 160(C), pages 1186-1200.
    15. Haider, Haider Tarish & See, Ong Hang & Elmenreich, Wilfried, 2016. "A review of residential demand response of smart grid," Renewable and Sustainable Energy Reviews, Elsevier, vol. 59(C), pages 166-178.
    16. Vázquez-Canteli, José R. & Nagy, Zoltán, 2019. "Reinforcement learning for demand response: A review of algorithms and modeling techniques," Applied Energy, Elsevier, vol. 235(C), pages 1072-1089.
    17. Yu, Mengmeng & Lu, Renzhi & Hong, Seung Ho, 2016. "A real-time decision model for industrial load management in a smart grid," Applied Energy, Elsevier, vol. 183(C), pages 1488-1497.
    18. Fotouhi Ghazvini, Mohammad Ali & Faria, Pedro & Ramos, Sergio & Morais, Hugo & Vale, Zita, 2015. "Incentive-based demand response programs designed by asset-light retail electricity providers for the day-ahead market," Energy, Elsevier, vol. 82(C), pages 786-799.
    19. Yan, Xing & Ozturk, Yusuf & Hu, Zechun & Song, Yonghua, 2018. "A review on price-driven residential demand response," Renewable and Sustainable Energy Reviews, Elsevier, vol. 96(C), pages 411-419.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Antonopoulos, Ioannis & Robu, Valentin & Couraud, Benoit & Kirli, Desen & Norbu, Sonam & Kiprakis, Aristides & Flynn, David & Elizondo-Gonzalez, Sergio & Wattam, Steve, 2020. "Artificial intelligence and machine learning approaches to energy demand-side response: A systematic review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 130(C).
    2. Xu, Bo & Wang, Jiexin & Guo, Mengyuan & Lu, Jiayu & Li, Gehui & Han, Liang, 2021. "A hybrid demand response mechanism based on real-time incentive and real-time pricing," Energy, Elsevier, vol. 231(C).
    3. Zhou, Kaile & Peng, Ning & Yin, Hui & Hu, Rong, 2023. "Urban virtual power plant operation optimization with incentive-based demand response," Energy, Elsevier, vol. 282(C).
    4. Zhang, Xiongfeng & Lu, Renzhi & Jiang, Junhui & Hong, Seung Ho & Song, Won Seok, 2021. "Testbed implementation of reinforcement learning-based demand response energy management system," Applied Energy, Elsevier, vol. 297(C).
    5. Lu, Renzhi & Hong, Seung Ho, 2019. "Incentive-based demand response for smart grid with reinforcement learning and deep neural network," Applied Energy, Elsevier, vol. 236(C), pages 937-949.
    6. Ibrahim, Muhammad Sohail & Dong, Wei & Yang, Qiang, 2020. "Machine learning driven smart electric power systems: Current trends and new perspectives," Applied Energy, Elsevier, vol. 272(C).
    7. Xu, Fangyuan & Zhu, Weidong & Wang, Yi Fei & Lai, Chun Sing & Yuan, Haoliang & Zhao, Yujia & Guo, Siming & Fu, Zhengxin, 2022. "A new deregulated demand response scheme for load over-shifting city in regulated power market," Applied Energy, Elsevier, vol. 311(C).
    8. Kong, Xiangyu & Kong, Deqian & Yao, Jingtao & Bai, Linquan & Xiao, Jie, 2020. "Online pricing of demand response based on long short-term memory and reinforcement learning," Applied Energy, Elsevier, vol. 271(C).
    9. Zeng, Huibin & Shao, Bilin & Dai, Hongbin & Tian, Ning & Zhao, Wei, 2023. "Incentive-based demand response strategies for natural gas considering carbon emissions and load volatility," Applied Energy, Elsevier, vol. 348(C).
    10. Eduardo J. Salazar & Mauro Jurado & Mauricio E. Samper, 2023. "Reinforcement Learning-Based Pricing and Incentive Strategy for Demand Response in Smart Grids," Energies, MDPI, vol. 16(3), pages 1-33, February.
    11. Tsaousoglou, Georgios & Giraldo, Juan S. & Paterakis, Nikolaos G., 2022. "Market Mechanisms for Local Electricity Markets: A review of models, solution concepts and algorithmic techniques," Renewable and Sustainable Energy Reviews, Elsevier, vol. 156(C).
    12. Lu, Renzhi & Li, Yi-Chang & Li, Yuting & Jiang, Junhui & Ding, Yuemin, 2020. "Multi-agent deep reinforcement learning based demand response for discrete manufacturing systems energy management," Applied Energy, Elsevier, vol. 276(C).
    13. Barja-Martinez, Sara & Aragüés-Peñalba, Mònica & Munné-Collado, Íngrid & Lloret-Gallego, Pau & Bullich-Massagué, Eduard & Villafafila-Robles, Roberto, 2021. "Artificial intelligence techniques for enabling Big Data services in distribution networks: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 150(C).
    14. Gao, Jianwei & Ma, Zeyang & Guo, Fengjia, 2019. "The influence of demand response on wind-integrated power system considering participation of the demand side," Energy, Elsevier, vol. 178(C), pages 723-738.
    15. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    16. Correa-Jullian, Camila & López Droguett, Enrique & Cardemil, José Miguel, 2020. "Operation scheduling in a solar thermal system: A reinforcement learning-based framework," Applied Energy, Elsevier, vol. 268(C).
    17. Lin, Jin & Dong, Jun & Liu, Dongran & Zhang, Yaoyu & Ma, Tongtao, 2022. "From peak shedding to low-carbon transitions: Customer psychological factors in demand response," Energy, Elsevier, vol. 238(PA).
    18. Pinto, Giuseppe & Deltetto, Davide & Capozzoli, Alfonso, 2021. "Data-driven district energy management with surrogate models and deep reinforcement learning," Applied Energy, Elsevier, vol. 304(C).
    19. Lu, Renzhi & Bai, Ruichang & Ding, Yuemin & Wei, Min & Jiang, Junhui & Sun, Mingyang & Xiao, Feng & Zhang, Hai-Tao, 2021. "A hybrid deep learning-based online energy management scheme for industrial microgrid," Applied Energy, Elsevier, vol. 304(C).
    20. Qi, Chunyang & Zhu, Yiwen & Song, Chuanxue & Yan, Guangfu & Xiao, Feng & Wang, Da & Zhang, Xu & Cao, Jingwei & Song, Shixin, 2022. "Hierarchical reinforcement learning based energy management strategy for hybrid electric vehicle," Energy, Elsevier, vol. 238(PA).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:energy:v:205:y:2020:i:c:s0360544220311269. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/energy.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.