
Incentive-based demand response for smart grid with reinforcement learning and deep neural network

Author

Listed:
  • Lu, Renzhi
  • Hong, Seung Ho

Abstract

Balancing electricity generation and consumption is essential for the smooth operation of power grids. Any mismatch between energy supply and demand increases costs for both the service provider and customers and may even cripple the entire grid. This paper proposes a novel real-time incentive-based demand response algorithm for smart grid systems using reinforcement learning and a deep neural network, aiming to help the service provider purchase energy resources from its subscribed customers in order to balance energy fluctuations and enhance grid reliability. In particular, to cope with future uncertainties, a deep neural network is used to predict the unknown prices and energy demands. Reinforcement learning is then adopted to obtain the optimal incentive rates for the different customers, considering the profits of both the service provider and the customers. Simulation results show that the proposed incentive-based demand response algorithm induces demand-side participation, improves the profitability of both the service provider and the customers, and enhances system reliability by balancing energy resources, making it a win-win strategy for both parties.
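For readers who want a concrete picture of how the two components described in the abstract fit together, the sketch below pairs a placeholder forecaster with a tabular Q-learning loop that selects an hourly incentive rate. It is a minimal illustration under stated assumptions, not the authors' implementation: the forecast() stand-in, the discrete rate set, the flat retail tariff, the linear demand-reduction response, the quadratic discomfort term, and the 50/50 reward weighting are all hypothetical choices made for exposition, and the paper itself uses a deep neural network for forecasting rather than the toy function shown here.

```python
# Minimal sketch of the idea in the abstract: a tabular Q-learning agent chooses an
# hourly incentive rate offered to customers, while a placeholder forecaster supplies
# the next hour's wholesale price and baseline demand. All names, numbers, and the
# simple linear demand-reduction model below are illustrative assumptions.
import random

INCENTIVE_RATES = [0.0, 2.0, 4.0, 6.0, 8.0]   # $/MWh offered to customers (action set)
HOURS = 24

def forecast(hour):
    """Stand-in for the paper's deep-neural-network forecaster: returns an assumed
    wholesale price ($/MWh) and baseline demand (MWh) for the given hour."""
    price = 30 + 20 * (1 if 17 <= hour <= 21 else 0) + random.gauss(0, 2)
    demand = 50 + 10 * (1 if 8 <= hour <= 22 else 0) + random.gauss(0, 3)
    return price, demand

def step(hour, incentive):
    """Toy environment: customers curtail demand in proportion to the incentive;
    the reward mixes service-provider profit and customer benefit."""
    price, demand = forecast(hour)
    retail = 45.0                                  # assumed flat retail tariff ($/MWh)
    reduction = min(demand, 0.8 * incentive)       # assumed linear customer response (MWh)
    sp_profit = (retail - price) * (demand - reduction) + (price - incentive) * reduction
    cust_profit = incentive * reduction - 0.3 * reduction ** 2   # payment minus discomfort
    return 0.5 * sp_profit + 0.5 * cust_profit     # assumed 50/50 joint objective

def train(episodes=2000, alpha=0.1, gamma=0.95, eps=0.1):
    """Tabular Q-learning over (hour, incentive-rate) pairs with epsilon-greedy exploration."""
    q = {(h, a): 0.0 for h in range(HOURS) for a in range(len(INCENTIVE_RATES))}
    for _ in range(episodes):
        for h in range(HOURS):
            a = (random.randrange(len(INCENTIVE_RATES)) if random.random() < eps
                 else max(range(len(INCENTIVE_RATES)), key=lambda i: q[(h, i)]))
            r = step(h, INCENTIVE_RATES[a])
            nxt = max(q[((h + 1) % HOURS, i)] for i in range(len(INCENTIVE_RATES)))
            q[(h, a)] += alpha * (r + gamma * nxt - q[(h, a)])
    return q

if __name__ == "__main__":
    q = train()
    policy = [INCENTIVE_RATES[max(range(len(INCENTIVE_RATES)), key=lambda i: q[(h, i)])]
              for h in range(HOURS)]
    print("Learned incentive rate per hour ($/MWh):", policy)
```

Running the script prints one learned incentive rate for each hour of the day under these toy assumptions; the actual algorithm in the paper operates on predicted prices and demands and on the profit models defined there.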

Suggested Citation

  • Lu, Renzhi & Hong, Seung Ho, 2019. "Incentive-based demand response for smart grid with reinforcement learning and deep neural network," Applied Energy, Elsevier, vol. 236(C), pages 937-949.
  • Handle: RePEc:eee:appene:v:236:y:2019:i:c:p:937-949
    DOI: 10.1016/j.apenergy.2018.12.061

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261918318798
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2018.12.061?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lu, Renzhi & Hong, Seung Ho & Zhang, Xiongfeng, 2018. "A Dynamic pricing demand response algorithm for smart grid: Reinforcement learning approach," Applied Energy, Elsevier, vol. 220(C), pages 220-230.
    2. Zhou, Kaile & Peng, Ning & Yin, Hui & Hu, Rong, 2023. "Urban virtual power plant operation optimization with incentive-based demand response," Energy, Elsevier, vol. 282(C).
    3. Wen, Lulu & Zhou, Kaile & Li, Jun & Wang, Shanyong, 2020. "Modified deep learning and reinforcement learning for an incentive-based demand response model," Energy, Elsevier, vol. 205(C).
    4. Antonopoulos, Ioannis & Robu, Valentin & Couraud, Benoit & Kirli, Desen & Norbu, Sonam & Kiprakis, Aristides & Flynn, David & Elizondo-Gonzalez, Sergio & Wattam, Steve, 2020. "Artificial intelligence and machine learning approaches to energy demand-side response: A systematic review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 130(C).
    5. Díaz, Guzmán & Coto, José & Gómez-Aleixandre, Javier, 2019. "Prediction and explanation of the formation of the Spanish day-ahead electricity price through machine learning regression," Applied Energy, Elsevier, vol. 239(C), pages 610-625.
    6. Xu, Bo & Wang, Jiexin & Guo, Mengyuan & Lu, Jiayu & Li, Gehui & Han, Liang, 2021. "A hybrid demand response mechanism based on real-time incentive and real-time pricing," Energy, Elsevier, vol. 231(C).
    7. Ma, Siyu & Liu, Hui & Wang, Ni & Huang, Lidong & Goh, Hui Hwang, 2023. "Incentive-based demand response under incomplete information based on the deep deterministic policy gradient," Applied Energy, Elsevier, vol. 351(C).
    8. Konstantakopoulos, Ioannis C. & Barkan, Andrew R. & He, Shiying & Veeravalli, Tanya & Liu, Huihan & Spanos, Costas, 2019. "A deep learning and gamification approach to improving human-building interaction and energy efficiency in smart infrastructure," Applied Energy, Elsevier, vol. 237(C), pages 810-821.
    9. Fan, Songli & Ai, Qian & Piao, Longjian, 2018. "Bargaining-based cooperative energy trading for distribution company and demand response," Applied Energy, Elsevier, vol. 226(C), pages 469-482.
    10. Jasiński, Tomasz, 2020. "Use of new variables based on air temperature for forecasting day-ahead spot electricity prices using deep neural networks: A new approach," Energy, Elsevier, vol. 213(C).
    11. Lago, Jesus & De Ridder, Fjo & Vrancx, Peter & De Schutter, Bart, 2018. "Forecasting day-ahead electricity prices in Europe: The importance of considering market integration," Applied Energy, Elsevier, vol. 211(C), pages 890-903.
    12. Sun, Alexander Y., 2020. "Optimal carbon storage reservoir management through deep reinforcement learning," Applied Energy, Elsevier, vol. 278(C).
    13. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    14. Yuchun Li & Yinghua Han & Jinkuan Wang & Qiang Zhao, 2018. "A MBCRF Algorithm Based on Ensemble Learning for Building Demand Response Considering the Thermal Comfort," Energies, MDPI, vol. 11(12), pages 1-20, December.
    15. Wu, Yuankai & Tan, Huachun & Peng, Jiankun & Zhang, Hailong & He, Hongwen, 2019. "Deep reinforcement learning of energy management with continuous control strategy and traffic information for a series-parallel plug-in hybrid electric bus," Applied Energy, Elsevier, vol. 247(C), pages 454-466.
    16. Fontecha, John E. & Nikolaev, Alexander & Walteros, Jose L. & Zhu, Zhenduo, 2022. "Scientists wanted? A literature review on incentive programs that promote pro-environmental consumer behavior: Energy, waste, and water," Socio-Economic Planning Sciences, Elsevier, vol. 82(PA).
    17. Lu, Renzhi & Bai, Ruichang & Ding, Yuemin & Wei, Min & Jiang, Junhui & Sun, Mingyang & Xiao, Feng & Zhang, Hai-Tao, 2021. "A hybrid deep learning-based online energy management scheme for industrial microgrid," Applied Energy, Elsevier, vol. 304(C).
    18. Mohseni, Soheil & Brent, Alan C. & Kelly, Scott & Browne, Will N., 2022. "Demand response-integrated investment and operational planning of renewable and sustainable energy systems considering forecast uncertainties: A systematic review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 158(C).
    19. Park, Keonwoo & Moon, Ilkyeong, 2022. "Multi-agent deep reinforcement learning approach for EV charging scheduling in a smart grid," Applied Energy, Elsevier, vol. 328(C).
    20. Léonard Tschora & Erwan Pierre & Marc Plantevit & Céline Robardet, 2022. "Electricity price forecasting on the day-ahead market using machine learning," Post-Print hal-03621974, HAL.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:236:y:2019:i:c:p:937-949. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.