
The Neural Network Classifier Works Efficiently on Searching in DQN Using the Autonomous Internet of Things Hybridized by the Metaheuristic Techniques to Reduce the EVs’ Service Scheduling Time

Author

Listed:
  • Ahmed M. Abed

    (Department of Industrial Engineering, College of Engineering, Prince Sattam Bin Abdulaziz University, Alkharj 16273, Saudi Arabia;
    Industrial Engineering Department, Zagazig University, Zagazig 44519, Egypt)

  • Ali AlArjani

    (Department of Industrial Engineering, College of Engineering, Prince Sattam Bin Abdulaziz University, Alkharj 16273, Saudi Arabia)

Abstract

Since rules and regulations strongly emphasize environmental preservation and greenhouse gas (GHG) reduction, researchers have observed a progressive shift in transportation toward electromobility. Several challenges must be resolved to deploy EVs, beginning with improving network accessibility and bidirectional interoperability, reducing the uncertainty about the availability of suitable charging stations along the trip path, and reducing the total service time. A deep Q-network (DQN) supported by the Autonomous Internet of Things (AIoT) is therefore proposed to pair EVs' charging requests with station invitations and reduce idle queueing time, which is crucial for long travel distances. The proposed methodology was implemented in MATLAB and addresses significant parameters such as the battery charge level, trip distance, nearby charging stations, and average service time. Its effectiveness derives from hybridizing meta-heuristic techniques into the search over DQN learning steps, which yields a solution quickly and improves the service time by 34%, after resolving various EV charging scheduling difficulties, controlling congestion, and enabling EV drivers to plan extended trips. Results obtained from more than 2145 hypothetical training examples of EV requests were compared with the Bayesian Normalized Neural Network (BASNNC) algorithm, which hybridizes Beetle Antennae Search with a Neural Network Classifier, and with other methods such as Grey Wolf Optimization (GWO) and Sine-cosine and Whale optimization, revealing that the mean overall comparison efficiencies in error reduction were 72.75%, 58.7%, and 18.2%, respectively.
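
The pairing step described in the abstract can be illustrated with a minimal sketch: score each candidate charging station from the state features named above (battery charge level, trip distance, distance to the station, and average service time) and select a station epsilon-greedily, as standard DQN exploration does. The Python sketch below is a hypothetical illustration only; all names, weights, and data values are assumptions, and it does not reproduce the authors' MATLAB implementation or their metaheuristic-hybridized training.

    # Hypothetical sketch (not the authors' MATLAB code). A fixed linear scorer
    # stands in for a trained Q-network; in the paper, the Q-values would come
    # from DQN training hybridized with metaheuristic search.
    import random

    # Assumed feature weights, for illustration only.
    WEIGHTS = {"battery_level": 1.5, "trip_distance": -0.2,
               "station_distance": -1.0, "avg_service_time": -0.8}

    def q_value(ev, station):
        """Stand-in estimate of the value of pairing this EV request with this station."""
        features = {
            "battery_level": ev["battery_level"],            # state of charge, 0..1
            "trip_distance": ev["trip_distance_km"] / 100,    # remaining trip, scaled
            "station_distance": station["distance_km"] / 10,  # detour to the station, scaled
            "avg_service_time": station["avg_service_min"] / 60,
        }
        return sum(WEIGHTS[k] * v for k, v in features.items())

    def choose_station(ev, stations, epsilon=0.1):
        """Epsilon-greedy action selection, as in standard DQN exploration."""
        if random.random() < epsilon:
            return random.choice(stations)                    # explore
        return max(stations, key=lambda s: q_value(ev, s))    # exploit

    if __name__ == "__main__":
        ev = {"battery_level": 0.25, "trip_distance_km": 180}
        stations = [
            {"id": "CS-1", "distance_km": 4.0, "avg_service_min": 35},
            {"id": "CS-2", "distance_km": 9.5, "avg_service_min": 15},
        ]
        print(choose_station(ev, stations)["id"])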

Suggested Citation

  • Ahmed M. Abed & Ali AlArjani, 2022. "The Neural Network Classifier Works Efficiently on Searching in DQN Using the Autonomous Internet of Things Hybridized by the Metaheuristic Techniques to Reduce the EVs’ Service Scheduling Time," Energies, MDPI, vol. 15(19), pages 1-25, September.
  • Handle: RePEc:gam:jeners:v:15:y:2022:i:19:p:6992-:d:923403

    Download full text from publisher

    File URL: https://www.mdpi.com/1996-1073/15/19/6992/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/1996-1073/15/19/6992/
    Download Restriction: no

    References listed on IDEAS

    1. Zhang, Xu & Peng, Linyu & Cao, Yue & Liu, Shuohan & Zhou, Huan & Huang, Keli, 2020. "Towards holistic charging management for urban electric taxi via a hybrid deployment of battery charging and swap stations," Renewable Energy, Elsevier, vol. 155(C), pages 703-716.
    2. Zhang, Jing & Yan, Jie & Liu, Yongqian & Zhang, Haoran & Lv, Guoliang, 2020. "Daily electric vehicle charging load profiles considering demographics of vehicle users," Applied Energy, Elsevier, vol. 274(C).
    3. Yongguang Liu & Wei Chen & Zhu Huang & Mohamed El Ghami, 2021. "Reinforcement Learning-Based Multiple Constraint Electric Vehicle Charging Service Scheduling," Mathematical Problems in Engineering, Hindawi, vol. 2021, pages 1-12, November.
    4. Sunyong Kim & Hyuk Lim, 2018. "Reinforcement Learning Based Energy Management Algorithm for Smart Energy Buildings," Energies, MDPI, vol. 11(8), pages 1-19, August.
    5. Yvenn Amara-Ouali & Yannig Goude & Pascal Massart & Jean-Michel Poggi & Hui Yan, 2021. "A Review of Electric Vehicle Load Open Data and Models," Energies, MDPI, vol. 14(8), pages 1-35, April.
    6. Aritra Ghosh, 2020. "Possibilities and Challenges for the Inclusion of the Electric Vehicle (EV) to Reduce the Carbon Footprint in the Transport Sector: A Review," Energies, MDPI, vol. 13(10), pages 1-22, May.
    7. Volodymyr Mnih & Koray Kavukcuoglu & David Silver & Andrei A. Rusu & Joel Veness & Marc G. Bellemare & Alex Graves & Martin Riedmiller & Andreas K. Fidjeland & Georg Ostrovski & Stig Petersen & Charle, 2015. "Human-level control through deep reinforcement learning," Nature, Nature, vol. 518(7540), pages 529-533, February.
    8. Wang, Jian Qi & Du, Yu & Wang, Jing, 2020. "LSTM based long-term energy consumption prediction with periodicity," Energy, Elsevier, vol. 197(C).
    9. Ma, Tai-Yu & Faye, Sébastien, 2022. "Multistep electric vehicle charging station occupancy prediction using hybrid LSTM neural networks," Energy, Elsevier, vol. 244(PB).
    10. Felipe Condon Silva & Mohamed A. Ahmed & José Manuel Martínez & Young-Chon Kim, 2019. "Design and Implementation of a Blockchain-Based Energy Trading Platform for Electric Vehicles in Smart Campus Parking Lots," Energies, MDPI, vol. 12(24), pages 1-25, December.
    11. Tuchnitz, Felix & Ebell, Niklas & Schlund, Jonas & Pruckner, Marco, 2021. "Development and Evaluation of a Smart Charging Strategy for an Electric Vehicle Fleet Based on Reinforcement Learning," Applied Energy, Elsevier, vol. 285(C).
    12. Ying Ji & Jianhui Wang & Jiacan Xu & Xiaoke Fang & Huaguang Zhang, 2019. "Real-Time Energy Management of a Microgrid Using Deep Reinforcement Learning," Energies, MDPI, vol. 12(12), pages 1-21, June.
    13. Ruisheng Wang & Zhong Chen & Qiang Xing & Ziqi Zhang & Tian Zhang, 2022. "A Modified Rainbow-Based Deep Reinforcement Learning Method for Optimal Scheduling of Charging Station," Sustainability, MDPI, vol. 14(3), pages 1-14, February.
    14. Ki-Beom Lee & Mohamed A. Ahmed & Dong-Ki Kang & Young-Chon Kim, 2020. "Deep Reinforcement Learning Based Optimal Route and Charging Station Selection," Energies, MDPI, vol. 13(23), pages 1-22, November.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ki-Beom Lee & Mohamed A. Ahmed & Dong-Ki Kang & Young-Chon Kim, 2020. "Deep Reinforcement Learning Based Optimal Route and Charging Station Selection," Energies, MDPI, vol. 13(23), pages 1-22, November.
    2. Ritu Kandari & Neeraj Neeraj & Alexander Micallef, 2022. "Review on Recent Strategies for Integrating Energy Storage Systems in Microgrids," Energies, MDPI, vol. 16(1), pages 1-24, December.
    3. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    4. Yujian Ye & Dawei Qiu & Huiyu Wang & Yi Tang & Goran Strbac, 2021. "Real-Time Autonomous Residential Demand Response Management Based on Twin Delayed Deep Deterministic Policy Gradient Learning," Energies, MDPI, vol. 14(3), pages 1-22, January.
    5. Dimitrios Vamvakas & Panagiotis Michailidis & Christos Korkas & Elias Kosmatopoulos, 2023. "Review and Evaluation of Reinforcement Learning Frameworks on Smart Grid Applications," Energies, MDPI, vol. 16(14), pages 1-38, July.
    6. Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
    7. Alqahtani, Mohammed & Hu, Mengqi, 2022. "Dynamic energy scheduling and routing of multiple electric vehicles using deep reinforcement learning," Energy, Elsevier, vol. 244(PA).
    8. Yuhong Wang & Lei Chen & Hong Zhou & Xu Zhou & Zongsheng Zheng & Qi Zeng & Li Jiang & Liang Lu, 2021. "Flexible Transmission Network Expansion Planning Based on DQN Algorithm," Energies, MDPI, vol. 14(7), pages 1-21, April.
    9. Yang, Ting & Zhao, Liyuan & Li, Wei & Zomaya, Albert Y., 2021. "Dynamic energy dispatch strategy for integrated energy system based on improved deep reinforcement learning," Energy, Elsevier, vol. 235(C).
    10. Wang, Yi & Qiu, Dawei & Sun, Mingyang & Strbac, Goran & Gao, Zhiwei, 2023. "Secure energy management of multi-energy microgrid: A physical-informed safe reinforcement learning approach," Applied Energy, Elsevier, vol. 335(C).
    11. Bio Gassi, Karim & Baysal, Mustafa, 2023. "Improving real-time energy decision-making model with an actor-critic agent in modern microgrids with energy storage devices," Energy, Elsevier, vol. 263(PE).
    12. Rae-Jun Park & Kyung-Bin Song & Bo-Sung Kwon, 2020. "Short-Term Load Forecasting Algorithm Using a Similar Day Selection Method Based on Reinforcement Learning," Energies, MDPI, vol. 13(10), pages 1-19, May.
    13. Denis Sidorov & Daniil Panasetsky & Nikita Tomin & Dmitriy Karamov & Aleksei Zhukov & Ildar Muftahov & Aliona Dreglea & Fang Liu & Yong Li, 2020. "Toward Zero-Emission Hybrid AC/DC Power Systems with Renewable Energy Sources and Storages: A Case Study from Lake Baikal Region," Energies, MDPI, vol. 13(5), pages 1-18, March.
    14. Fescioglu-Unver, Nilgun & Yıldız Aktaş, Melike, 2023. "Electric vehicle charging service operations: A review of machine learning applications for infrastructure planning, control, pricing and routing," Renewable and Sustainable Energy Reviews, Elsevier, vol. 188(C).
    15. Shubham Mishra & Shrey Verma & Subhankar Chowdhury & Ambar Gaur & Subhashree Mohapatra & Gaurav Dwivedi & Puneet Verma, 2021. "A Comprehensive Review on Developments in Electric Vehicle Charging Station Infrastructure and Present Scenario of India," Sustainability, MDPI, vol. 13(4), pages 1-20, February.
    16. Grace Muriithi & Sunetra Chowdhury, 2021. "Optimal Energy Management of a Grid-Tied Solar PV-Battery Microgrid: A Reinforcement Learning Approach," Energies, MDPI, vol. 14(9), pages 1-24, May.
    17. Lilia Tightiz & Joon Yoo, 2022. "A Review on a Data-Driven Microgrid Management System Integrating an Active Distribution Network: Challenges, Issues, and New Trends," Energies, MDPI, vol. 15(22), pages 1-24, November.
    18. Alexandra Märtz & Uwe Langenmayr & Sabrina Ried & Katrin Seddig & Patrick Jochem, 2022. "Charging Behavior of Electric Vehicles: Temporal Clustering Based on Real-World Data," Energies, MDPI, vol. 15(18), pages 1-26, September.
    19. Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
    20. Ahmed Ismail & Mustafa Baysal, 2023. "Dynamic Pricing Based on Demand Response Using Actor–Critic Agent Reinforcement Learning," Energies, MDPI, vol. 16(14), pages 1-19, July.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jeners:v:15:y:2022:i:19:p:6992-:d:923403. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.