
Energy-Harvesting Reinforcement Learning-based Offloading Decision Algorithm for Mobile Edge Computing Networks (EHRL)

Author

Listed:
  • Hend Bayoumi
  • Nahla B Abdel-Hamid
  • Amr MT Ali-Eldin
  • Labib M Labib

Abstract

Mobile Edge Computing (MEC) is a computational paradigm that brings resources closer to the network edge to provide fast and efficient computing services for Mobile Devices (MDs). However, MDs are often constrained by limited energy and computational resources, which are insufficient to handle large numbers of tasks. The limited energy and low computing capability of wireless nodes have led to the emergence of Wireless Power Transfer (WPT) and Energy Harvesting (EH) as a potential solution, in which electrical energy is transmitted wirelessly and harvested by MDs to power their operation. This paper considers a wireless-powered MEC network employing a binary offloading policy, in which the computation task of each MD is either executed locally or fully offloaded to an edge server (ES). The objective is to optimize binary offloading decisions under dynamic wireless channel conditions and energy-harvesting constraints. To this end, an Energy-Harvesting Reinforcement Learning-based Offloading Decision Algorithm (EHRL) is proposed. EHRL integrates Reinforcement Learning (RL) with Deep Neural Networks (DNNs) to dynamically optimize binary offloading decisions, obviating the need for manually labeled training data and for repeatedly solving complex optimization problems. To enhance the offloading decision-making process, the algorithm incorporates the Newton-Raphson method for fast and efficient optimization of the computation rate under energy constraints. The DNN is trained with the Nadam optimizer (Nesterov-accelerated Adaptive Moment Estimation), which combines the benefits of Adam and Nesterov momentum to improve convergence speed and training stability. The proposed algorithm thus addresses the dual challenges of limited energy availability at MDs and the need for efficient task offloading that minimizes latency and maximizes computational performance. Numerical results validate the superiority of the proposed approach, demonstrating significant gains in computation performance and time efficiency over conventional techniques and making real-time, optimal offloading design viable even in fast-fading environments.
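The Newton-Raphson step for computation-rate optimization mentioned in the abstract can be sketched as follows. This is a minimal illustration under assumed simplifications, not the paper's actual formulation: it takes the classic wireless-powered offloading model in which a fraction `a` of each time frame is spent harvesting energy and the remainder offloading, yielding a concave rate r(a) = (1 - a) · log2(1 + c·a/(1 - a)), where the hypothetical constant `c` lumps channel gain, WPT power, and noise terms together. Derivatives are taken numerically to keep the sketch short.

```python
import math

def rate(a, c):
    # Computation rate (up to constants) when a fraction `a` of the frame
    # harvests energy and the rest (1 - a) offloads the task:
    # r(a) = (1 - a) * log2(1 + c * a / (1 - a)), concave in a on (0, 1).
    return (1.0 - a) * math.log2(1.0 + c * a / (1.0 - a))

def newton_max(c, a0=0.5, tol=1e-8, max_iter=60, h=1e-4):
    """Maximise rate(., c) over (0, 1) with Newton-Raphson on r'(a) = 0,
    using central finite differences for r' and r''."""
    a = a0
    for _ in range(max_iter):
        d1 = (rate(a + h, c) - rate(a - h, c)) / (2 * h)            # r'(a)
        d2 = (rate(a + h, c) - 2 * rate(a, c) + rate(a - h, c)) / h**2  # r''(a)
        a_new = a - d1 / d2
        a_new = min(max(a_new, 1e-6), 1 - 1e-6)  # stay inside (0, 1)
        if abs(a_new - a) < tol:
            return a_new
        a = a_new
    return a

a_star = newton_max(c=100.0)
```

Because r is concave, r'' is negative everywhere and each Newton step moves toward the unique interior maximizer; the clamp merely guards against overshooting the open interval.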
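The Nadam update used for DNN training combines Adam's adaptive moment estimates with a Nesterov-style look-ahead on the first moment. A self-contained sketch of the standard update rule (not the paper's training code), driven on a toy quadratic objective:

```python
import math

def nadam_step(theta, grad, m, v, t, lr=0.01,
               beta1=0.9, beta2=0.999, eps=1e-8):
    """One Nadam update on a scalar parameter: Adam's bias-corrected
    moments plus a Nesterov look-ahead that mixes in the fresh gradient."""
    m = beta1 * m + (1 - beta1) * grad          # first moment (momentum)
    v = beta2 * v + (1 - beta2) * grad * grad   # second moment (scale)
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    # Nesterov correction: look ahead using the current gradient as well.
    m_bar = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)
    theta = theta - lr * m_bar / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Minimise f(theta) = theta**2 (gradient 2*theta) starting from theta = 3.
theta, m, v = 3.0, 0.0, 0.0
for t in range(1, 2001):
    theta, m, v = nadam_step(theta, 2.0 * theta, m, v, t)
```

In a framework-based implementation the same rule is typically available off the shelf (e.g. a `Nadam` optimizer class); the point of the sketch is the extra `m_bar` term, which is what distinguishes Nadam from plain Adam.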
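The RL-with-DNN decision loop for binary offloading is commonly structured as: the DNN emits a relaxed offloading score per device, several candidate binary decisions are quantized from it, and the candidate with the best evaluated computation rate is kept (and later used to retrain the network). The sketch below illustrates only the quantize-and-select step; the thresholding scheme, the `toy_rate` reward, and the synthetic "DNN output" are all illustrative assumptions, not the paper's design.

```python
import random

def candidate_decisions(relaxed, k):
    """Generate up to k binary offloading candidates from relaxed DNN
    outputs in (0, 1): threshold at 0.5 first, then at the outputs
    closest to 0.5 (the most ambiguous devices)."""
    cands = [[1 if r > 0.5 else 0 for r in relaxed]]
    order = sorted(range(len(relaxed)), key=lambda i: abs(relaxed[i] - 0.5))
    for i in order[:k - 1]:
        t = relaxed[i]
        cands.append([1 if r >= t else 0 for r in relaxed])
    return cands

def toy_rate(decision, gains):
    # Stand-in reward: an offloaded device (1) earns its channel gain,
    # local execution (0) earns a fixed 0.5. Purely illustrative.
    return sum(g if d == 1 else 0.5 for d, g in zip(decision, gains))

random.seed(0)
gains = [random.expovariate(1.0) for _ in range(6)]   # toy fading gains
relaxed = [g / (g + 0.5) for g in gains]              # pretend DNN scores
cands = candidate_decisions(relaxed, k=4)
best = max(cands, key=lambda d: toy_rate(d, gains))   # decision to apply
```

In a full training loop, the `(channel state, best)` pair would be appended to a replay memory from which the DNN is periodically retrained, closing the reinforcement loop without labeled data.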

Suggested Citation

  • Hend Bayoumi & Nahla B Abdel-Hamid & Amr MT Ali-Eldin & Labib M Labib, 2025. "Energy-Harvesting Reinforcement Learning-based Offloading Decision Algorithm for Mobile Edge Computing Networks (EHRL)," PLOS ONE, Public Library of Science, vol. 20(11), pages 1-18, November.
  • Handle: RePEc:plo:pone00:0336903
    DOI: 10.1371/journal.pone.0336903

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0336903
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0336903&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0336903?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.