Printed from https://ideas.repec.org/a/wly/intnem/v32y2022i6ne2211.html

FGLB: A fine‐grained hardware intra‐server load balancer based on 100 G FPGA SmartNIC

Authors
  • Xiaoying Huang
  • Zhichuan Guo
  • Mangu Song

Abstract

In today's data centers, workloads comprising multiple services and requests are processed in parallel within servers that have many CPU cores. Effective intra‐server load balancing is therefore essential for improving CPU utilization. However, existing methods do not meet intra‐server load‐balancing requirements well in high‐throughput scenarios. Software‐based methods generally use CPU cores to parse and dispatch packets; they work well at low throughput, but at high throughput their CPU overhead leads to packet loss and high latency. Hardware‐based methods parse each packet, compute a hash over its metadata in hardware, and balance load in a coarse‐grained manner based on the hash value; they can operate at high throughput with low overhead but perform less well in balancing quality and flexibility. We therefore propose an intra‐server load balancer based on reconfigurable hardware (FPGA) to meet the load‐balancing requirements within servers in high‐speed application scenarios. Our method improves the load‐balancing granularity of the hardware‐based approach: it achieves not only high throughput but also good balancing quality and flexibility. We implemented and evaluated our method on a 100 G FPGA SmartNIC. The evaluation shows that, compared with the currently widely used hardware‐based method, our method reduces the load imbalance ratio by an order of magnitude when the flow‐size distribution is uneven.
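The coarse‐grained hardware approach described in the abstract hashes a packet's header fields and pins every packet of a flow to one core, so a single heavy ("elephant") flow can overload its core. A minimal Python sketch of that hash‐dispatch scheme and of a simple max/mean load‐imbalance metric (both illustrative assumptions for exposition — the packet field names and the metric are hypothetical, not the paper's FGLB design or its exact imbalance definition):

```python
# Illustrative sketch of coarse-grained, hash-based core selection
# (RSS-style dispatch), NOT the paper's fine-grained FGLB method.
import zlib

NUM_CORES = 8

def five_tuple(pkt):
    # pkt is a hypothetical dict holding the classic 5-tuple fields.
    return (pkt["src_ip"], pkt["dst_ip"],
            pkt["src_port"], pkt["dst_port"], pkt["proto"])

def dispatch_core(pkt):
    # Hash the 5-tuple; every packet of a flow maps to the same core,
    # which is what makes this scheme coarse-grained.
    key = "|".join(map(str, five_tuple(pkt))).encode()
    return zlib.crc32(key) % NUM_CORES

def imbalance_ratio(core_loads):
    # Max per-core load over mean load; 1.0 means perfectly balanced.
    mean = sum(core_loads) / len(core_loads)
    return max(core_loads) / mean if mean else 1.0
```

Because the core choice depends only on the flow's header hash, uneven flow sizes translate directly into uneven per‐core load — the imbalance the paper's fine‐grained approach targets.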

Suggested Citation

  • Xiaoying Huang & Zhichuan Guo & Mangu Song, 2022. "FGLB: A fine‐grained hardware intra‐server load balancer based on 100 G FPGA SmartNIC," International Journal of Network Management, John Wiley & Sons, vol. 32(6), November.
  • Handle: RePEc:wly:intnem:v:32:y:2022:i:6:n:e2211
    DOI: 10.1002/nem.2211

    Download full text from publisher

    File URL: https://doi.org/10.1002/nem.2211
    Download Restriction: no




    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.