Authors
- Snehal Chaflekar
- Rajendra Rewatkar
Abstract
Load balancing in a microservice architecture is essential for optimizing resource utilization and maintaining high availability. Traditional load-balancing algorithms such as First-Come-First-Serve (FCFS) and Round Robin often lead to inefficiencies because they do not account for server capabilities or varying request sizes. Machine Learning (ML) offers a promising alternative by predicting future load patterns and distributing requests more effectively. In this study, we propose a novel attention mechanism-based deep Long Short-Term Memory (LSTM) network (AMDLN) for web server load prediction. Our methodology involves detailed data preprocessing, sequence creation, and scaling to prepare the NASA HTTP dataset for model training. The attention mechanism enhances the LSTM network's ability to focus on the relevant parts of each input sequence, significantly improving predictive accuracy. Compared to traditional approaches such as linear regression, polynomial regression, L2-regularized (ridge) regression, decision tree regression, XGBoost, and ARIMA, our model achieves the lowest Mean Squared Error (MSE) of 187,293.59 and Root Mean Squared Error (RMSE) of 432.77, with a strong R-squared value of 0.8532. This superior performance highlights the model's effectiveness in capturing both short-term and long-term dependencies in the data. The predictive model can be integrated into dynamic and efficient load-balancing frameworks: in a microservices environment, accurate future load predictions from AMDLN optimize resource utilization and reduce infrastructure costs by informing decisions to scale microservices up and down.
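This listing carries no code, so the sketch below is only a minimal, hypothetical illustration of the pipeline the abstract describes (sliding-window sequence creation, min-max scaling, an attention layer over LSTM hidden states, and evaluation with MSE, RMSE, and R-squared), assuming a Keras implementation. The window size, layer widths, training settings, and the synthetic stand-in for the NASA HTTP request counts are illustrative assumptions, not the authors' choices.

    # Minimal sketch (not the authors' code) of the pipeline described
    # in the abstract. All hyperparameters here are hypothetical.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, Model

    def make_sequences(series, window=24):
        # Turn a 1-D load series into (samples, window, 1) inputs
        # and next-step targets.
        X, y = [], []
        for i in range(len(series) - window):
            X.append(series[i:i + window])
            y.append(series[i + window])
        return np.array(X)[..., None], np.array(y)

    # Synthetic stand-in for per-interval NASA HTTP request counts;
    # min-max scale to [0, 1] before windowing, as the abstract describes.
    series = np.random.rand(1000).astype("float32")
    lo, hi = series.min(), series.max()
    scaled = (series - lo) / (hi - lo)
    X, y = make_sequences(scaled)

    inputs = layers.Input(shape=(24, 1))
    h = layers.LSTM(64, return_sequences=True)(inputs)   # per-step hidden states
    # Simple additive attention: score each time step, softmax-normalize,
    # then sum the hidden states weighted by those scores.
    scores = layers.Dense(1, activation="tanh")(h)
    weights = layers.Softmax(axis=1)(scores)
    context = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([h, weights])
    outputs = layers.Dense(1)(context)

    model = Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=10, batch_size=32, verbose=0)

    # Metrics reported in the abstract (MSE, RMSE, R^2), computed here
    # on the training data purely for illustration.
    pred = model.predict(X, verbose=0).squeeze()
    mse = float(np.mean((pred - y) ** 2))
    rmse = mse ** 0.5
    r2 = 1.0 - mse / float(np.var(y))
    print(f"MSE={mse:.6f} RMSE={rmse:.6f} R2={r2:.4f}")

The softmax over per-step scores lets the model emphasize the time steps most predictive of the next value, which is the role the abstract attributes to the attention mechanism.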
Suggested Citation
Snehal Chaflekar & Rajendra Rewatkar, 2025. "Novel load prediction in microservice architecture using attention mechanism-based deep LSTM networks," International Journal of Innovative Research and Scientific Studies, Innovative Research Publishing, vol. 8(3), pages 1046-1058.
Handle: RePEc:aac:ijirss:v:8:y:2025:i:3:p:1046-1058:id:6751