Author
Listed:
- Pritam Paral
(IIEST, Department of Electrical Engineering)
- Amitava Chatterjee
(Jadavpur University, Department of Electrical Engineering)
- Patrick Siarry
(Université Paris-Est Créteil)
Abstract
The optimization of hyperparameters and of the learning process associated with supervised machine learning models, including artificial neural networks and deep learning architectures, is viewed as one of the most formidable challenges in machine learning. Many prior investigations have applied gradient-based backpropagation to the training of deep learning architectures. However, gradient-based methods exhibit notable disadvantages, such as a tendency to become trapped in local minima when dealing with multi-objective cost functions, extensive gradient computations over numerous iterations, and the requirement that cost functions be continuous. Given that the training process for a diverse array of supervised machine learning techniques is identified as an NP-hard optimization problem, there has been a considerable surge in the adoption of heuristic and metaheuristic algorithms for optimizing the structures and parameters of these approaches. In this context, metaheuristic optimizers are designed to find optimal estimates for the various components of supervised machine learning models (e.g., the weights, hyperparameters, number of layers or neurons, and learning rate of a deep learning model, or the hyperparameters of the Gaussian process regression (GPR) model). Many researchers develop innovative hybrid algorithms that integrate metaheuristics to optimize the hyperparameters of supervised machine learning models. Such hybrid metaheuristics boost algorithm performance and are proficient at addressing complex optimization problems. In general, a well-functioning metaheuristic ought to strike a favorable compromise between its exploration and exploitation characteristics.
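As a minimal sketch of the idea above (not the chapter's method), the following toy example tunes two hypothetical hyperparameters — a learning-rate exponent and a hidden-layer width — of an assumed multimodal "validation loss" with a simple evolutionary metaheuristic. Broad random initialization and large early mutations provide exploration; truncation selection with a decaying mutation scale provides exploitation.

```python
import math
import random

random.seed(0)

# Toy multimodal "validation loss" over two hypothetical hyperparameters.
# The objective and its optimum location are assumptions for illustration only.
def val_loss(log_lr, width):
    return ((log_lr + 3.0) ** 2 + ((width - 64.0) / 32.0) ** 2
            + 0.3 * math.sin(5.0 * log_lr) ** 2
            + 0.3 * math.sin(0.2 * width) ** 2)

def evolve(pop_size=20, generations=60):
    # Broad random initial population: exploration of the search space.
    pop = [(random.uniform(-6.0, 0.0), random.uniform(1.0, 256.0))
           for _ in range(pop_size)]
    sigma = (1.0, 32.0)  # per-coordinate mutation scales
    for g in range(generations):
        pop.sort(key=lambda p: val_loss(*p))
        parents = pop[: pop_size // 4]   # truncation selection (elitist)
        decay = 0.95 ** g                # shrinking mutations: exploitation
        children = []
        for _ in range(pop_size - len(parents)):
            lr, w = random.choice(parents)
            children.append((lr + random.gauss(0.0, sigma[0] * decay),
                             min(256.0, max(1.0, w + random.gauss(0.0, sigma[1] * decay)))))
        pop = parents + children
    return min(pop, key=lambda p: val_loss(*p))

best = evolve()
best_loss = val_loss(*best)
```

Because the parents survive unmutated, the best candidate never worsens between generations; the decay schedule is one common way to shift the exploration/exploitation balance over time.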
This chapter comprehensively reviews two recent advancements in the application of metaheuristic optimizers to supervised machine learning, which are distinctly diverse in character. First, it offers a detailed review of a modern hybrid intrusion detection technique that employs a black widow-optimized convolutional long short-term memory neural network. It then provides an in-depth analysis of a recent investigation into the metaheuristic optimization of hyperparameters for the GPR model in FEREBUS, the GPR engine embedded within the broader FFLUX framework, an innovative machine-learned force field. The integration of metaheuristics with supervised machine learning models is projected to expedite the training process in the coming years. However, the concept of evolutionary hybrid architecture remains underutilized in the existing literature, with relevant publications being relatively uncommon.
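To make the second theme concrete, here is a hedged sketch of GPR hyperparameter tuning: a simple (1+1) hill-climbing metaheuristic minimizes the negative log marginal likelihood of an RBF kernel on toy data. The kernel choice, data, and search scheme are illustrative assumptions and not the optimizers actually used in FEREBUS.

```python
import math
import random

def rbf(x1, x2, length_scale, signal_var):
    # Squared-exponential (RBF) covariance between two scalar inputs.
    return signal_var * math.exp(-0.5 * (x1 - x2) ** 2 / length_scale ** 2)

def cholesky(A):
    # Lower-triangular Cholesky factor of a small positive-definite matrix.
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)  # raises ValueError if not PD
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

def neg_log_marginal_likelihood(theta, xs, ys):
    # theta = (length_scale, signal_var, noise_var); returns +inf when the
    # candidate kernel matrix is numerically non-positive-definite.
    ls, sv, nv = theta
    n = len(xs)
    K = [[rbf(xs[i], xs[j], ls, sv) + (nv if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    try:
        L = cholesky(K)
    except (ValueError, ZeroDivisionError):
        return float("inf")
    # Forward-substitute L z = y, so y^T K^-1 y = z^T z and log|K| = 2 sum(log L_ii).
    z = [0.0] * n
    for i in range(n):
        z[i] = (ys[i] - sum(L[i][k] * z[k] for k in range(i))) / L[i][i]
    quad = sum(v * v for v in z)
    logdet = 2.0 * sum(math.log(L[i][i]) for i in range(n))
    return 0.5 * quad + 0.5 * logdet + 0.5 * n * math.log(2.0 * math.pi)

# Toy training data (assumed for illustration).
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [math.sin(x) for x in xs]

random.seed(1)
log_theta = [0.0, 0.0, math.log(0.1)]  # log-space search keeps parameters positive
init_nlml = neg_log_marginal_likelihood([math.exp(t) for t in log_theta], xs, ys)
best_nlml, step = init_nlml, 0.5
for _ in range(300):
    cand = [t + random.gauss(0.0, step) for t in log_theta]
    val = neg_log_marginal_likelihood([math.exp(t) for t in cand], xs, ys)
    if val < best_nlml:                 # greedy acceptance: exploitation
        log_theta, best_nlml = cand, val
    else:
        step = max(0.05, step * 0.99)   # slowly narrow the search
best_theta = [math.exp(t) for t in log_theta]
```

The same objective is what gradient-based GPR training differentiates; a metaheuristic only needs function values, which is why it tolerates the discontinuities and local minima the abstract describes.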
Suggested Citation
Pritam Paral & Amitava Chatterjee & Patrick Siarry, 2025.
"Innovative Applications of Metaheuristics to Supervised Machine Learning,"
Springer Books, in: Rafael Martí & Panos M. Pardalos & Mauricio G.C. Resende (ed.), Handbook of Heuristics, edition 0, chapter 7, pages 147-176,
Springer.
Handle:
RePEc:spr:sprchp:978-3-032-00385-0_66
DOI: 10.1007/978-3-032-00385-0_66
Download full text from publisher
To our knowledge, this item is not available for download. To find whether it is available, there are three options:
1. Check below whether another version of this item is available online.
2. Check on the provider's web page whether it is in fact available.
3. Perform a search for a similarly titled item that would be available.
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:sprchp:978-3-032-00385-0_66. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .
Please note that corrections may take a couple of weeks to filter through
the various RePEc services.