IDEAS home Printed from https://ideas.repec.org/a/gam/jmathe/v13y2025i13p2088-d1687106.html

On the Synergy of Optimizers and Activation Functions: A CNN Benchmarking Study

Author

Listed:
  • Khuraman Aziz Sayın

    (Department of Mathematics, Ege University, Bornova, Izmir 35040, Türkiye)

  • Necla Kırcalı Gürsoy

    (Department of Computer Programming, Ege University, Bornova, Izmir 35040, Türkiye)

  • Türkay Yolcu

    (Department of Mathematics, Bradley University, Peoria, IL 61625, USA)

  • Arif Gürsoy

    (Department of Mathematics, Ege University, Bornova, Izmir 35040, Türkiye)

Abstract

In this study, we present a comparative analysis of gradient descent-based optimizers frequently used in Convolutional Neural Networks (CNNs), including SGD, mSGD, RMSprop, Adadelta, Nadam, Adamax, Adam, and the recent EVE optimizer. To explore the interaction between optimization strategies and activation functions, we systematically evaluate all combinations of these optimizers with four activation functions—ReLU, LeakyReLU, Tanh, and GELU—across three benchmark image classification datasets: CIFAR-10, Fashion-MNIST (F-MNIST), and Labeled Faces in the Wild (LFW). Each configuration was assessed using multiple evaluation metrics, including accuracy, precision, recall, F1-score, mean absolute error (MAE), and mean squared error (MSE). All experiments were performed using k-fold cross-validation to ensure statistical robustness. Additionally, two-way ANOVA was employed to validate the significance of differences across optimizer–activation combinations. This study aims to highlight the importance of jointly selecting optimizers and activation functions to enhance training dynamics and generalization in CNNs. We also consider the role of critical hyperparameters, such as learning rate and regularization methods, in influencing optimization stability. This work provides valuable insights into the optimizer–activation interplay and offers practical guidance for improving architectural and hyperparameter configurations in CNN-based deep learning models.
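The evaluation protocol described above — every optimizer paired with every activation function on every dataset, each assessed under k-fold cross-validation — can be sketched as follows. This is a minimal illustration, not the authors' code: the optimizer, activation, and dataset names are taken from the abstract, the CNN training itself is omitted, and `kfold_indices` is a simplified stand-in for a library splitter such as scikit-learn's `KFold`.

```python
from itertools import product

# Names taken from the abstract: 8 optimizers x 4 activations x 3 datasets.
OPTIMIZERS = ["SGD", "mSGD", "RMSprop", "Adadelta", "Nadam", "Adamax", "Adam", "EVE"]
ACTIVATIONS = ["ReLU", "LeakyReLU", "Tanh", "GELU"]
DATASETS = ["CIFAR-10", "F-MNIST", "LFW"]

def experiment_grid():
    """Enumerate every optimizer-activation-dataset configuration (96 total)."""
    return list(product(OPTIMIZERS, ACTIVATIONS, DATASETS))

def kfold_indices(n_samples, k=5):
    """Split sample indices into k contiguous, near-equal folds.

    Simplified stand-in for a cross-validation splitter: the first
    (n_samples % k) folds receive one extra sample each.
    """
    indices = list(range(n_samples))
    fold_size, remainder = divmod(n_samples, k)
    folds, start = [], 0
    for i in range(k):
        end = start + fold_size + (1 if i < remainder else 0)
        folds.append(indices[start:end])
        start = end
    return folds

# In the full protocol, each of the 96 configurations would be trained and
# scored on each fold, and the per-fold metrics (accuracy, precision, recall,
# F1, MAE, MSE) aggregated before the two-way ANOVA.
for optimizer, activation, dataset in experiment_grid():
    for fold in kfold_indices(n_samples=1000, k=5):
        pass  # train/evaluate the CNN here (omitted)
```

Enumerating the grid explicitly makes the factorial structure of the study visible, which is what the two-way ANOVA over optimizer–activation combinations relies on.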

Suggested Citation

  • Khuraman Aziz Sayın & Necla Kırcalı Gürsoy & Türkay Yolcu & Arif Gürsoy, 2025. "On the Synergy of Optimizers and Activation Functions: A CNN Benchmarking Study," Mathematics, MDPI, vol. 13(13), pages 1-36, June.
  • Handle: RePEc:gam:jmathe:v:13:y:2025:i:13:p:2088-:d:1687106

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/13/13/2088/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/13/13/2088/
    Download Restriction: no


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:13:y:2025:i:13:p:2088-:d:1687106. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.