
Survey of Optimization Algorithms in Modern Neural Networks

Author

Listed:
  • Ruslan Abdulkadirov

    (North-Caucasus Center for Mathematical Research, North-Caucasus Federal University, 355009 Stavropol, Russia)

  • Pavel Lyakhov

    (North-Caucasus Center for Mathematical Research, North-Caucasus Federal University, 355009 Stavropol, Russia
    Department of Mathematical Modeling, North-Caucasus Federal University, 355009 Stavropol, Russia)

  • Nikolay Nagornov

    (Department of Mathematical Modeling, North-Caucasus Federal University, 355009 Stavropol, Russia)

Abstract

The main goal of machine learning is the creation of self-learning algorithms in many areas of human activity. It allows a person to be replaced by artificial intelligence where production needs to be expanded. The theory of artificial neural networks, which have already replaced humans in many problems, remains the most well-utilized branch of machine learning. Thus, one must select appropriate neural network architectures, data processing, and advanced applied mathematics tools. A common challenge for these networks is achieving the highest accuracy in a short time. This problem can be addressed by modifying networks and improving data pre-processing, but accuracy then increases along with training time. By using optimization methods, one can improve the accuracy without increasing the training time. In this review, we consider the optimization algorithms that are used in neural networks. We present modifications of optimization algorithms of the first, second, and information-geometric order, the latter being related to information geometry for the Fisher–Rao and Bregman metrics. These optimizers have significantly influenced the development of neural networks through geometric and probabilistic tools. We present applications of all the given optimization algorithms, considering the types of neural networks. We then show ways to develop optimization algorithms in further research using modern neural networks. Fractional-order, bilevel, and gradient-free optimizers can replace classical gradient-based optimizers. Such approaches are applied in graph, spiking, complex-valued, quantum, and wavelet neural networks. Besides pattern recognition, time series prediction, and object detection, there are many other applications in machine learning: quantum computations, partial differential and integrodifferential equations, and stochastic processes.
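
As an illustration of the first-order optimizers the survey classifies, the minimal NumPy sketch below runs two standard update rules (heavy-ball momentum and Adam) on a toy strongly convex quadratic. The test objective, function names, and hyperparameter values are illustrative assumptions and are not taken from the paper.

# Minimal sketch (not from the paper): two first-order update rules of the kind
# the survey classifies, run on an illustrative strongly convex quadratic.
import numpy as np

def grad_quadratic(x, A, b):
    """Gradient of f(x) = 0.5 * x^T A x - b^T x (toy test objective)."""
    return A @ x - b

def sgd_momentum(x, A, b, lr=0.1, beta=0.9, steps=300):
    """Classical (heavy-ball) momentum: v <- beta*v + g, x <- x - lr*v."""
    v = np.zeros_like(x)
    for _ in range(steps):
        v = beta * v + grad_quadratic(x, A, b)
        x = x - lr * v
    return x

def adam(x, A, b, lr=0.05, b1=0.9, b2=0.999, eps=1e-8, steps=1000):
    """Adam: bias-corrected first/second moment estimates of the gradient.
    With a constant step it may settle only in a small neighbourhood of the
    minimizer on deterministic problems."""
    m = np.zeros_like(x)
    s = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = grad_quadratic(x, A, b)
        m = b1 * m + (1 - b1) * g
        s = b2 * s + (1 - b2) * g ** 2
        m_hat = m / (1 - b1 ** t)
        s_hat = s / (1 - b2 ** t)
        x = x - lr * m_hat / (np.sqrt(s_hat) + eps)
    return x

if __name__ == "__main__":
    A = np.array([[3.0, 0.2], [0.2, 1.0]])   # symmetric positive definite
    b = np.array([1.0, -1.0])
    x_star = np.linalg.solve(A, b)            # exact minimizer, for reference
    print("momentum :", sgd_momentum(np.zeros(2), A, b))
    print("adam     :", adam(np.zeros(2), A, b))
    print("exact    :", x_star)

The second-order and information-geometric methods discussed in the survey replace the raw gradient above with a preconditioned direction (for example, an inverse Hessian or inverse Fisher information product with the gradient), which is why they are typically costlier per iteration but can require fewer iterations.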

Suggested Citation

  • Ruslan Abdulkadirov & Pavel Lyakhov & Nikolay Nagornov, 2023. "Survey of Optimization Algorithms in Modern Neural Networks," Mathematics, MDPI, vol. 11(11), pages 1-37, May.
  • Handle: RePEc:gam:jmathe:v:11:y:2023:i:11:p:2466-:d:1157067

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/11/11/2466/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/11/11/2466/
    Download Restriction: no

    References listed on IDEAS

    1. Kim, Tae-Young & Cho, Sung-Bae, 2019. "Predicting residential energy consumption using CNN-LSTM neural networks," Energy, Elsevier, vol. 182(C), pages 72-81.
    2. Ran Li & Hui Sun & Xue Wei & Weiwen Ta & Haiying Wang, 2022. "Lithium Battery State-of-Charge Estimation Based on AdaBoost.Rt-RNN," Energies, MDPI, vol. 15(16), pages 1-15, August.
    3. Sexton, Randall S. & Dorsey, Robert E. & Johnson, John D., 1999. "Optimization of neural networks: A comparative analysis of the genetic algorithm and simulated annealing," European Journal of Operational Research, Elsevier, vol. 114(3), pages 589-601, May.
    4. Deren Han & Xiaoming Yuan, 2012. "A Note on the Alternating Direction Method of Multipliers," Journal of Optimization Theory and Applications, Springer, vol. 155(1), pages 227-238, October.
    5. Roberto Garrappa & Eva Kaslik & Marina Popolizio, 2019. "Evaluation of Fractional Integrals and Derivatives of Elementary Functions: Overview and Tutorial," Mathematics, MDPI, vol. 7(5), pages 1-21, May.
    6. Ruslan Abdulkadirov & Pavel Lyakhov & Nikolay Nagornov, 2022. "Accelerating Extreme Search of Multidimensional Functions Based on Natural Gradient Descent with Dirichlet Distributions," Mathematics, MDPI, vol. 10(19), pages 1-13, September.
    7. Kathiresan Shankar & Sachin Kumar & Ashit Kumar Dutta & Ahmed Alkhayyat & Anwar Ja’afar Mohamad Jawad & Ali Hashim Abbas & Yousif K. Yousif, 2022. "An Automated Hyperparameter Tuning Recurrent Neural Network Model for Fruit Classification," Mathematics, MDPI, vol. 10(13), pages 1-18, July.
    8. Likas, Aristidis & Stafylopatis, Andreas, 2000. "Training the random neural network using quasi-Newton methods," European Journal of Operational Research, Elsevier, vol. 126(2), pages 331-339, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Nannan Xu & Xinze Cui & Xin Wang & Wei Zhang & Tianyu Zhao, 2022. "An Intelligent Athlete Signal Processing Methodology for Balance Control Ability Assessment with Multi-Headed Self-Attention Mechanism," Mathematics, MDPI, vol. 10(15), pages 1-16, August.
    2. Maryam Yashtini, 2022. "Convergence and rate analysis of a proximal linearized ADMM for nonconvex nonsmooth optimization," Journal of Global Optimization, Springer, vol. 84(4), pages 913-939, December.
    3. Puya Latafat & Panagiotis Patrinos, 2017. "Asymmetric forward–backward–adjoint splitting for solving monotone inclusions involving three operators," Computational Optimization and Applications, Springer, vol. 68(1), pages 57-93, September.
    4. Abubakar Ahmad Musa & Adamu Hussaini & Weixian Liao & Fan Liang & Wei Yu, 2023. "Deep Neural Networks for Spatial-Temporal Cyber-Physical Systems: A Survey," Future Internet, MDPI, vol. 15(6), pages 1-24, May.
    5. Lan, Puzhe & Han, Dong & Xu, Xiaoyuan & Yan, Zheng & Ren, Xijun & Xia, Shiwei, 2022. "Data-driven state estimation of integrated electric-gas energy system," Energy, Elsevier, vol. 252(C).
    6. Ijaz Ul Haq & Amin Ullah & Samee Ullah Khan & Noman Khan & Mi Young Lee & Seungmin Rho & Sung Wook Baik, 2021. "Sequential Learning-Based Energy Consumption Prediction Model for Residential and Commercial Sectors," Mathematics, MDPI, vol. 9(6), pages 1-17, March.
    7. Lu, Yakai & Tian, Zhe & Zhou, Ruoyu & Liu, Wenjing, 2021. "A general transfer learning-based framework for thermal load prediction in regional energy system," Energy, Elsevier, vol. 217(C).
    8. Sun, Hongchang & Niu, Yanlei & Li, Chengdong & Zhou, Changgeng & Zhai, Wenwen & Chen, Zhe & Wu, Hao & Niu, Lanqiang, 2022. "Energy consumption optimization of building air conditioning system via combining the parallel temporal convolutional neural network and adaptive opposition-learning chimp algorithm," Energy, Elsevier, vol. 259(C).
    9. Luo, X.J. & Oyedele, Lukumon O. & Ajayi, Anuoluwapo O. & Akinade, Olugbenga O. & Owolabi, Hakeem A. & Ahmed, Ashraf, 2020. "Feature extraction and genetic algorithm enhanced adaptive deep neural network for energy consumption prediction in buildings," Renewable and Sustainable Energy Reviews, Elsevier, vol. 131(C).
    10. Namrye Son, 2021. "Comparison of the Deep Learning Performance for Short-Term Power Load Forecasting," Sustainability, MDPI, vol. 13(22), pages 1-25, November.
    11. William W. Hager & Hongchao Zhang, 2020. "Convergence rates for an inexact ADMM applied to separable convex optimization," Computational Optimization and Applications, Springer, vol. 77(3), pages 729-754, December.
    12. Wu, Han & Liang, Yan & Heng, Jiani, 2023. "Pulse-diagnosis-inspired multi-feature extraction deep network for short-term electricity load forecasting," Applied Energy, Elsevier, vol. 339(C).
    13. Zeng, Huibin & Shao, Bilin & Dai, Hongbin & Yan, Yichuan & Tian, Ning, 2023. "Prediction of fluctuation loads based on GARCH family-CatBoost-CNNLSTM," Energy, Elsevier, vol. 263(PE).
    14. Jinyuan Liu & Shouxi Wang & Nan Wei & Yi Yang & Yihao Lv & Xu Wang & Fanhua Zeng, 2023. "An Enhancement Method Based on Long Short-Term Memory Neural Network for Short-Term Natural Gas Consumption Forecasting," Energies, MDPI, vol. 16(3), pages 1-14, January.
    15. Geraint Johnes, 2000. "Up Around the Bend: Linear and nonlinear models of the UK economy compared," International Review of Applied Economics, Taylor & Francis Journals, vol. 14(4), pages 485-493.
    16. Zizhen Cheng & Li Wang & Yumeng Yang, 2023. "A Hybrid Feature Pyramid CNN-LSTM Model with Seasonal Inflection Month Correction for Medium- and Long-Term Power Load Forecasting," Energies, MDPI, vol. 16(7), pages 1-18, March.
    17. Hyunsoo Kim & Jiseok Jeong & Changwan Kim, 2022. "Daily Peak-Electricity-Demand Forecasting Based on Residual Long Short-Term Network," Mathematics, MDPI, vol. 10(23), pages 1-17, November.
    18. Pendharkar, Parag C., 2002. "A computational study on the performance of artificial neural networks under changing structural design and data distribution," European Journal of Operational Research, Elsevier, vol. 138(1), pages 155-177, April.
    19. Li, Penghua & Zhang, Zijian & Grosu, Radu & Deng, Zhongwei & Hou, Jie & Rong, Yujun & Wu, Rui, 2022. "An end-to-end neural network framework for state-of-health estimation and remaining useful life prediction of electric vehicle lithium batteries," Renewable and Sustainable Energy Reviews, Elsevier, vol. 156(C).
    20. Joo, Rocío & Bertrand, Sophie & Chaigneau, Alexis & Ñiquen, Miguel, 2011. "Optimization of an artificial neural network for identifying fishing set positions from VMS data: An example from the Peruvian anchovy purse seine fishery," Ecological Modelling, Elsevier, vol. 222(4), pages 1048-1059.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:11:y:2023:i:11:p:2466-:d:1157067. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.