Printed from https://ideas.repec.org/a/gam/jmathe/v11y2023i1p217-d1022114.html

Finding the Optimal Topology of an Approximating Neural Network

Author

Listed:
  • Kostadin Yotov

    (Faculty of Mathematics and Informatics, University of Plovdiv Paisii Hilendarski, 236 Bulgaria Blvd., 4027 Plovdiv, Bulgaria)

  • Emil Hadzhikolev

    (Faculty of Mathematics and Informatics, University of Plovdiv Paisii Hilendarski, 236 Bulgaria Blvd., 4027 Plovdiv, Bulgaria)

  • Stanka Hadzhikoleva

    (Faculty of Mathematics and Informatics, University of Plovdiv Paisii Hilendarski, 236 Bulgaria Blvd., 4027 Plovdiv, Bulgaria)

  • Stoyan Cheresharov

    (Faculty of Mathematics and Informatics, University of Plovdiv Paisii Hilendarski, 236 Bulgaria Blvd., 4027 Plovdiv, Bulgaria)

Abstract

Researchers spend considerable time searching for the most efficient neural network to solve a given problem. Each experimental neural network must be configured, trained, tested, and compared against its expected performance. The configuration parameters—training methods, transfer functions, number of hidden layers, number of neurons, number of epochs, and tolerable error—each admit many possible values. Guidelines for choosing appropriate parameter values would shorten the time required to create an efficient neural network, assist researchers, and provide a tool for improving automated neural network search methods. The task considered in this paper is the determination of upper bounds on the number of hidden layers, and on the number of neurons in them, for approximating artificial neural networks trained with algorithms that use the Jacobian matrix in the error function. The derived formulas for these upper bounds are proved theoretically, and the presented experiments confirm their validity. They show that the search for an efficient neural network can focus below certain upper bounds; above them, it becomes pointless. The formulas provide researchers with a useful auxiliary tool in the search for efficient neural networks with optimal topology. They are applicable to neural networks trained with methods such as Levenberg–Marquardt, Gauss–Newton, Bayesian regularization, scaled conjugate gradient, and BFGS quasi-Newton, which use the Jacobian matrix.
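The abstract does not reproduce the derived formulas, but the constraint they rest on can be illustrated. For Jacobian-based training, the Jacobian has one row per training sample and one column per trainable parameter, so a natural necessary condition is that the parameter count of the network not exceed the number of training samples. The sketch below is an illustrative assumption, not the paper's actual formulas: `mlp_parameter_count` and `within_sample_bound` are hypothetical helper names.

```python
def mlp_parameter_count(layer_sizes):
    """Count trainable parameters (weights + biases) of a fully
    connected feed-forward network.

    layer_sizes: [n_inputs, n_hidden_1, ..., n_outputs]
    Each consecutive layer pair (a, b) contributes a*b weights
    and b biases.
    """
    return sum(a * b + b for a, b in zip(layer_sizes, layer_sizes[1:]))


def within_sample_bound(layer_sizes, n_samples):
    """Check the illustrative constraint that the Jacobian
    (n_samples x n_params) has at least as many rows as columns,
    i.e. the topology is not over-parameterized for the data set."""
    return mlp_parameter_count(layer_sizes) <= n_samples


# A 1-10-1 network has 1*10+10 + 10*1+1 = 31 parameters,
# so 100 training samples satisfy the constraint:
print(mlp_parameter_count([1, 10, 1]))        # 31
print(within_sample_bound([1, 10, 1], 100))   # True
print(within_sample_bound([1, 50, 50, 1], 100))  # False: 2701 params
```

Under this (assumed) reading, searching over topologies whose parameter count exceeds the sample count is pointless, which matches the abstract's claim that the search can focus below certain upper bounds.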

Suggested Citation

  • Kostadin Yotov & Emil Hadzhikolev & Stanka Hadzhikoleva & Stoyan Cheresharov, 2023. "Finding the Optimal Topology of an Approximating Neural Network," Mathematics, MDPI, vol. 11(1), pages 1-18, January.
  • Handle: RePEc:gam:jmathe:v:11:y:2023:i:1:p:217-:d:1022114

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/11/1/217/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/11/1/217/
    Download Restriction: no
    ---><---

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:11:y:2023:i:1:p:217-:d:1022114. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.