
A Family of Multi-Step Subgradient Minimization Methods

Author

Listed:
  • Elena Tovbis

    (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, 31 Krasnoyarskii Rabochii Prospekt, Krasnoyarsk 660037, Russia)

  • Vladimir Krutikov

    (Department of Applied Mathematics, Kemerovo State University, 6 Krasnaya Street, Kemerovo 650043, Russia
    Faculty of Sciences and Mathematics, University of Nis, 18000 Nis, Serbia)

  • Predrag Stanimirović

    (Faculty of Sciences and Mathematics, University of Nis, 18000 Nis, Serbia
    Laboratory “Hybrid Methods of Modeling and Optimization in Complex Systems”, Siberian Federal University, 79 Svobodny Prospekt, Krasnoyarsk 660041, Russia)

  • Vladimir Meshechkin

    (Department of Applied Mathematics, Kemerovo State University, 6 Krasnaya Street, Kemerovo 650043, Russia)

  • Aleksey Popov

    (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, 31 Krasnoyarskii Rabochii Prospekt, Krasnoyarsk 660037, Russia)

  • Lev Kazakovtsev

    (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, 31 Krasnoyarskii Rabochii Prospekt, Krasnoyarsk 660037, Russia
    Faculty of Sciences and Mathematics, University of Nis, 18000 Nis, Serbia)

Abstract

For solving non-smooth multidimensional optimization problems, we present a family of relaxation subgradient methods (RSMs) with a built-in algorithm for finding a descent direction: a vector that forms an acute angle with all subgradients in a neighborhood of the current minimum. Minimizing the function along this vector taken with a minus sign enables the algorithm to move beyond the neighborhood of the current minimum. The family of direction-finding algorithms is based on solving systems of inequalities, and their finite convergence on separable bounded sets is proved. These inequality-solving algorithms are then used to build the RSM family. On quadratic functions, the methods of the RSM family are equivalent to the conjugate gradient method (CGM). The methods are intended for high-dimensional problems and are studied both theoretically and numerically. Examples of solving convex and non-convex, smooth and non-smooth problems of large dimension are given.
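
The sketch below illustrates the direction-finding step described above. It is a minimal sketch under assumptions: the Kaczmarz-style correction and all names in it are ours for illustration, not the authors' exact algorithm. It looks for a vector s satisfying the system of inequalities <s, g> >= 1 for every subgradient g collected near the current point; such an s forms an acute angle with each subgradient, and -s can then serve as the descent direction.

    import numpy as np

    def acute_direction(subgradients, sweeps=200, tol=1e-10):
        # Find s with <s, g> >= 1 for every row g of `subgradients` by sweeping
        # over the inequalities and, whenever one is violated, applying the
        # minimal-norm correction along g (Kaczmarz-style relaxation step).
        G = np.atleast_2d(np.asarray(subgradients, dtype=float))
        s = G[0] / (G[0] @ G[0])            # satisfy the first inequality exactly
        for _ in range(sweeps):
            violated = False
            for g in G:
                r = 1.0 - s @ g             # residual of the inequality <s, g> >= 1
                if r > tol:
                    s = s + (r / (g @ g)) * g
                    violated = True
            if not violated:                # all inequalities hold: acute angles
                break
        return s

    # Usage on f(x) = |x1| + |x2|: subgradients observed near the origin.
    gs = [np.array([1.0,  1.0]),            # where x1 > 0, x2 > 0
          np.array([1.0, -1.0]),            # where x1 > 0, x2 < 0
          np.array([1.0,  0.5])]
    s = acute_direction(gs)
    print(s, [float(s @ g) for g in gs])    # every inner product is >= 1

Here -s (suitably scaled) would be used as the step direction; as the abstract states, on quadratic functions the RSM family built from such inequality-solving algorithms behaves like the conjugate gradient method.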

Suggested Citation

  • Elena Tovbis & Vladimir Krutikov & Predrag Stanimirović & Vladimir Meshechkin & Aleksey Popov & Lev Kazakovtsev, 2023. "A Family of Multi-Step Subgradient Minimization Methods," Mathematics, MDPI, vol. 11(10), pages 1-24, May.
  • Handle: RePEc:gam:jmathe:v:11:y:2023:i:10:p:2264-:d:1145064

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/11/10/2264/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/11/10/2264/
    Download Restriction: no

    References listed on IDEAS

    1. Igor Konnov, 2020. "A Non-monotone Conjugate Subgradient Type Method for Minimization of Convex Functions," Journal of Optimization Theory and Applications, Springer, vol. 184(2), pages 534-546, February.
    2. Yutao Zheng & Bing Zheng, 2017. "Two New Dai–Liao-Type Conjugate Gradient Methods for Unconstrained Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 175(2), pages 502-509, November.
    3. M. V. Solodov & S. K. Zavriev, 1998. "Error Stability Properties of Generalized Gradient-Type Algorithms," Journal of Optimization Theory and Applications, Springer, vol. 98(3), pages 663-680, September.
    4. Junyu Lu & Yong Li & Hongtruong Pham, 2020. "A Modified Dai–Liao Conjugate Gradient Method with a New Parameter for Solving Image Restoration Problems," Mathematical Problems in Engineering, Hindawi, vol. 2020, pages 1-13, September.
    5. Auwal Bala Abubakar & Poom Kumam & Hassan Mohammad & Aliyu Muhammed Awwal & Kanokwan Sitthithakerngkiet, 2019. "A Modified Fletcher–Reeves Conjugate Gradient Method for Monotone Nonlinear Equations with Some Applications," Mathematics, MDPI, vol. 7(8), pages 1-25, August.
6. Yurii Nesterov, 2015. "Universal gradient methods for convex optimization problems," LIDAM Reprints CORE 2701, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    7. Y.H. Dai & Y. Yuan, 2001. "An Efficient Hybrid Conjugate Gradient Method for Unconstrained Optimization," Annals of Operations Research, Springer, vol. 103(1), pages 33-47, March.
    8. Vladimir Krutikov & Svetlana Gutova & Elena Tovbis & Lev Kazakovtsev & Eugene Semenkin, 2022. "Relaxation Subgradient Algorithms with Machine Learning Procedures," Mathematics, MDPI, vol. 10(21), pages 1-33, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Predrag S. Stanimirović & Branislav Ivanov & Snežana Djordjević & Ivona Brajević, 2018. "New Hybrid Conjugate Gradient and Broyden–Fletcher–Goldfarb–Shanno Conjugate Gradient Methods," Journal of Optimization Theory and Applications, Springer, vol. 178(3), pages 860-884, September.
2. Regina S. Burachik & Yaohua Hu & Xiaoqi Yang, 2022. "Interior quasi-subgradient method with non-Euclidean distances for constrained quasi-convex optimization problems in Hilbert spaces," Journal of Global Optimization, Springer, vol. 83(2), pages 249-271, June.
3. Larsson, Torbjorn & Patriksson, Michael & Stromberg, Ann-Brith, 2003. "On the convergence of conditional ε-subgradient methods for convex programs and convex-concave saddle-point problems," European Journal of Operational Research, Elsevier, vol. 151(3), pages 461-473, December.
    4. Hiroyuki Sakai & Hideaki Iiduka, 2020. "Hybrid Riemannian conjugate gradient methods with global convergence properties," Computational Optimization and Applications, Springer, vol. 77(3), pages 811-830, December.
    5. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    6. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
    7. Abubakar, Auwal Bala & Kumam, Poom & Ibrahim, Abdulkarim Hassan & Chaipunya, Parin & Rano, Sadiya Ali, 2022. "New hybrid three-term spectral-conjugate gradient method for finding solutions of nonlinear monotone operator equations with applications," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 201(C), pages 670-683.
    8. Serge Gratton & Vincent Malmedy & Philippe Toint, 2012. "Using approximate secant equations in limited memory methods for multilevel unconstrained optimization," Computational Optimization and Applications, Springer, vol. 51(3), pages 967-979, April.
    9. Peng Zhang & Gejun Bao, 2018. "An Incremental Subgradient Method on Riemannian Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 176(3), pages 711-727, March.
10. Kin Keung Lai & Shashi Kant Mishra & Bhagwat Ram & Ravina Sharma, 2023. "A Conjugate Gradient Method: Quantum Spectral Polak–Ribière–Polyak Approach for Unconstrained Optimization Problems," Mathematics, MDPI, vol. 11(23), pages 1-14, December.
    11. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    12. Priester, C. Robert & Melbourne-Thomas, Jessica & Klocker, Andreas & Corney, Stuart, 2017. "Abrupt transitions in dynamics of a NPZD model across Southern Ocean fronts," Ecological Modelling, Elsevier, vol. 359(C), pages 372-382.
    13. Ahmad M. Alshamrani & Adel Fahad Alrasheedi & Khalid Abdulaziz Alnowibet & Salem Mahdi & Ali Wagdy Mohamed, 2022. "A Hybrid Stochastic Deterministic Algorithm for Solving Unconstrained Optimization Problems," Mathematics, MDPI, vol. 10(17), pages 1-26, August.
    14. Xiaoliang Wang & Liping Pang & Qi Wu & Mingkun Zhang, 2021. "An Adaptive Proximal Bundle Method with Inexact Oracles for a Class of Nonconvex and Nonsmooth Composite Optimization," Mathematics, MDPI, vol. 9(8), pages 1-27, April.
    15. Qi Tian & Xiaoliang Wang & Liping Pang & Mingkun Zhang & Fanyun Meng, 2021. "A New Hybrid Three-Term Conjugate Gradient Algorithm for Large-Scale Unconstrained Problems," Mathematics, MDPI, vol. 9(12), pages 1-13, June.
    16. Fedor Stonyakin & Alexander Gasnikov & Pavel Dvurechensky & Alexander Titov & Mohammad Alkousa, 2022. "Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 988-1013, September.
    17. Jinpeng Ma & Qiongling Li, 2016. "Convergence of price processes under two dynamic double auctions," The Journal of Mechanism and Institution Design, Society for the Promotion of Mechanism and Institution Design, University of York, vol. 1(1), pages 1-44, December.
    18. Fedor Stonyakin & Ilya Kuruzov & Boris Polyak, 2023. "Stopping Rules for Gradient Methods for Non-convex Problems with Additive Noise in Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(2), pages 531-551, August.
    19. B. Sellami & Y. Chaib, 2016. "A new family of globally convergent conjugate gradient methods," Annals of Operations Research, Springer, vol. 241(1), pages 497-513, June.
    20. Benjamin Grimmer, 2023. "General Hölder Smooth Convergence Rates Follow from Specialized Rates Assuming Growth Bounds," Journal of Optimization Theory and Applications, Springer, vol. 197(1), pages 51-70, April.
