Printed from https://ideas.repec.org/h/spr/sprchp/978-3-030-34910-3_2.html

Advances in Low-Memory Subgradient Optimization

In: Numerical Nonsmooth Optimization

Authors
  • Pavel E. Dvurechensky

    (Weierstrass Institute for Applied Analysis and Stochastics;
    Institute for Information Transmission Problems RAS)

  • Alexander V. Gasnikov

    (Institute for Information Transmission Problems RAS;
    Moscow Institute of Physics and Technology)

  • Evgeni A. Nurminski

    (Far Eastern Federal University)

  • Fedor S. Stonyakin

    (Moscow Institute of Physics and Technology;
    V.I. Vernadsky Crimean Federal University)

Abstract

This chapter is devoted to blackbox subgradient algorithms with minimal requirements for the storage of the auxiliary results needed to execute them. To provide historical perspective, the survey starts with the original result of Shor, which opened this field with an application to the classical transportation problem. Theoretical complexity bounds for smooth and nonsmooth convex and quasiconvex optimization problems are then briefly presented to introduce the relevant fundamentals of nonsmooth optimization. Special attention is given to the adaptive step size policy, which aims to attain the lowest complexity bounds. Nondifferentiability of the objective function in convex optimization significantly slows down the rate of convergence of subgradient methods compared to the smooth case, but several modern techniques allow solving structured nonsmooth convex optimization problems faster than the theoretical lower complexity bounds dictate. Particular attention is given to the Nesterov smoothing technique, the Nesterov universal approach, and the Legendre (saddle point) representation approach. New results on universal mirror prox algorithms constitute the original part of the survey. To demonstrate the application of nonsmooth convex optimization algorithms to the solution of huge-scale extremal problems, convex optimization problems with nonsmooth functional constraints are considered and two adaptive mirror descent methods are proposed. The first method is of the primal-dual variety and is proved optimal in terms of lower oracle bounds for the class of Lipschitz continuous convex objectives and constraints. The advantages of applying this method to the sparse truss topology design problem are discussed in detail. The second method can be used for solving convex and quasiconvex optimization problems and is also optimal in terms of complexity bounds.
The concluding part of the survey contains important references characterizing recent developments in nonsmooth convex optimization.
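To give a flavor of the adaptive mirror descent schemes for problems with functional constraints mentioned in the abstract, the following is a minimal sketch of a "switching" subgradient method in the Euclidean setting (where mirror descent reduces to a projected subgradient method). It is an illustration of the general idea only, not the chapter's actual algorithm: the function names, the averaging of productive iterates, and the step size rule `eps / ||d||^2` are assumptions made for this sketch, and the test problem is invented for demonstration.

```python
import numpy as np

def switching_subgradient(f_grad, g, g_grad, x0, eps, n_iters, proj):
    """Sketch of an adaptive 'switching' subgradient scheme for
    min f(x) s.t. g(x) <= 0, in the Euclidean (projected subgradient) setting.
    At each iterate the constraint is checked: if it is eps-feasible, a step is
    taken along a subgradient of the objective (a 'productive' step); otherwise
    along a subgradient of the constraint.  Step sizes adapt to the current
    subgradient norm, so no Lipschitz constant is needed in advance."""
    x = np.asarray(x0, dtype=float)
    productive = []                      # eps-feasible iterates, averaged at the end
    for _ in range(n_iters):
        if g(x) <= eps:
            d = f_grad(x)
            productive.append(x.copy())
        else:
            d = g_grad(x)
        h = eps / (d @ d + 1e-16)        # adaptive step size ~ eps / ||d||^2
        x = proj(x - h * d)
    return np.mean(productive, axis=0)   # averaging of productive iterates

# Toy illustration (not from the chapter): minimize f(x) = x1 + x2 over the
# unit disk g(x) = ||x||^2 - 1 <= 0; the optimum is x* = (-1, -1)/sqrt(2).
f_grad = lambda x: np.array([1.0, 1.0])
g = lambda x: x @ x - 1.0
g_grad = lambda x: 2.0 * x
proj = lambda z: np.clip(z, -2.0, 2.0)   # projection onto a bounding box

x_hat = switching_subgradient(f_grad, g, g_grad, np.zeros(2),
                              eps=0.05, n_iters=20000, proj=proj)
```

Since `g` is convex, the average of eps-feasible iterates is itself eps-feasible by Jensen's inequality, which is the standard reason for averaging only the productive steps in such schemes.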

Suggested Citation

  • Pavel E. Dvurechensky & Alexander V. Gasnikov & Evgeni A. Nurminski & Fedor S. Stonyakin, 2020. "Advances in Low-Memory Subgradient Optimization," Springer Books, in: Adil M. Bagirov & Manlio Gaudioso & Napsu Karmitsa & Marko M. Mäkelä & Sona Taheri (ed.), Numerical Nonsmooth Optimization, chapter 0, pages 19-59, Springer.
  • Handle: RePEc:spr:sprchp:978-3-030-34910-3_2
    DOI: 10.1007/978-3-030-34910-3_2
