IDEAS home Printed from https://ideas.repec.org/a/spr/joptap/v193y2022i1d10.1007_s10957-022-02014-1.html

Convergence of Inexact Quasisubgradient Methods with Extrapolation

Author

Listed:
  • Xiaoqi Yang

    (The Hong Kong Polytechnic University)

  • Chenchen Zu

    (The Hong Kong Polytechnic University)

Abstract

In this paper, we investigate an inexact quasisubgradient method with extrapolation for solving a quasiconvex optimization problem with a closed, convex and bounded constraint set. We establish convergence in objective values, iteration complexity and the rate of convergence of the proposed method under the Hölder condition and the weak sharp minima condition. When both the diminishing stepsize and the extrapolation stepsize decay as a power function, we obtain explicit iteration complexities. When the diminishing stepsize decays as a power function and the extrapolation stepsize decreases no slower than a power function, the diminishing stepsize yields a rate of convergence $${\mathcal {O}}\left( \tau ^{k^{s}}\right)$$ ($$s \in (0,1)$$) to an optimal solution or to a ball around the optimal solution set, which is faster than $${\mathcal {O}}\left( {1}/{k^\beta }\right)$$ for every $$\beta >0$$. With a geometrically decreasing extrapolation stepsize, we obtain a linear rate of convergence to a ball around the optimal solution set for both the constant stepsize and the dynamic stepsize. Our numerical tests show that the method with extrapolation is much more efficient than the method without extrapolation in terms of the number of iterations needed to reach an approximate optimal solution.
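The scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the standard form of a projected quasisubgradient step with heavy-ball-style extrapolation, $y^k = x^k + \beta_k(x^k - x^{k-1})$, $x^{k+1} = P_C(y^k - \alpha_k g^k)$, where $g^k$ is a unit-norm quasisubgradient at $y^k$. The function names, schedules and the toy problem are all illustrative choices.

```python
import numpy as np

def quasisubgradient_extrapolation(f, quasi_subgrad, project, x0,
                                   step, extrap, n_iters=200):
    """Sketch of a quasisubgradient method with extrapolation.

    f             : objective (used only to track the best value found)
    quasi_subgrad : returns a unit-norm quasisubgradient of f at a point
    project       : Euclidean projection onto the constraint set C
    step, extrap  : stepsize schedules alpha_k and beta_k (callables of k)
    """
    x_prev = x = project(np.asarray(x0, dtype=float))
    best = f(x)
    for k in range(n_iters):
        y = x + extrap(k) * (x - x_prev)          # extrapolation step
        g = quasi_subgrad(y)                      # unit quasisubgradient at y
        x_prev, x = x, project(y - step(k) * g)   # projected update
        best = min(best, f(x))
    return x, best

# Toy example: minimize |x - 2| (quasiconvex) over C = [-5, 5].
f = lambda x: abs(x[0] - 2.0)
g = lambda x: np.array([np.sign(x[0] - 2.0) or 1.0])  # unit subgradient; 1.0 at the kink
project = lambda x: np.clip(x, -5.0, 5.0)
x, best = quasisubgradient_extrapolation(
    f, g, project, x0=[-4.0],
    step=lambda k: 1.0 / (k + 1) ** 0.75,    # diminishing power-function stepsize
    extrap=lambda k: 0.3 * 0.9 ** k)         # geometrically decreasing extrapolation
```

The two schedules mirror the regimes analyzed in the paper: a power-function diminishing stepsize and a geometrically decreasing extrapolation stepsize.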

Suggested Citation

  • Xiaoqi Yang & Chenchen Zu, 2022. "Convergence of Inexact Quasisubgradient Methods with Extrapolation," Journal of Optimization Theory and Applications, Springer, vol. 193(1), pages 676-703, June.
  • Handle: RePEc:spr:joptap:v:193:y:2022:i:1:d:10.1007_s10957-022-02014-1
    DOI: 10.1007/s10957-022-02014-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-022-02014-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-022-02014-1?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Stephen P. Bradley & Sherwood C. Frey, 1974. "Fractional Programming with Homogeneous Functions," Operations Research, INFORMS, vol. 22(2), pages 350-357, April.
    2. Xiaoqiang Cai & Kok-Lay Teo & Xiaoqi Yang & Xun Yu Zhou, 2000. "Portfolio Optimization Under a Minimax Rule," Management Science, INFORMS, vol. 46(7), pages 957-972, July.
    3. Yaohua Hu & Jiawen Li & Carisa Kwok Wai Yu, 2020. "Convergence rates of subgradient methods for quasi-convex optimization problems," Computational Optimization and Applications, Springer, vol. 77(1), pages 183-212, September.
    4. Zhongming Wu & Min Li, 2019. "General inertial proximal gradient method for a class of nonconvex nonsmooth optimization problems," Computational Optimization and Applications, Springer, vol. 73(1), pages 129-158, May.
    5. M. Marques Alves & Jonathan Eckstein & Marina Geremia & Jefferson G. Melo, 2020. "Relative-error inertial-relaxed inexact versions of Douglas-Rachford and ADMM splitting algorithms," Computational Optimization and Applications, Springer, vol. 75(2), pages 389-422, March.
    6. A. Auslender & M. Teboulle, 2004. "Interior Gradient and Epsilon-Subgradient Descent Methods for Constrained Convex Minimization," Mathematics of Operations Research, INFORMS, vol. 29(1), pages 1-26, February.
    7. X. X. Huang & X. Q. Yang, 2003. "A Unified Augmented Lagrangian Approach to Duality and Exact Penalization," Mathematics of Operations Research, INFORMS, vol. 28(3), pages 533-552, August.
    8. Boţ, Radu Ioan & Csetnek, Ernö Robert & Hendrich, Christopher, 2015. "Inertial Douglas–Rachford splitting for monotone inclusion problems," Applied Mathematics and Computation, Elsevier, vol. 256(C), pages 472-487.
    9. Nils Langenberg & Rainer Tichatschke, 2012. "Interior proximal methods for quasiconvex optimization," Journal of Global Optimization, Springer, vol. 52(3), pages 641-661, March.
    10. Patrick R. Johnstone & Pierre Moulin, 2017. "Local and global convergence of a general inertial proximal splitting scheme for minimizing composite functions," Computational Optimization and Applications, Springer, vol. 67(2), pages 259-292, June.
    11. Nesterov, Yu., 2005. "Smooth minimization of non-smooth functions," LIDAM Reprints CORE 1819, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Hu, Yaohua & Li, Gongnong & Yu, Carisa Kwok Wai & Yip, Tsz Leung, 2022. "Quasi-convex feasibility problems: Subgradient methods and convergence rates," European Journal of Operational Research, Elsevier, vol. 298(1), pages 45-58.
    2. Szilárd Csaba László, 2023. "A Forward–Backward Algorithm With Different Inertial Terms for Structured Non-Convex Minimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 198(1), pages 387-427, July.
    3. Yaohua Hu & Carisa Kwok Wai Yu & Xiaoqi Yang, 2019. "Incremental quasi-subgradient methods for minimizing the sum of quasi-convex functions," Journal of Global Optimization, Springer, vol. 75(4), pages 1003-1028, December.
    4. Zhongming Wu & Min Li, 2019. "General inertial proximal gradient method for a class of nonconvex nonsmooth optimization problems," Computational Optimization and Applications, Springer, vol. 73(1), pages 129-158, May.
    5. E. M. Bednarczuk & A. Jezierska & K. E. Rutkowski, 2018. "Proximal primal–dual best approximation algorithm with memory," Computational Optimization and Applications, Springer, vol. 71(3), pages 767-794, December.
    6. Jamilu Abubakar & Poom Kumam & Abdulkarim Hassan Ibrahim & Anantachai Padcharoen, 2020. "Relaxed Inertial Tseng’s Type Method for Solving the Inclusion Problem with Application to Image Restoration," Mathematics, MDPI, vol. 8(5), pages 1-19, May.
    7. Regina S. Burachik & Yaohua Hu & Xiaoqi Yang, 2022. "Interior quasi-subgradient method with non-Euclidean distances for constrained quasi-convex optimization problems in hilbert spaces," Journal of Global Optimization, Springer, vol. 83(2), pages 249-271, June.
    8. Hu, Yaohua & Yang, Xiaoqi & Sim, Chee-Khian, 2015. "Inexact subgradient methods for quasi-convex optimization problems," European Journal of Operational Research, Elsevier, vol. 240(2), pages 315-327.
    9. Zhongming Wu & Chongshou Li & Min Li & Andrew Lim, 2021. "Inertial proximal gradient methods with Bregman regularization for a class of nonconvex optimization problems," Journal of Global Optimization, Springer, vol. 79(3), pages 617-644, March.
    10. Pawicha Phairatchatniyom & Poom Kumam & Yeol Je Cho & Wachirapong Jirakitpuwapat & Kanokwan Sitthithakerngkiet, 2019. "The Modified Inertial Iterative Algorithm for Solving Split Variational Inclusion Problem for Multi-Valued Quasi Nonexpansive Mappings with Some Applications," Mathematics, MDPI, vol. 7(6), pages 1-22, June.
    11. Chinedu Izuchukwu & Yekini Shehu, 2021. "New Inertial Projection Methods for Solving Multivalued Variational Inequality Problems Beyond Monotonicity," Networks and Spatial Economics, Springer, vol. 21(2), pages 291-323, June.
    12. Q. L. Dong & J. Z. Huang & X. H. Li & Y. J. Cho & Th. M. Rassias, 2019. "MiKM: multi-step inertial Krasnosel’skiǐ–Mann algorithm and its applications," Journal of Global Optimization, Springer, vol. 73(4), pages 801-824, April.
    13. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    14. Taylor, Adrien B. & Hendrickx, Julien M. & Glineur, François, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    15. Dimitris Bertsimas & Nishanth Mundru, 2021. "Sparse Convex Regression," INFORMS Journal on Computing, INFORMS, vol. 33(1), pages 262-279, January.
    16. Alexandre Belloni & Victor Chernozhukov & Lie Wang, 2013. "Pivotal estimation via square-root lasso in nonparametric regression," CeMMAP working papers CWP62/13, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    17. Kaiwen Meng & Xiaoqi Yang, 2015. "First- and Second-Order Necessary Conditions Via Exact Penalty Functions," Journal of Optimization Theory and Applications, Springer, vol. 165(3), pages 720-752, June.
    18. Y. Y. Zhou & X. Q. Yang, 2009. "Duality and Penalization in Optimization via an Augmented Lagrangian Function with Applications," Journal of Optimization Theory and Applications, Springer, vol. 140(1), pages 171-188, January.
    19. Devolder, Olivier & Glineur, François & Nesterov, Yurii, 2013. "First-order methods with inexact oracle: the strongly convex case," LIDAM Discussion Papers CORE 2013016, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    20. David Degras, 2021. "Sparse group fused lasso for model segmentation: a hybrid approach," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 15(3), pages 625-671, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:193:y:2022:i:1:d:10.1007_s10957-022-02014-1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.