
Nonsmooth projection-free optimization with functional constraints

Author

Listed:
  • Kamiar Asgari (University of Southern California)
  • Michael J. Neely (University of Southern California)
Abstract

This paper presents a subgradient-based algorithm for constrained nonsmooth convex optimization that does not require projections onto the feasible set. While the well-established Frank–Wolfe algorithm and its variants already avoid projections, they are primarily designed for smooth objective functions. In contrast, our proposed algorithm can handle nonsmooth problems with general convex functional inequality constraints. It achieves an $\epsilon$-suboptimal solution in $\mathcal{O}(\epsilon^{-2})$ iterations, with each iteration requiring only a single (potentially inexact) Linear Minimization Oracle call and a (possibly inexact) subgradient computation. This performance is consistent with existing lower bounds. Similar performance is observed when deterministic subgradients are replaced with stochastic subgradients. In the special case where there are no functional inequality constraints, our algorithm competes favorably with a recent nonsmooth projection-free method designed for constraint-free problems. Our approach utilizes a simple separation scheme in conjunction with a new Lagrange multiplier update rule.
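
To make the setup concrete, below is a minimal Python sketch of a generic projection-free primal-dual subgradient scheme in the spirit described by the abstract: each iteration uses one Linear Minimization Oracle (LMO) call and one subgradient of the Lagrangian, and a nonnegative multiplier tracks the functional constraint g(x) <= 0. The step size, the multiplier update, and the l1-ball oracle `lmo_l1_ball` are illustrative assumptions, not the paper's actual separation scheme or Lagrange multiplier rule.

```python
# Hypothetical sketch of a projection-free (LMO-based) primal-dual subgradient
# method for min f(x) s.t. g(x) <= 0, x in X. Illustrative only; the step size
# and multiplier update are NOT the rules proposed in the paper.

import numpy as np

def lmo_l1_ball(direction, radius=1.0):
    """LMO for the l1 ball X = {x : ||x||_1 <= radius}: argmin_{s in X} <direction, s>."""
    i = np.argmax(np.abs(direction))
    s = np.zeros_like(direction)
    s[i] = -radius * np.sign(direction[i])
    return s

def projection_free_subgradient(f_subgrad, g, g_subgrad, x0, lmo, T=2000, rho=0.05):
    """Generic projection-free primal-dual subgradient sketch.

    f_subgrad(x): a subgradient of the (possibly nonsmooth) objective f at x
    g(x), g_subgrad(x): constraint value and a subgradient of g at x (g(x) <= 0)
    lmo(d): linear minimization oracle over the compact convex set X
    """
    x = x0.copy()
    lam = 0.0                       # Lagrange multiplier for g(x) <= 0
    x_avg = np.zeros_like(x0)
    for t in range(1, T + 1):
        # Subgradient of the Lagrangian f(x) + lam * g(x) at the current point.
        d = f_subgrad(x) + lam * g_subgrad(x)
        # One LMO call per iteration replaces the projection step.
        s = lmo(d)
        gamma = 2.0 / (t + 2)       # diminishing step size (illustrative choice)
        x = (1 - gamma) * x + gamma * s
        # Dual ascent on the multiplier, kept nonnegative (illustrative update).
        lam = max(0.0, lam + rho * g(x))
        x_avg += x
    return x_avg / T                # averaged iterate

if __name__ == "__main__":
    # Toy example: minimize the nonsmooth f(x) = ||x - c||_1 over the l1 ball,
    # subject to the linear functional constraint a.x - b <= 0.
    rng = np.random.default_rng(0)
    n = 20
    c = rng.normal(size=n)
    a = rng.normal(size=n)
    b = 0.1
    f_subgrad = lambda x: np.sign(x - c)
    g = lambda x: a @ x - b
    g_subgrad = lambda x: a
    x_hat = projection_free_subgradient(f_subgrad, g, g_subgrad,
                                        x0=np.zeros(n), lmo=lmo_l1_ball)
    print("objective:", np.abs(x_hat - c).sum(), "constraint value:", g(x_hat))
```

The sketch only mirrors the oracle budget the abstract describes (one LMO call and one subgradient per iteration); the paper's contribution lies in the specific separation scheme and multiplier update that yield the $\mathcal{O}(\epsilon^{-2})$ guarantee under functional constraints.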

Suggested Citation

  • Kamiar Asgari & Michael J. Neely, 2024. "Nonsmooth projection-free optimization with functional constraints," Computational Optimization and Applications, Springer, vol. 89(3), pages 927-975, December.
  • Handle: RePEc:spr:coopap:v:89:y:2024:i:3:d:10.1007_s10589-024-00607-2
    DOI: 10.1007/s10589-024-00607-2

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-024-00607-2
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10589-024-00607-2?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Yurii Nesterov, 2018. "Complexity bounds for primal-dual methods minimizing the model of objective function," LIDAM Reprints CORE 2992, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    2. Lemaréchal, C. & Nemirovskii, A. & Nesterov, Y., 1995. "New variants of bundle methods," LIDAM Reprints CORE 1166, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    3. Kun Chen & Hongbo Dong & Kung-Sik Chan, 2013. "Reduced rank regression via adaptive nuclear norm penalization," Biometrika, Biometrika Trust, vol. 100(4), pages 901-920.
    4. Guanghui Lan & Zhiqiang Zhou, 2020. "Algorithms for stochastic optimization with function or expectation constraints," Computational Optimization and Applications, Springer, vol. 76(2), pages 461-498, June.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Nikhil Devanathan & Stephen Boyd, 2024. "Polyak Minorant Method for Convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 203(3), pages 2263-2282, December.
    2. Chen, Canyi & Xu, Wangli & Zhu, Liping, 2022. "Distributed estimation in heterogeneous reduced rank regression: With application to order determination in sufficient dimension reduction," Journal of Multivariate Analysis, Elsevier, vol. 190(C).
    3. S. Yaser Samadi & Wiranthe B. Herath, 2023. "Reduced-rank Envelope Vector Autoregressive Models," Papers 2309.12902, arXiv.org.
    4. Z. R. Gabidullina, 2019. "Adaptive Conditional Gradient Method," Journal of Optimization Theory and Applications, Springer, vol. 183(3), pages 1077-1098, December.
    5. Luo, Chongliang & Liang, Jian & Li, Gen & Wang, Fei & Zhang, Changshui & Dey, Dipak K. & Chen, Kun, 2018. "Leveraging mixed and incomplete outcomes via reduced-rank modeling," Journal of Multivariate Analysis, Elsevier, vol. 167(C), pages 378-394.
    6. Feng, Sanying & Lian, Heng & Zhu, Fukang, 2016. "Reduced rank regression with possibly non-smooth criterion functions: An empirical likelihood approach," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 139-150.
    7. Manlio Gaudioso & Giovanni Giallombardo & Giovanna Miglionico, 2020. "Essentials of numerical nonsmooth optimization," 4OR, Springer, vol. 18(1), pages 1-47, March.
    8. Wim van Ackooij & Welington de Oliveira & Yongjia Song, 2018. "Adaptive Partition-Based Level Decomposition Methods for Solving Two-Stage Stochastic Programs with Fixed Recourse," INFORMS Journal on Computing, INFORMS, vol. 30(1), pages 57-70, February.
    9. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
    10. Welington Oliveira, 2019. "Proximal bundle methods for nonsmooth DC programming," Journal of Global Optimization, Springer, vol. 75(2), pages 523-563, October.
    11. Mishra, Aditya & Dey, Dipak K. & Chen, Yong & Chen, Kun, 2021. "Generalized co-sparse factor regression," Computational Statistics & Data Analysis, Elsevier, vol. 157(C).
    12. Liwei Zhang & Yule Zhang & Jia Wu & Xiantao Xiao, 2022. "Solving Stochastic Optimization with Expectation Constraints Efficiently by a Stochastic Augmented Lagrangian-Type Algorithm," INFORMS Journal on Computing, INFORMS, vol. 34(6), pages 2989-3006, November.
    13. David E. Bernal & Zedong Peng & Jan Kronqvist & Ignacio E. Grossmann, 2022. "Alternative regularizations for Outer-Approximation algorithms for convex MINLP," Journal of Global Optimization, Springer, vol. 84(4), pages 807-842, December.
    14. Yunmei Chen & Xiaojing Ye & Wei Zhang, 2020. "Acceleration techniques for level bundle methods in weakly smooth convex constrained optimization," Computational Optimization and Applications, Springer, vol. 77(2), pages 411-432, November.
    15. Shapiro, Alexander, 2021. "Tutorial on risk neutral, distributionally robust and risk averse multistage stochastic programming," European Journal of Operational Research, Elsevier, vol. 288(1), pages 1-13.
    16. Ya-Feng Liu & Xin Liu & Shiqian Ma, 2019. "On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming," Mathematics of Operations Research, INFORMS, vol. 44(2), pages 632-650, May.
    17. Vincent Guigues & Renato D. C. Monteiro, 2021. "Stochastic Dynamic Cutting Plane for Multistage Stochastic Convex Programs," Journal of Optimization Theory and Applications, Springer, vol. 189(2), pages 513-559, May.
    18. Butyn, Emerson & Karas, Elizabeth W. & de Oliveira, Welington, 2022. "A derivative-free trust-region algorithm with copula-based models for probability maximization problems," European Journal of Operational Research, Elsevier, vol. 298(1), pages 59-75.
    19. Yang, Yaohong & Zhao, Weihua & Wang, Lei, 2023. "Online regularized matrix regression with streaming data," Computational Statistics & Data Analysis, Elsevier, vol. 187(C).
    20. Wim Ackooij & Welington Oliveira & Yongjia Song, 2019. "On level regularization with normal solutions in decomposition methods for multistage stochastic programming problems," Computational Optimization and Applications, Springer, vol. 74(1), pages 1-42, September.

