
A semismooth Newton based dual proximal point algorithm for maximum eigenvalue problem

Author

Listed:
  • Yong-Jin Liu

    (Fuzhou University)

  • Jing Yu

    (Fuzhou University)

Abstract

The maximum eigenvalue problem is to minimize the maximum eigenvalue function over an affine subspace in a symmetric matrix space; it has many applications in areas such as combinatorial optimization, control theory and structural design. Based on the classical analysis of the proximal point algorithm (Ppa) and the semismooth analysis of nonseparable spectral operators, we propose an efficient semismooth Newton based dual proximal point (Ssndppa) algorithm to solve the maximum eigenvalue problem, in which an inexact semismooth Newton (Ssn) algorithm is applied to solve the inner subproblems of the dual proximal point (d-Ppa) algorithm. Global convergence and locally asymptotically superlinear convergence of the d-Ppa algorithm are established under very mild conditions, and fast superlinear or even quadratic convergence of the Ssn algorithm is obtained when the primal constraint nondegeneracy condition holds for the inner subproblem. The computational cost of the Ssn algorithm for solving the inner subproblem can be reduced by fully exploiting the low-rank or high-rank property of a matrix. Numerical experiments on max-cut problems and randomly generated maximum eigenvalue optimization problems demonstrate that the Ssndppa algorithm substantially outperforms the Sdpnal+ solver and several state-of-the-art first-order algorithms.
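
For context, a plausible reading of the problem class (an assumption based on the abstract, not a verbatim statement from the paper) is: minimize lambda_max(X) over all symmetric n x n matrices X satisfying A(X) = b, where A is a linear map into R^m and b is a given vector. A key computational primitive in proximal schemes for this class is the proximal mapping of lambda_max. Because lambda_max is a spectral function, its prox reduces to an eigendecomposition followed by a shift of the eigenvalues determined by a projection onto the unit simplex (via the Moreau decomposition, since the conjugate of the vector max function is the indicator of the unit simplex). The Python sketch below illustrates only this primitive under those assumptions; it is not the authors' Ssndppa implementation, and the names proj_simplex and prox_max_eig are hypothetical.

    import numpy as np

    def proj_simplex(v, z=1.0):
        # Euclidean projection of v onto the simplex {w : w >= 0, sum(w) = z}.
        u = np.sort(v)[::-1]
        css = np.cumsum(u) - z
        rho = np.nonzero(u - css / np.arange(1, len(v) + 1) > 0)[0][-1]
        theta = css[rho] / (rho + 1.0)
        return np.maximum(v - theta, 0.0)

    def prox_max_eig(X, t):
        # Proximal mapping of t * lambda_max at a symmetric matrix X.
        # lambda_max is a spectral function, so the prox acts on the eigenvalues:
        # prox_{t*max}(d) = d - t * proj_simplex(d / t)  (Moreau decomposition).
        d, Q = np.linalg.eigh(X)               # X = Q diag(d) Q^T
        d_new = d - t * proj_simplex(d / t)    # prox of t*max applied to eigenvalues
        return (Q * d_new) @ Q.T               # reassemble Q diag(d_new) Q^T

    # Small usage example on random symmetric data (hypothetical): the prox
    # pulls the largest eigenvalue down while leaving the others essentially unchanged.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((6, 6))
    X = (A + A.T) / 2.0
    P = prox_max_eig(X, t=0.5)
    print(np.linalg.eigvalsh(X)[-1], np.linalg.eigvalsh(P)[-1])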

Suggested Citation

  • Yong-Jin Liu & Jing Yu, 2023. "A semismooth Newton based dual proximal point algorithm for maximum eigenvalue problem," Computational Optimization and Applications, Springer, vol. 85(2), pages 547-582, June.
  • Handle: RePEc:spr:coopap:v:85:y:2023:i:2:d:10.1007_s10589-023-00467-2
    DOI: 10.1007/s10589-023-00467-2

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-023-00467-2
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.



    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yong-Jin Liu & Jing Yu, 2022. "A Semismooth Newton-based Augmented Lagrangian Algorithm for Density Matrix Least Squares Problems," Journal of Optimization Theory and Applications, Springer, vol. 195(3), pages 749-779, December.
    2. Jean-Pierre Crouzeix & Abdelhak Hassouni & Eladio Ocaña, 2023. "A Short Note on the Twice Differentiability of the Marginal Function of a Convex Function," Journal of Optimization Theory and Applications, Springer, vol. 198(2), pages 857-867, August.
    3. Le Thi Khanh Hien & Duy Nhat Phan & Nicolas Gillis, 2022. "Inertial alternating direction method of multipliers for non-convex non-smooth optimization," Computational Optimization and Applications, Springer, vol. 83(1), pages 247-285, September.
    4. Bingsheng He & Li-Zhi Liao & Xiang Wang, 2012. "Proximal-like contraction methods for monotone variational inequalities in a unified framework I: Effective quadruplet and primary methods," Computational Optimization and Applications, Springer, vol. 51(2), pages 649-679, March.
    5. Xiaoming Yuan, 2011. "An improved proximal alternating direction method for monotone variational inequalities with separable structure," Computational Optimization and Applications, Springer, vol. 49(1), pages 17-29, May.
    6. Zhu, Daoli & Marcotte, Patrice, 1995. "Coupling the auxiliary problem principle with descent methods of pseudoconvex programming," European Journal of Operational Research, Elsevier, vol. 83(3), pages 670-685, June.
    7. Guo, Zhaomiao & Fan, Yueyue, 2017. "A Stochastic Multi-Agent Optimization Model for Energy Infrastructure Planning Under Uncertainty and Competition," Institute of Transportation Studies, Working Paper Series qt89s5s8hn, Institute of Transportation Studies, UC Davis.
    8. Dolgopolik, Maksim V., 2021. "The alternating direction method of multipliers for finding the distance between ellipsoids," Applied Mathematics and Computation, Elsevier, vol. 409(C).
    9. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    10. R. S. Burachik & S. Scheimberg & B. F. Svaiter, 2001. "Robustness of the Hybrid Extragradient Proximal-Point Algorithm," Journal of Optimization Theory and Applications, Springer, vol. 111(1), pages 117-136, October.
    11. A. F. Izmailov & M. V. Solodov, 2022. "Perturbed Augmented Lagrangian Method Framework with Applications to Proximal and Smoothed Variants," Journal of Optimization Theory and Applications, Springer, vol. 193(1), pages 491-522, June.
    12. Xin Chen & Houduo Qi & Liqun Qi & Kok-Lay Teo, 2004. "Smooth Convex Approximation to the Maximum Eigenvalue Function," Journal of Global Optimization, Springer, vol. 30(2), pages 253-270, November.
    13. M. Kyono & M. Fukushima, 2000. "Nonlinear Proximal Decomposition Method for Convex Programming," Journal of Optimization Theory and Applications, Springer, vol. 106(2), pages 357-372, August.
    14. Ya-Feng Liu & Xin Liu & Shiqian Ma, 2019. "On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming," Mathematics of Operations Research, INFORMS, vol. 44(2), pages 632-650, May.
    15. J. R. Birge & L. Qi & Z. Wei, 1998. "Convergence Analysis of Some Methods for Minimizing a Nonsmooth Convex Function," Journal of Optimization Theory and Applications, Springer, vol. 97(2), pages 357-383, May.
    16. Bingsheng He & Li-Zhi Liao & Xiang Wang, 2012. "Proximal-like contraction methods for monotone variational inequalities in a unified framework II: general methods and numerical experiments," Computational Optimization and Applications, Springer, vol. 51(2), pages 681-708, March.
    17. Jonathan Eckstein, 2017. "A Simplified Form of Block-Iterative Operator Splitting and an Asynchronous Algorithm Resembling the Multi-Block Alternating Direction Method of Multipliers," Journal of Optimization Theory and Applications, Springer, vol. 173(1), pages 155-182, April.
    18. Eyal Cohen & Nadav Hallak & Marc Teboulle, 2022. "A Dynamic Alternating Direction of Multipliers for Nonconvex Minimization with Nonlinear Functional Equality Constraints," Journal of Optimization Theory and Applications, Springer, vol. 193(1), pages 324-353, June.
    19. N. El Farouq & G. Cohen, 1998. "Progressive Regularization of Variational Inequalities and Decomposition Algorithms," Journal of Optimization Theory and Applications, Springer, vol. 97(2), pages 407-433, May.
    20. Bingsheng He & Min Tao & Xiaoming Yuan, 2017. "Convergence Rate Analysis for the Alternating Direction Method of Multipliers with a Substitution Procedure for Separable Convex Programming," Mathematics of Operations Research, INFORMS, vol. 42(3), pages 662-691, August.
