Printed from https://ideas.repec.org/a/spr/joptap/v197y2023i2d10.1007_s10957-023-02183-7.html

Memoryless Quasi-Newton Methods Based on the Spectral-Scaling Broyden Family for Riemannian Optimization

Authors

Listed:
  • Yasushi Narushima

    (Keio University)

  • Shummin Nakayama

    (The University of Electro-Communications)

  • Masashi Takemura

    (Lattice Technology Co., Ltd.)

  • Hiroshi Yabe

    (Tokyo University of Science)

Abstract

We consider iterative methods for unconstrained optimization on Riemannian manifolds. Although memoryless quasi-Newton methods are effective for large-scale unconstrained optimization in Euclidean space, they have not been studied on Riemannian manifolds. In this paper, we therefore propose a memoryless quasi-Newton method on Riemannian manifolds. The proposed method is based on the spectral-scaling Broyden family, with additional modifications to ensure the sufficient descent condition. We present an algorithm for the proposed method that uses the Wolfe line search conditions and show that it is globally convergent. We emphasize that global convergence is guaranteed without any assumptions on the convexity of the objective function or the isometry of the vector transport. In addition, we derive appropriate choices of the parameter vector contained in the proposed method. Numerical experiments on typical test problems compare the proposed method with conventional conjugate gradient methods. The results show that the proposed method outperforms the tested conjugate gradient methods.
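As background for the abstract above, here is a minimal Euclidean sketch of the memoryless search direction on which such methods are built, using the BFGS member of the Broyden family for concreteness (after Shanno, 1978, and Cheng & Li, 2010, both listed in the references below). The function names and test values are illustrative, not from the paper; the paper's actual method replaces Euclidean vectors with tangent vectors moved by a vector transport and adds further modifications to guarantee sufficient descent.

```python
# Hypothetical Euclidean sketch of a memoryless spectral-scaled BFGS
# search direction. Only inner products are needed, so no n-by-n matrix
# is ever stored -- the "memoryless" property that makes these methods
# attractive for large-scale problems.

def dot(u, v):
    """Euclidean inner product of two vectors given as lists."""
    return sum(a * b for a, b in zip(u, v))

def memoryless_sbfgs_direction(g, s, y, gamma):
    """Return d = -H g, where H is the memoryless BFGS inverse-Hessian
    approximation built from gamma*I and the single pair (s, y):

        H = gamma*I - gamma*(s y^T + y s^T)/(s^T y)
            + (1 + gamma*(y^T y)/(s^T y)) * (s s^T)/(s^T y)

    Requires the curvature condition s^T y > 0, which the Wolfe line
    search guarantees; then H is positive definite and d is a descent
    direction (g^T d < 0)."""
    sy = dot(s, y)
    assert sy > 0, "curvature condition s^T y > 0 violated"
    sg, yg, yy = dot(s, g), dot(y, g), dot(y, y)
    coef_s = gamma * yg / sy - (1.0 + gamma * yy / sy) * sg / sy
    coef_y = gamma * sg / sy
    return [-gamma * gi + coef_s * si + coef_y * yi
            for gi, si, yi in zip(g, s, y)]
```

Because H satisfies the secant equation H y = s and is positive definite whenever s^T y > 0 and gamma > 0, the resulting direction is always a descent direction; the Wolfe line search enforces the curvature condition at every iteration.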

Suggested Citation

  • Yasushi Narushima & Shummin Nakayama & Masashi Takemura & Hiroshi Yabe, 2023. "Memoryless Quasi-Newton Methods Based on the Spectral-Scaling Broyden Family for Riemannian Optimization," Journal of Optimization Theory and Applications, Springer, vol. 197(2), pages 639-664, May.
  • Handle: RePEc:spr:joptap:v:197:y:2023:i:2:d:10.1007_s10957-023-02183-7
    DOI: 10.1007/s10957-023-02183-7

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-023-02183-7
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-023-02183-7?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Kaori Sugiki & Yasushi Narushima & Hiroshi Yabe, 2012. "Globally Convergent Three-Term Conjugate Gradient Methods that Use Secant Conditions and Generate Descent Search Directions for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 153(3), pages 733-757, June.
    2. David F. Shanno, 1978. "Conjugate Gradient Methods with Inexact Searches," Mathematics of Operations Research, INFORMS, vol. 3(3), pages 244-256, August.
    3. Hiroyuki Sakai & Hideaki Iiduka, 2020. "Hybrid Riemannian conjugate gradient methods with global convergence properties," Computational Optimization and Applications, Springer, vol. 77(3), pages 811-830, December.
    4. Hiroyuki Sato, 2016. "A Dai–Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions," Computational Optimization and Applications, Springer, vol. 64(1), pages 101-118, May.
    5. Xiaojing Zhu & Hiroyuki Sato, 2020. "Riemannian conjugate gradient methods with inverse retraction," Computational Optimization and Applications, Springer, vol. 77(3), pages 779-810, December.
    6. Mehiddin Al-Baali & Yasushi Narushima & Hiroshi Yabe, 2015. "A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization," Computational Optimization and Applications, Springer, vol. 60(1), pages 89-110, January.
    7. Cook, R. Dennis & Forzani, Liliana, 2009. "Likelihood-Based Sufficient Dimension Reduction," Journal of the American Statistical Association, American Statistical Association, vol. 104(485), pages 197-208.
    8. Wenyu Sun & Ya-Xiang Yuan, 2006. "Optimization Theory and Methods," Springer Optimization and Its Applications, Springer, number 978-0-387-24976-6, September.
    9. W. Y. Cheng & D. H. Li, 2010. "Spectral Scaling BFGS Method," Journal of Optimization Theory and Applications, Springer, vol. 146(2), pages 305-319, August.
    10. Yuya Yamakawa & Hiroyuki Sato, 2022. "Sequential optimality conditions for nonlinear optimization on Riemannian manifolds and a globally convergent augmented Lagrangian method," Computational Optimization and Applications, Springer, vol. 81(2), pages 397-421, March.
    11. Hiroyuki Sakai & Hideaki Iiduka, 2021. "Sufficient Descent Riemannian Conjugate Gradient Methods," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 130-150, July.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sakai, Hiroyuki & Sato, Hiroyuki & Iiduka, Hideaki, 2023. "Global convergence of Hager–Zhang type Riemannian conjugate gradient method," Applied Mathematics and Computation, Elsevier, vol. 441(C).
    2. Hiroyuki Sato, 2023. "Riemannian optimization on unit sphere with p-norm and its applications," Computational Optimization and Applications, Springer, vol. 85(3), pages 897-935, July.
    3. Shummin Nakayama & Yasushi Narushima & Hiroshi Yabe, 2021. "Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions," Computational Optimization and Applications, Springer, vol. 79(1), pages 127-154, May.
    4. Babaie-Kafaki, Saman & Ghanbari, Reza, 2014. "The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices," European Journal of Operational Research, Elsevier, vol. 234(3), pages 625-630.
    5. Xiaojing Zhu & Hiroyuki Sato, 2020. "Riemannian conjugate gradient methods with inverse retraction," Computational Optimization and Applications, Springer, vol. 77(3), pages 779-810, December.
    6. XiaoLiang Dong & Deren Han & Zhifeng Dai & Lixiang Li & Jianguang Zhu, 2018. "An Accelerated Three-Term Conjugate Gradient Method with Sufficient Descent Condition and Conjugacy Condition," Journal of Optimization Theory and Applications, Springer, vol. 179(3), pages 944-961, December.
    7. Auwal Bala Abubakar & Poom Kumam & Aliyu Muhammed Awwal & Phatiphat Thounthong, 2019. "A Modified Self-Adaptive Conjugate Gradient Method for Solving Convex Constrained Monotone Nonlinear Equations for Signal Recovery Problems," Mathematics, MDPI, vol. 7(8), pages 1-24, August.
    8. Saman Babaie-Kafaki & Reza Ghanbari, 2016. "Descent Symmetrization of the Dai–Liao Conjugate Gradient Method," Asia-Pacific Journal of Operational Research (APJOR), World Scientific Publishing Co. Pte. Ltd., vol. 33(02), pages 1-10, April.
    9. Bakhtawar Baluch & Zabidin Salleh & Ahmad Alhawarat & U. A. M. Roslan, 2017. "A New Modified Three-Term Conjugate Gradient Method with Sufficient Descent Property and Its Global Convergence," Journal of Mathematics, Hindawi, vol. 2017, pages 1-12, September.
    10. Saman Babaie-Kafaki, 2012. "A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei," Computational Optimization and Applications, Springer, vol. 52(2), pages 409-414, June.
    11. Hiroyuki Sakai & Hideaki Iiduka, 2021. "Sufficient Descent Riemannian Conjugate Gradient Methods," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 130-150, July.
    12. Xinchao Luo & Lixing Zhu & Hongtu Zhu, 2016. "Single‐index varying coefficient model for functional responses," Biometrics, The International Biometric Society, vol. 72(4), pages 1275-1284, December.
    13. Li, Junlan & Wang, Tao, 2021. "Dimension reduction in binary response regression: A joint modeling approach," Computational Statistics & Data Analysis, Elsevier, vol. 156(C).
    14. Saha, Tanay & Rakshit, Suman & Khare, Swanand R., 2023. "Linearly structured quadratic model updating using partial incomplete eigendata," Applied Mathematics and Computation, Elsevier, vol. 446(C).
    15. Scrucca, Luca, 2011. "Model-based SIR for dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 55(11), pages 3010-3026, November.
    16. Zheng, Sanpeng & Feng, Renzhong, 2023. "A variable projection method for the general radial basis function neural network," Applied Mathematics and Computation, Elsevier, vol. 451(C).
    17. Hiroyuki Sakai & Hideaki Iiduka, 2020. "Hybrid Riemannian conjugate gradient methods with global convergence properties," Computational Optimization and Applications, Springer, vol. 77(3), pages 811-830, December.
    18. Hai-Jun Wang & Qin Ni, 2010. "A Convex Approximation Method For Large Scale Linear Inequality Constrained Minimization," Asia-Pacific Journal of Operational Research (APJOR), World Scientific Publishing Co. Pte. Ltd., vol. 27(01), pages 85-101.
    19. Chen, Liang, 2016. "A high-order modified Levenberg–Marquardt method for systems of nonlinear equations with fourth-order convergence," Applied Mathematics and Computation, Elsevier, vol. 285(C), pages 79-93.
    20. Ji, Li-Qun, 2015. "An assessment of agricultural residue resources for liquid biofuel production in China," Renewable and Sustainable Energy Reviews, Elsevier, vol. 44(C), pages 561-575.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:197:y:2023:i:2:d:10.1007_s10957-023-02183-7. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.