
Globally Convergent Optimization Algorithms on Riemannian Manifolds: Uniform Framework for Unconstrained and Constrained Optimization

Author

  • Y. Yang (Orbital Sciences Corporation)

Abstract

This paper proposes several globally convergent geometric optimization algorithms on Riemannian manifolds that extend some existing geometric optimization techniques. Since any set of smooth constraints in the Euclidean space R^n (corresponding to constrained optimization) and the space R^n itself (corresponding to unconstrained optimization) are both special Riemannian manifolds, and since these algorithms are developed on general Riemannian manifolds, the techniques discussed in this paper provide a uniform framework for constrained and unconstrained optimization problems. Unlike some earlier works, the new algorithms impose fewer restrictions, both in the convergence results and in practice; for example, global minimization in the one-dimensional search is not required. All the algorithms addressed in this paper are globally convergent. For some special Riemannian manifolds other than R^n, the new algorithms are very efficient. Convergence rates are obtained, and applications are discussed.
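To illustrate the class of methods the abstract describes, the sketch below implements Riemannian steepest descent with an Armijo backtracking line search on the unit sphere, treating the constrained Euclidean problem min x^T A x subject to ||x|| = 1 as an unconstrained problem on a manifold. This is a minimal sketch of the general technique, not the paper's specific algorithms: the quadratic objective, the sphere manifold, and all function and parameter names (sphere_exp, armijo_c, shrink, and so on) are assumptions chosen for the example. The line search only asks for sufficient decrease along the geodesic, echoing the point that global minimization in the one-dimensional search is not required.

```python
# Minimal sketch: Riemannian steepest descent on the unit sphere with an
# Armijo backtracking line search. Minimizing f(x) = x^T A x on the sphere
# approximates the eigenvector of the smallest eigenvalue of A.
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x along tangent v."""
    nv = np.linalg.norm(v)
    if nv < 1e-16:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def riemannian_gradient(x, egrad):
    """Project the Euclidean gradient onto the tangent space of the sphere at x."""
    return egrad - np.dot(egrad, x) * x

def steepest_descent_on_sphere(A, x0, tol=1e-8, max_iter=500,
                               armijo_c=1e-4, shrink=0.5):
    """Minimize f(x) = x^T A x over the unit sphere by geodesic steepest descent."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        f = x @ A @ x
        grad = riemannian_gradient(x, 2.0 * A @ x)   # Euclidean gradient is 2*A*x
        gnorm = np.linalg.norm(grad)
        if gnorm < tol:
            break
        t = 1.0
        while True:
            x_new = sphere_exp(x, -t * grad)
            # Armijo sufficient-decrease test along the geodesic (no global 1-D minimization).
            if x_new @ A @ x_new <= f - armijo_c * t * gnorm**2:
                break
            t *= shrink
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    A = M @ M.T                                      # symmetric positive semidefinite
    x = steepest_descent_on_sphere(A, rng.standard_normal(5))
    print("f(x*) =", x @ A @ x)                      # approx. the smallest eigenvalue of A
```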

Suggested Citation

  • Y. Yang, 2007. "Globally Convergent Optimization Algorithms on Riemannian Manifolds: Uniform Framework for Unconstrained and Constrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 132(2), pages 245-265, February.
  • Handle: RePEc:spr:joptap:v:132:y:2007:i:2:d:10.1007_s10957-006-9081-0
    DOI: 10.1007/s10957-006-9081-0

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-006-9081-0
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-006-9081-0?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. David G. Luenberger, 1972. "The Gradient Projection Method Along Geodesics," Management Science, INFORMS, vol. 18(11), pages 620-631, July.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; you can subscribe to its RSS feed for this item.


    Cited by:

    1. Xiao-bo Li & Nan-jing Huang & Qamrul Hasan Ansari & Jen-Chih Yao, 2019. "Convergence Rate of Descent Method with New Inexact Line-Search on Riemannian Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 180(3), pages 830-854, March.
    2. X. M. Wang & C. Li & J. C. Yao, 2015. "Subgradient Projection Algorithms for Convex Feasibility on Riemannian Manifolds with Lower Bounded Curvatures," Journal of Optimization Theory and Applications, Springer, vol. 164(1), pages 202-217, January.
    3. X. M. Wang & J. H. Wang & C. Li, 2023. "Convergence of Inexact Steepest Descent Algorithm for Multiobjective Optimizations on Riemannian Manifolds Without Curvature Constraints," Journal of Optimization Theory and Applications, Springer, vol. 198(1), pages 187-214, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. João Carlos de O. Souza, 2018. "Proximal Point Methods for Lipschitz Functions on Hadamard Manifolds: Scalar and Vectorial Cases," Journal of Optimization Theory and Applications, Springer, vol. 179(3), pages 745-760, December.
    2. X. M. Wang & C. Li & J. C. Yao, 2015. "Subgradient Projection Algorithms for Convex Feasibility on Riemannian Manifolds with Lower Bounded Curvatures," Journal of Optimization Theory and Applications, Springer, vol. 164(1), pages 202-217, January.
    3. Jingyang Zhou & Kok Teo & Di Zhou & Guohui Zhao, 2012. "Nonlinear optimal feedback control for lunar module soft landing," Journal of Global Optimization, Springer, vol. 52(2), pages 211-227, February.
    4. Glaydston Carvalho Bento & Sandro Dimy Barbosa Bitar & João Xavier Cruz Neto & Paulo Roberto Oliveira & João Carlos Oliveira Souza, 2019. "Computing Riemannian Center of Mass on Hadamard Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 183(3), pages 977-992, December.
    5. Glaydston C. Bento & Orizon P. Ferreira & Jefferson G. Melo, 2017. "Iteration-Complexity of Gradient, Subgradient and Proximal Point Methods on Riemannian Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 173(2), pages 548-562, May.
    6. Jing Wang & Huafei Sun & Simone Fiori, 2019. "Empirical Means on Pseudo-Orthogonal Groups," Mathematics, MDPI, vol. 7(10), pages 1-20, October.
    7. Orizon P. Ferreira & Mauricio S. Louzeiro & Leandro F. Prudente, 2020. "Iteration-Complexity and Asymptotic Analysis of Steepest Descent Method for Multiobjective Optimization on Riemannian Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 184(2), pages 507-533, February.
    8. Fiori, Simone, 2016. "A Riemannian steepest descent approach over the inhomogeneous symplectic group: Application to the averaging of linear optical systems," Applied Mathematics and Computation, Elsevier, vol. 283(C), pages 251-264.
    9. Teles A. Fernandes & Orizon P. Ferreira & Jinyun Yuan, 2017. "On the Superlinear Convergence of Newton’s Method on Riemannian Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 173(3), pages 828-843, June.
    10. P.-A. Absil & I. Oseledets, 2015. "Low-rank retractions: a survey and new results," Computational Optimization and Applications, Springer, vol. 62(1), pages 5-29, September.
    11. Dewei Zhang & Sam Davanloo Tajbakhsh, 2023. "Riemannian Stochastic Variance-Reduced Cubic Regularized Newton Method for Submanifold Optimization," Journal of Optimization Theory and Applications, Springer, vol. 196(1), pages 324-361, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:132:y:2007:i:2:d:10.1007_s10957-006-9081-0. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references, in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.