
A Chebyshev–Halley Method with Gradient Regularization and an Improved Convergence Rate

Authors

  • Jianyu Xiao (Beijing Institute for Scientific and Engineering Computing, Beijing University of Technology, Beijing 100124, China)

  • Haibin Zhang (Beijing Institute for Scientific and Engineering Computing, Beijing University of Technology, Beijing 100124, China)

  • Huan Gao (College of Mathematics and Computational Science, Hunan First Normal University, Changsha 410205, China)

Abstract

High-order methods are particularly crucial for achieving highly accurate solutions or satisfying high-order optimality conditions. However, most existing high-order methods require solving complex high-order Taylor polynomial models, which pose significant computational challenges. In this paper, we propose a Chebyshev–Halley method with gradient regularization, which retains the convergence advantages of high-order methods while effectively addressing the computational challenge of solving the polynomial model. The proposed method incorporates a quadratic regularization term with an adaptive parameter proportional to a certain power of the gradient norm, thereby ensuring a closed-form solution at each iteration. In theory, the method achieves a global convergence rate of O(k^{-3}) or even O(k^{-5}), attaining the optimal rate of third-order methods without requiring additional acceleration techniques. Moreover, it maintains local superlinear convergence for strongly convex functions. Numerical experiments demonstrate that the proposed method compares favorably with similar methods in terms of efficiency and applicability.
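
The computational point of the abstract — a quadratic regularizer weighted by a power of the gradient norm turns the subproblem into a single linear solve — can be illustrated on the simpler second-order analogue. The sketch below is an illustration under stated assumptions, not the authors' third-order Chebyshev–Halley scheme: it implements the gradient-regularized Newton update x_{k+1} = x_k − (∇²f(x_k) + σ‖∇f(x_k)‖^α I)^{−1} ∇f(x_k), where the function names, the test problem, and the parameter choices sigma and alpha are hypothetical.

    import numpy as np

    def grad_regularized_newton(grad, hess, x0, sigma=1.0, alpha=0.5,
                                tol=1e-10, max_iter=100):
        # Illustrative sketch only: a gradient-regularized Newton method,
        # the second-order relative of the scheme described in the abstract.
        # The adaptive weight lam = sigma * ||g||**alpha is added to the
        # Hessian, so every iteration reduces to one linear solve, i.e.
        # the subproblem has a closed-form solution.
        x = np.asarray(x0, dtype=float)
        for k in range(max_iter):
            g = grad(x)
            gnorm = np.linalg.norm(g)
            if gnorm < tol:
                break
            lam = sigma * gnorm ** alpha            # adaptive parameter
            H = hess(x) + lam * np.eye(x.size)      # regularized Hessian
            x = x - np.linalg.solve(H, g)           # closed-form step
        return x, k

    # Hypothetical test problem: gradient and Hessian of the smooth,
    # strongly convex quartic f(x) = 0.25*sum(x^4) + 0.5*sum(x^2).
    grad = lambda x: x ** 3 + x
    hess = lambda x: np.diag(3.0 * x ** 2 + 1.0)

    x_star, iters = grad_regularized_newton(grad, hess, x0=2.0 * np.ones(5))
    print(iters, np.linalg.norm(grad(x_star)))

Because each step is one linear solve, the per-iteration cost matches a classical Newton step; the paper's third-order method adds tensor information while, per the abstract, keeping the subproblem solvable in closed form.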

Suggested Citation

  • Jianyu Xiao & Haibin Zhang & Huan Gao, 2025. "A Chebyshev–Halley Method with Gradient Regularization and an Improved Convergence Rate," Mathematics, MDPI, vol. 13(8), pages 1-17, April.
  • Handle: RePEc:gam:jmathe:v:13:y:2025:i:8:p:1319-:d:1636950

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/13/8/1319/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/13/8/1319/
    Download Restriction: no

    References listed on IDEAS

    1. Nesterov, Yurii, 2023. "Inexact accelerated high-order proximal-point methods," LIDAM Reprints CORE 3252, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    2. Nicholas I. M. Gould & Tyrone Rees & Jennifer A. Scott, 2019. "Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems," Computational Optimization and Applications, Springer, vol. 73(1), pages 1-35, May.
    3. Jianyu Xiao & Haibin Zhang & Huan Gao, 2023. "An Accelerated Regularized Chebyshev–Halley Method for Unconstrained Optimization," Asia-Pacific Journal of Operational Research (APJOR), World Scientific Publishing Co. Pte. Ltd., vol. 40(04), pages 1-11, August.
    4. Nesterov, Yurii & Polyak, B.T., 2006. "Cubic regularization of Newton method and its global performance," LIDAM Reprints CORE 1927, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    5. Bo Jiang & Haoyue Wang & Shuzhong Zhang, 2021. "An Optimal High-Order Tensor Method for Convex Optimization," Mathematics of Operations Research, INFORMS, vol. 46(4), pages 1390-1412, November.
    6. Yamakawa, Yuya & Yamashita, Nobuo, 2025. "Convergence analysis of a regularized Newton method with generalized regularization terms for unconstrained convex optimization problems," Applied Mathematics and Computation, Elsevier, vol. 491(C).
    7. Kenji Ueda & Nobuo Yamashita, 2014. "A regularized Newton method without line search for unconstrained optimization," Computational Optimization and Applications, Springer, vol. 59(1), pages 321-351, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Xihua Zhu & Jiangze Han & Bo Jiang, 2022. "An adaptive high order method for finding third-order critical points of nonconvex optimization," Journal of Global Optimization, Springer, vol. 84(2), pages 369-392, October.
    2. Silvia Berra & Alessandro Torraca & Federico Benvenuto & Sara Sommariva, 2024. "Combined Newton-Gradient Method for Constrained Root-Finding in Chemical Reaction Networks," Journal of Optimization Theory and Applications, Springer, vol. 200(1), pages 404-427, January.
    3. Ariizumi, Shumpei & Yamakawa, Yuya & Yamashita, Nobuo, 2024. "Convergence properties of Levenberg–Marquardt methods with generalized regularization terms," Applied Mathematics and Computation, Elsevier, vol. 463(C).
    4. Seonho Park & Seung Hyun Jung & Panos M. Pardalos, 2020. "Combining Stochastic Adaptive Cubic Regularization with Negative Curvature for Nonconvex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 184(3), pages 953-971, March.
    5. Weiwei Kong & Jefferson G. Melo & Renato D. C. Monteiro, 2020. "An efficient adaptive accelerated inexact proximal point method for solving linearly constrained nonconvex composite problems," Computational Optimization and Applications, Springer, vol. 76(2), pages 305-346, June.
    6. Chuan He & Heng Huang & Zhaosong Lu, 2024. "A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization," Computational Optimization and Applications, Springer, vol. 89(3), pages 843-894, December.
    7. Geovani Nunes Grapiglia & Jinyun Yuan & Ya-xiang Yuan, 2016. "Nonlinear Stepsize Control Algorithms: Complexity Bounds for First- and Second-Order Optimality," Journal of Optimization Theory and Applications, Springer, vol. 171(3), pages 980-997, December.
    8. Yassine Nabou & Ion Necoara, 2024. "Efficiency of higher-order algorithms for minimizing composite functions," Computational Optimization and Applications, Springer, vol. 87(2), pages 441-473, March.
    9. Kenji Ueda & Nobuo Yamashita, 2012. "Global Complexity Bound Analysis of the Levenberg–Marquardt Method for Nonsmooth Equations and Its Application to the Nonlinear Complementarity Problem," Journal of Optimization Theory and Applications, Springer, vol. 152(2), pages 450-467, February.
    10. Yamakawa, Yuya & Yamashita, Nobuo, 2025. "Convergence analysis of a regularized Newton method with generalized regularization terms for unconstrained convex optimization problems," Applied Mathematics and Computation, Elsevier, vol. 491(C).
    11. Tuyen Trung Truong & Tat Dat To & Hang-Tuan Nguyen & Thu Hang Nguyen & Hoang Phuong Nguyen & Maged Helmy, 2023. "A Fast and Simple Modification of Newton’s Method Avoiding Saddle Points," Journal of Optimization Theory and Applications, Springer, vol. 199(2), pages 805-830, November.
    12. Kenji Ueda & Nobuo Yamashita, 2014. "A regularized Newton method without line search for unconstrained optimization," Computational Optimization and Applications, Springer, vol. 59(1), pages 321-351, October.
    13. Liaoyuan Zeng & Ting Kei Pong, 2022. "ρ-regularization subproblems: strong duality and an eigensolver-based algorithm," Computational Optimization and Applications, Springer, vol. 81(2), pages 337-368, March.
    14. Paul Armand & Ngoc Nguyen Tran, 2021. "Local Convergence Analysis of a Primal–Dual Method for Bound-Constrained Optimization Without SOSC," Journal of Optimization Theory and Applications, Springer, vol. 189(1), pages 96-116, April.
    15. J. M. Martínez & L. T. Santos, 2022. "On large-scale unconstrained optimization and arbitrary regularization," Computational Optimization and Applications, Springer, vol. 81(1), pages 1-30, January.
    16. A. L. Custódio & R. Garmanjani & M. Raydan, 2024. "Derivative-free separable quadratic modeling and cubic regularization for unconstrained optimization," 4OR, Springer, vol. 22(1), pages 121-144, March.
    17. Yuning Jiang & Dimitris Kouzoupis & Haoyu Yin & Moritz Diehl & Boris Houska, 2021. "Decentralized Optimization Over Tree Graphs," Journal of Optimization Theory and Applications, Springer, vol. 189(2), pages 384-407, May.
    18. Nesterov, Yurii, 2022. "Quartic Regularity," LIDAM Discussion Papers CORE 2022001, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    19. Fedor Stonyakin & Ilya Kuruzov & Boris Polyak, 2023. "Stopping Rules for Gradient Methods for Non-convex Problems with Additive Noise in Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(2), pages 531-551, August.
    20. Simeon vom Dahl & Christian Kanzow, 2024. "An inexact regularized proximal Newton method without line search," Computational Optimization and Applications, Springer, vol. 89(3), pages 585-624, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:13:y:2025:i:8:p:1319-:d:1636950. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.