
An Efficient Gradient Method with Approximately Optimal Stepsize Based on Tensor Model for Unconstrained Optimization

Authors

Listed:
  • Zexian Liu

    (Xidian University
    Hezhou University)

  • Hongwei Liu

    (Xidian University)

Abstract

A new type of stepsize, recently introduced by Liu et al. (Optimization 67(3):427–440, 2018), is called the approximately optimal stepsize and is very efficient for gradient methods. Interestingly, all gradient methods can be regarded as gradient methods with approximately optimal stepsizes. In this paper, we present an efficient gradient method with approximately optimal stepsize based on a tensor model for unconstrained optimization. In the proposed method, if the objective function is neither close to a minimizer nor well approximated by a quadratic function on the line segment between the current and latest iterates, a tensor model is exploited to generate the approximately optimal stepsize for the gradient method. Otherwise, quadratic approximation models are constructed to generate the approximately optimal stepsizes. The global convergence of the proposed method is established under weak conditions. Numerical results indicate that the proposed method is very promising.
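The stepsize idea in the abstract can be made concrete with a small sketch. The Python code below is not the paper's algorithm: it implements only the quadratic-model case, using the Barzilai–Borwein stepsize s^T s / s^T y, which the approximately-optimal-stepsize framework interprets as the minimizer of a quadratic approximation model along the steepest-descent direction, and it omits the paper's tensor-model branch entirely; the function name, tolerances, and safeguard constants are illustrative assumptions.

    import numpy as np

    def approx_optimal_gradient(grad, x0, tol=1e-6, max_iter=1000):
        # Steepest descent in which each stepsize approximately minimizes
        # a quadratic model of f along -g_k, with curvature estimated from
        # the secant pair (s, y) = (x_k - x_{k-1}, g_k - g_{k-1}).
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        alpha = 1.0 / max(np.linalg.norm(g), 1.0)  # conservative first step
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            x_new = x - alpha * g
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            sy = float(s @ y)
            # BB1 stepsize, safeguarded against nonpositive curvature:
            alpha = float(s @ s) / sy if sy > 1e-12 else 1.0
            x, g = x_new, g_new
        return x

    # Example: ill-conditioned convex quadratic f(x) = 0.5 * x^T A x.
    A = np.diag([1.0, 10.0, 100.0])
    x_min = approx_optimal_gradient(lambda x: A @ x, np.ones(3))

Per the abstract, the paper's contribution is to switch from such a quadratic model to a tensor (higher-order) model when the iterates are far from a minimizer and the objective is far from quadratic on the relevant line segment, which this sketch does not attempt.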

Suggested Citation

  • Zexian Liu & Hongwei Liu, 2019. "An Efficient Gradient Method with Approximately Optimal Stepsize Based on Tensor Model for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 181(2), pages 608-633, May.
  • Handle: RePEc:spr:joptap:v:181:y:2019:i:2:d:10.1007_s10957-019-01475-1
    DOI: 10.1007/s10957-019-01475-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-019-01475-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-019-01475-1?utm_source=ideas
LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. J. Z. Zhang & N. Y. Deng & L. H. Chen, 1999. "New Quasi-Newton Equation and Related Methods for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 102(1), pages 147-167, July.
    2. Hongwei Liu & Xiangli Li, 2013. "Modified subspace Barzilai-Borwein gradient method for non-negative matrix factorization," Computational Optimization and Applications, Springer, vol. 55(1), pages 173-196, May.
    3. Gonglin Yuan & Zehong Meng & Yong Li, 2016. "A Modified Hestenes and Stiefel Conjugate Gradient Algorithm for Large-Scale Nonsmooth Minimizations and Nonlinear Equations," Journal of Optimization Theory and Applications, Springer, vol. 168(1), pages 129-152, January.
    4. Yakui Huang & Hongwei Liu, 2016. "Smoothing projected Barzilai–Borwein method for constrained non-Lipschitz optimization," Computational Optimization and Applications, Springer, vol. 65(3), pages 671-698, December.
    5. Gonglin Yuan & Zengxin Wei, 2010. "Convergence analysis of a modified BFGS method on convex minimizations," Computational Optimization and Applications, Springer, vol. 47(2), pages 237-255, October.
    6. Marko Miladinović & Predrag Stanimirović & Sladjana Miljković, 2011. "Scalar Correction Method for Solving Large Scale Unconstrained Minimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 151(2), pages 304-320, November.
    7. Fahimeh Biglari & Maghsud Solimanpur, 2013. "Scaling on the Spectral Gradient Method," Journal of Optimization Theory and Applications, Springer, vol. 158(2), pages 626-635, August.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Zexian Liu & Hongwei Liu & Yu-Hong Dai, 2020. "An improved Dai–Kou conjugate gradient algorithm for unconstrained optimization," Computational Optimization and Applications, Springer, vol. 75(1), pages 145-167, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yong Li & Gonglin Yuan & Zhou Sheng, 2018. "An active-set algorithm for solving large-scale nonsmooth optimization models with box constraints," PLOS ONE, Public Library of Science, vol. 13(1), pages 1-16, January.
    2. Gonglin Yuan & Xiaoliang Wang & Zhou Sheng, 2020. "The Projection Technique for Two Open Problems of Unconstrained Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 186(2), pages 590-619, August.
    3. Gonglin Yuan & Zhou Sheng & Wenjie Liu, 2016. "The Modified HZ Conjugate Gradient Algorithm for Large-Scale Nonsmooth Optimization," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-15, October.
    4. Neculai Andrei, 2018. "A Double-Parameter Scaling Broyden–Fletcher–Goldfarb–Shanno Method Based on Minimizing the Measure Function of Byrd and Nocedal for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 178(1), pages 191-218, July.
    5. Hassan Mohammad & Mohammed Yusuf Waziri, 2019. "Structured Two-Point Stepsize Gradient Methods for Nonlinear Least Squares," Journal of Optimization Theory and Applications, Springer, vol. 181(1), pages 298-317, April.
    6. Qi Tian & Xiaoliang Wang & Liping Pang & Mingkun Zhang & Fanyun Meng, 2021. "A New Hybrid Three-Term Conjugate Gradient Algorithm for Large-Scale Unconstrained Problems," Mathematics, MDPI, vol. 9(12), pages 1-13, June.
    7. S. Bojari & M. R. Eslahchi, 2020. "Global convergence of a family of modified BFGS methods under a modified weak-Wolfe–Powell line search for nonconvex functions," 4OR, Springer, vol. 18(2), pages 219-244, June.
    8. Fahimeh Biglari & Farideh Mahmoodpur, 2016. "Scaling Damped Limited-Memory Updates for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 170(1), pages 177-188, July.
    9. Babaie-Kafaki, Saman & Ghanbari, Reza, 2014. "The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices," European Journal of Operational Research, Elsevier, vol. 234(3), pages 625-630.
    10. Jinman Lv & Zhenhua Peng & Zhongping Wan, 2021. "Optimality Conditions, Qualifications and Approximation Method for a Class of Non-Lipschitz Mathematical Programs with Switching Constraints," Mathematics, MDPI, vol. 9(22), pages 1-20, November.
    11. Yu, Yang & Wang, Yu & Deng, Rui & Yin, Yu, 2023. "New DY-HS hybrid conjugate gradient algorithm for solving optimization problem of unsteady partial differential equations with convection term," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 208(C), pages 677-701.
    12. Xiaoliang Wang & Liping Pang & Qi Wu & Mingkun Zhang, 2021. "An Adaptive Proximal Bundle Method with Inexact Oracles for a Class of Nonconvex and Nonsmooth Composite Optimization," Mathematics, MDPI, vol. 9(8), pages 1-27, April.
    13. XiaoLiang Dong & Deren Han & Zhifeng Dai & Lixiang Li & Jianguang Zhu, 2018. "An Accelerated Three-Term Conjugate Gradient Method with Sufficient Descent Condition and Conjugacy Condition," Journal of Optimization Theory and Applications, Springer, vol. 179(3), pages 944-961, December.
    14. Morteza Kimiaei & Farzad Rahpeymaii, 2019. "A new nonmonotone line-search trust-region approach for nonlinear systems," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 27(2), pages 199-232, July.
    15. Gonglin Yuan & Xiabin Duan & Wenjie Liu & Xiaoliang Wang & Zengru Cui & Zhou Sheng, 2015. "Two New PRP Conjugate Gradient Algorithms for Minimization Optimization Models," PLOS ONE, Public Library of Science, vol. 10(10), pages 1-24, October.
    16. Mehiddin Al-Baali & Humaid Khalfan, 2012. "A combined class of self-scaling and modified quasi-Newton methods," Computational Optimization and Applications, Springer, vol. 52(2), pages 393-408, June.
    17. C. X. Kou & Y. H. Dai, 2015. "A Modified Self-Scaling Memoryless Broyden–Fletcher–Goldfarb–Shanno Method for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 165(1), pages 209-224, April.
    18. Waziri, Mohammed Yusuf & Ahmed, Kabiru & Sabi’u, Jamilu, 2019. "A family of Hager–Zhang conjugate gradient methods for system of monotone nonlinear equations," Applied Mathematics and Computation, Elsevier, vol. 361(C), pages 645-660.
    19. D. Tarzanagh & M. Peyghami, 2015. "A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems," Journal of Global Optimization, Springer, vol. 63(4), pages 709-728, December.
    20. Andrei, Neculai, 2010. "Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization," European Journal of Operational Research, Elsevier, vol. 204(3), pages 410-420, August.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.