
Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization

Author

Listed:
  • Andrei, Neculai

Abstract

An accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for solving unconstrained optimization problems is presented. The basic idea is to combine the scaled memoryless BFGS method and the preconditioning technique in the frame of the conjugate gradient method. The preconditioner, which is also a scaled memoryless BFGS matrix, is reset when the Beale-Powell restart criterion holds. The parameter scaling the gradient is selected as a spectral gradient. Concerning the step length computation, it is known that in conjugate gradient algorithms the step lengths may differ from 1 by two orders of magnitude and tend to vary unpredictably. Thus, we suggest an acceleration scheme able to improve the efficiency of the algorithm. Under common assumptions, the method is proved to be globally convergent. It is shown that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in the function values is significantly improved. Under mild conditions the algorithm is globally convergent for strongly convex functions. Computational results for a set consisting of 750 unconstrained optimization test problems show that this new accelerated scaled conjugate gradient algorithm substantially outperforms known conjugate gradient methods: SCALCG [3], [4], [5] and [6], CONMIN by Shanno and Phua (1976, 1978) [42] and [43], Hestenes and Stiefel (1952) [25], Polak-Ribière-Polyak (1969) [32] and [33], Dai and Yuan (2001) [17], Dai and Liao (2001) (t=1) [14], conjugate gradient with sufficient descent condition [7], hybrid Dai and Yuan (2001) [17], hybrid Dai and Yuan zero (2001) [17], CG_DESCENT by Hager and Zhang (2005, 2006) [22] and [23], as well as the quasi-Newton L-BFGS method [26] and the truncated Newton method by Nash (1985) [27].
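
The paper gives the exact formulas; as a rough, non-authoritative illustration of the ingredients named in the abstract, the Python sketch below shows a spectrally scaled memoryless BFGS (Perry-Shanno type) search direction, a Beale-Powell restart test, and a step-length acceleration based on one extra gradient evaluation. All function names (spectral_scaling, smbfgs_direction, beale_powell_restart, accelerated_step) and the particular acceleration formula are assumptions made for illustration, not the authors' reference implementation.

```python
# Illustrative sketch only: a scaled memoryless BFGS preconditioned CG step
# in the general spirit of the paper. Names and the acceleration formula are
# assumptions, not the authors' code.
import numpy as np

def spectral_scaling(s, y):
    """Spectral scaling parameter theta = (s's)/(y's), built from the last
    step s = x_{k+1} - x_k and gradient change y = g_{k+1} - g_k."""
    return float(s @ s) / float(y @ s)

def smbfgs_direction(g_new, s, y):
    """Search direction d = -H g_new, where H is the memoryless BFGS matrix
    obtained by updating theta * I with the pair (s, y)."""
    ys = float(y @ s)
    theta = spectral_scaling(s, y)
    gs = float(g_new @ s)
    gy = float(g_new @ y)
    return (-theta * g_new
            + theta * (gs / ys) * y
            - ((1.0 + theta * float(y @ y) / ys) * gs / ys - theta * gy / ys) * s)

def beale_powell_restart(g_new, g_old, nu=0.2):
    """Beale-Powell test: restart (reset the preconditioner) when consecutive
    gradients are far from orthogonal: |g_{k+1}' g_k| >= nu * ||g_{k+1}||^2."""
    return abs(float(g_new @ g_old)) >= nu * float(g_new @ g_new)

def accelerated_step(grad, x, d, alpha):
    """Simplified stand-in for the acceleration idea: rescale the line-search
    step alpha using one extra gradient evaluation at the trial point, by
    minimizing the quadratic interpolant of f along d."""
    g = grad(x)
    z = x + alpha * d
    gz = grad(z)
    a = float(g @ d)          # directional derivative at x (negative for descent d)
    b = float((gz - g) @ d)   # curvature estimate along d
    if b > 0.0:
        return x + (-a / b) * alpha * d
    return z                  # fall back to the ordinary step
```

A driver loop would alternate a Wolfe line search with smbfgs_direction, resetting the pair (s, y) (and hence the preconditioner) whenever beale_powell_restart fires.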

Suggested Citation

  • Andrei, Neculai, 2010. "Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization," European Journal of Operational Research, Elsevier, vol. 204(3), pages 410-420, August.
  • Handle: RePEc:eee:ejores:v:204:y:2010:i:3:p:410-420

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0377-2217(09)00903-5
    Download Restriction: Full text for ScienceDirect subscribers only

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. J. Z. Zhang & N. Y. Deng & L. H. Chen, 1999. "New Quasi-Newton Equation and Related Methods for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 102(1), pages 147-167, July.
    2. Y.H. Dai & Y. Yuan, 2001. "An Efficient Hybrid Conjugate Gradient Method for Unconstrained Optimization," Annals of Operations Research, Springer, vol. 103(1), pages 33-47, March.
    3. Avinoam Perry, 1977. "A Class of Conjugate Gradient Algorithms with a Two-Step Variable Metric Memory," Discussion Papers 269, Northwestern University, Center for Mathematical Studies in Economics and Management Science.
    4. David F. Shanno, 1978. "Conjugate Gradient Methods with Inexact Searches," Mathematics of Operations Research, INFORMS, vol. 3(3), pages 244-256, August.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. XiaoLiang Dong & Deren Han & Zhifeng Dai & Lixiang Li & Jianguang Zhu, 2018. "An Accelerated Three-Term Conjugate Gradient Method with Sufficient Descent Condition and Conjugacy Condition," Journal of Optimization Theory and Applications, Springer, vol. 179(3), pages 944-961, December.
    2. N. Mahdavi-Amiri & M. Shaeiri, 2020. "A conjugate gradient sampling method for nonsmooth optimization," 4OR, Springer, vol. 18(1), pages 73-90, March.
    3. Parvaneh Faramarzi & Keyvan Amini, 2019. "A Modified Spectral Conjugate Gradient Method with Global Convergence," Journal of Optimization Theory and Applications, Springer, vol. 182(2), pages 667-690, August.
    4. Shi, Zhenjun & Wang, Shengquan, 2011. "Nonmonotone adaptive trust region method," European Journal of Operational Research, Elsevier, vol. 208(1), pages 28-36, January.
    5. Saman Babaie-Kafaki, 2012. "A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei," Computational Optimization and Applications, Springer, vol. 52(2), pages 409-414, June.
    6. Saman Babaie-Kafaki & Reza Ghanbari, 2017. "A class of adaptive Dai–Liao conjugate gradient methods based on the scaled memoryless BFGS update," 4OR, Springer, vol. 15(1), pages 85-92, March.
    7. Zhang, Ruijun & Lu, Jie & Zhang, Guangquan, 2011. "A knowledge-based multi-role decision support system for ore blending cost optimization of blast furnaces," European Journal of Operational Research, Elsevier, vol. 215(1), pages 194-203, November.
    8. Saman Babaie-Kafaki, 2015. "On Optimality of the Parameters of Self-Scaling Memoryless Quasi-Newton Updating Formulae," Journal of Optimization Theory and Applications, Springer, vol. 167(1), pages 91-101, October.
    9. Qing-Rui He & Chun-Rong Chen & Sheng-Jie Li, 2023. "Spectral conjugate gradient methods for vector optimization problems," Computational Optimization and Applications, Springer, vol. 86(2), pages 457-489, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. C. X. Kou & Y. H. Dai, 2015. "A Modified Self-Scaling Memoryless Broyden–Fletcher–Goldfarb–Shanno Method for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 165(1), pages 209-224, April.
    2. B. Sellami & Y. Chaib, 2016. "A new family of globally convergent conjugate gradient methods," Annals of Operations Research, Springer, vol. 241(1), pages 497-513, June.
    3. Neculai Andrei, 2013. "Another Conjugate Gradient Algorithm with Guaranteed Descent and Conjugacy Conditions for Large-scale Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 159(1), pages 159-182, October.
    4. Yasushi Narushima & Shummin Nakayama & Masashi Takemura & Hiroshi Yabe, 2023. "Memoryless Quasi-Newton Methods Based on the Spectral-Scaling Broyden Family for Riemannian Optimization," Journal of Optimization Theory and Applications, Springer, vol. 197(2), pages 639-664, May.
    5. Elena Tovbis & Vladimir Krutikov & Predrag Stanimirović & Vladimir Meshechkin & Aleksey Popov & Lev Kazakovtsev, 2023. "A Family of Multi-Step Subgradient Minimization Methods," Mathematics, MDPI, vol. 11(10), pages 1-24, May.
    6. Fahimeh Biglari & Farideh Mahmoodpur, 2016. "Scaling Damped Limited-Memory Updates for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 170(1), pages 177-188, July.
    7. Hiroyuki Sakai & Hideaki Iiduka, 2020. "Hybrid Riemannian conjugate gradient methods with global convergence properties," Computational Optimization and Applications, Springer, vol. 77(3), pages 811-830, December.
    8. Babaie-Kafaki, Saman & Ghanbari, Reza, 2014. "The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices," European Journal of Operational Research, Elsevier, vol. 234(3), pages 625-630.
    9. Serge Gratton & Vincent Malmedy & Philippe Toint, 2012. "Using approximate secant equations in limited memory methods for multilevel unconstrained optimization," Computational Optimization and Applications, Springer, vol. 51(3), pages 967-979, April.
    10. Yu, Yang & Wang, Yu & Deng, Rui & Yin, Yu, 2023. "New DY-HS hybrid conjugate gradient algorithm for solving optimization problem of unsteady partial differential equations with convection term," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 208(C), pages 677-701.
    11. M. Fatemi, 2016. "An Optimal Parameter for Dai–Liao Family of Conjugate Gradient Methods," Journal of Optimization Theory and Applications, Springer, vol. 169(2), pages 587-605, May.
    12. Fischer, Manfred M. & Staufer, Petra, 1998. "Optimization in an Error Backpropagation Neural Network Environment with a Performance Test on a Pattern Classification Problem," MPRA Paper 77810, University Library of Munich, Germany.
    13. Yong Li & Gonglin Yuan & Zhou Sheng, 2018. "An active-set algorithm for solving large-scale nonsmooth optimization models with box constraints," PLOS ONE, Public Library of Science, vol. 13(1), pages 1-16, January.
    14. N. Andrei, 2009. "Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 141(2), pages 249-264, May.
    15. Jose Giovany Babativa-Márquez & José Luis Vicente-Villardón, 2021. "Logistic Biplot by Conjugate Gradient Algorithms and Iterated SVD," Mathematics, MDPI, vol. 9(16), pages 1-19, August.
    16. Ke-Lin Du & Chi-Sing Leung & Wai Ho Mow & M. N. S. Swamy, 2022. "Perceptron: Learning, Generalization, Model Selection, Fault Tolerance, and Role in the Deep Learning Era," Mathematics, MDPI, vol. 10(24), pages 1-46, December.
    17. Gonglin Yuan & Xiaoliang Wang & Zhou Sheng, 2020. "The Projection Technique for Two Open Problems of Unconstrained Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 186(2), pages 590-619, August.
    18. Nash, John C. & Varadhan, Ravi, 2011. "Unifying Optimization Algorithms to Aid Software System Users: optimx for R," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 43(i09).
    19. Jinbao Jian & Lin Yang & Xianzhen Jiang & Pengjie Liu & Meixing Liu, 2020. "A Spectral Conjugate Gradient Method with Descent Property," Mathematics, MDPI, vol. 8(2), pages 1-13, February.
    20. N. Mahdavi-Amiri & M. Shaeiri, 2020. "A conjugate gradient sampling method for nonsmooth optimization," 4OR, Springer, vol. 18(1), pages 73-90, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:ejores:v:204:y:2010:i:3:p:410-420. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/eor.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.