Printed from https://ideas.repec.org/a/gam/jmathe/v10y2022i21p4069-d959980.html

Classification in High Dimension Using the Ledoit–Wolf Shrinkage Method

Author

Listed:
  • Rasoul Lotfi

    (Department of Statistics, Faculty of Mathematical Sciences, Shahrood University of Technology, Shahrood 3619995161, Iran)

  • Davood Shahsavani

    (Department of Statistics, Faculty of Mathematical Sciences, Shahrood University of Technology, Shahrood 3619995161, Iran)

  • Mohammad Arashi

    (Department of Statistics, Faculty of Mathematical Sciences, Ferdowsi University of Mashhad, Mashhad 9177948974, Iran
    Department of Statistics, Faculty of Natural and Agricultural Sciences, University of Pretoria, Pretoria 0002, South Africa)

Abstract

Classification using linear discriminant analysis (LDA) is challenging when the number of variables is large relative to the number of observations. Algorithms such as LDA require the precision matrix of the feature vector. In the high-dimensional setting, the sample covariance matrix is singular, so the maximum likelihood estimator of the precision matrix does not exist. In this paper, we employ the Stein-type shrinkage estimator of Ledoit and Wolf for high-dimensional data classification. The efficiency of the proposed approach is compared numerically with existing methods, including LDA, cross-validation, gLasso, and SVM, using the misclassification error criterion.
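The idea in the abstract can be sketched in plain NumPy: shrink the singular sample covariance toward a scaled identity following Ledoit and Wolf (2004), then plug the resulting well-conditioned estimate into the standard linear discriminant rule. This is an illustrative sketch, not the authors' exact procedure; the function names (`ledoit_wolf_shrinkage`, `lda_classify`) are hypothetical, and the pooled-covariance LDA rule is the textbook version.

```python
import numpy as np

def ledoit_wolf_shrinkage(X):
    """Ledoit-Wolf (2004) linear shrinkage toward a scaled identity.

    Returns (1 - delta) * S + delta * mu * I, where S is the sample
    covariance, mu = tr(S)/p, and delta is the estimated optimal
    shrinkage intensity in [0, 1]. Illustrative sketch only.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n                         # sample covariance (1/n normalization)
    mu = np.trace(S) / p                      # scale of the identity target
    d2 = np.linalg.norm(S - mu * np.eye(p), "fro") ** 2 / p
    # b2 estimates the sampling variance of the entries of S
    b2 = sum(np.linalg.norm(np.outer(x, x) - S, "fro") ** 2 for x in Xc) / (n ** 2 * p)
    b2 = min(b2, d2)
    delta = 1.0 if d2 == 0 else b2 / d2       # optimal shrinkage intensity
    return (1 - delta) * S + delta * mu * np.eye(p)

def lda_classify(X_train, y_train, X_test):
    """Textbook linear discriminant rule using the shrunk pooled covariance."""
    classes = np.unique(y_train)
    # pool the covariance over class-centered observations
    centered = np.vstack([X_train[y_train == k] - X_train[y_train == k].mean(axis=0)
                          for k in classes])
    Sigma = ledoit_wolf_shrinkage(centered)   # well-conditioned, hence invertible
    scores = []
    for k in classes:
        Xk = X_train[y_train == k]
        mu_k = Xk.mean(axis=0)
        pi_k = len(Xk) / len(X_train)
        w = np.linalg.solve(Sigma, mu_k)      # Sigma^{-1} mu_k without an explicit inverse
        scores.append(X_test @ w - 0.5 * mu_k @ w + np.log(pi_k))
    return classes[np.argmax(np.column_stack(scores), axis=1)]
```

Even when n < p, where the sample covariance is singular and classical LDA breaks down, the shrunk estimate remains positive definite, so the discriminant scores are always computable.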

Suggested Citation

  • Rasoul Lotfi & Davood Shahsavani & Mohammad Arashi, 2022. "Classification in High Dimension Using the Ledoit–Wolf Shrinkage Method," Mathematics, MDPI, vol. 10(21), pages 1-13, November.
  • Handle: RePEc:gam:jmathe:v:10:y:2022:i:21:p:4069-:d:959980

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/10/21/4069/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/10/21/4069/
    Download Restriction: no

    References listed on IDEAS

    1. Ledoit, Olivier & Wolf, Michael, 2004. "A well-conditioned estimator for large-dimensional covariance matrices," Journal of Multivariate Analysis, Elsevier, vol. 88(2), pages 365-411, February.
    2. Jacob Bien & Robert J. Tibshirani, 2011. "Sparse estimation of a covariance matrix," Biometrika, Biometrika Trust, vol. 98(4), pages 807-820.
    3. T. Tony Cai & Linjun Zhang, 2019. "High dimensional linear discriminant analysis: optimality, adaptive algorithm and missing data," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 81(4), pages 675-705, September.
    4. Choi, Young-Geun & Lim, Johan & Roy, Anindya & Park, Junyong, 2019. "Fixed support positive-definite modification of covariance matrix estimators via linear shrinkage," Journal of Multivariate Analysis, Elsevier, vol. 171(C), pages 234-249.
    5. Rothman, Adam J. & Levina, Elizaveta & Zhu, Ji, 2009. "Generalized Thresholding of Large Covariance Matrices," Journal of the American Statistical Association, American Statistical Association, vol. 104(485), pages 177-186.
    6. Le, Khuyen T. & Chaux, Caroline & Richard, Frédéric J.P. & Guedj, Eric, 2020. "An adapted linear discriminant analysis with variable selection for the classification in high-dimension, and an application to medical data," Computational Statistics & Data Analysis, Elsevier, vol. 152(C).
    7. Norm A. Campbell, 1980. "Shrunken Estimators in Discriminant and Canonical Variate Analysis," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 29(1), pages 5-14, March.
    8. Jianqing Fan & Yuan Liao & Han Liu, 2016. "An overview of the estimation of large covariance and precision matrices," Econometrics Journal, Royal Economic Society, vol. 19(1), pages 1-32, February.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lam, Clifford, 2020. "High-dimensional covariance matrix estimation," LSE Research Online Documents on Economics 101667, London School of Economics and Political Science, LSE Library.
    2. Wang, Shaoxin, 2021. "An efficient numerical method for condition number constrained covariance matrix approximation," Applied Mathematics and Computation, Elsevier, vol. 397(C).
    3. Kashlak, Adam B., 2021. "Non-asymptotic error controlled sparse high dimensional precision matrix estimation," Journal of Multivariate Analysis, Elsevier, vol. 181(C).
    4. Yan Zhang & Jiyuan Tao & Zhixiang Yin & Guoqiang Wang, 2022. "Improved Large Covariance Matrix Estimation Based on Efficient Convex Combination and Its Application in Portfolio Optimization," Mathematics, MDPI, vol. 10(22), pages 1-15, November.
    5. Gautam Sabnis & Debdeep Pati & Anirban Bhattacharya, 2019. "Compressed Covariance Estimation with Automated Dimension Learning," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 81(2), pages 466-481, December.
    6. Lee, Kyoungjae & Jo, Seongil & Lee, Jaeyong, 2022. "The beta-mixture shrinkage prior for sparse covariances with near-minimax posterior convergence rate," Journal of Multivariate Analysis, Elsevier, vol. 192(C).
    7. Bailey, Natalia & Pesaran, M. Hashem & Smith, L. Vanessa, 2019. "A multiple testing approach to the regularisation of large sample correlation matrices," Journal of Econometrics, Elsevier, vol. 208(2), pages 507-534.
    8. Maurizio Daniele & Winfried Pohlmeier & Aygul Zagidullina, 2018. "Sparse Approximate Factor Estimation for High-Dimensional Covariance Matrices," Working Paper Series of the Department of Economics, University of Konstanz 2018-07, Department of Economics, University of Konstanz.
    9. Shaoxin Wang & Hu Yang & Chaoli Yao, 2019. "On the penalized maximum likelihood estimation of high-dimensional approximate factor model," Computational Statistics, Springer, vol. 34(2), pages 819-846, June.
    10. Piotr Zwiernik & Caroline Uhler & Donald Richards, 2017. "Maximum likelihood estimation for linear Gaussian covariance models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 79(4), pages 1269-1292, September.
    11. Choi, Young-Geun & Lim, Johan & Roy, Anindya & Park, Junyong, 2019. "Fixed support positive-definite modification of covariance matrix estimators via linear shrinkage," Journal of Multivariate Analysis, Elsevier, vol. 171(C), pages 234-249.
    12. Aaron J Molstad & Adam J Rothman, 2018. "Shrinking characteristics of precision matrix estimators," Biometrika, Biometrika Trust, vol. 105(3), pages 563-574.
    13. Farnè, Matteo & Montanari, Angela, 2020. "A large covariance matrix estimator under intermediate spikiness regimes," Journal of Multivariate Analysis, Elsevier, vol. 176(C).
    14. Avagyan, Vahe & Alonso Fernández, Andrés Modesto & Nogales, Francisco J., 2015. "D-trace Precision Matrix Estimation Using Adaptive Lasso Penalties," DES - Working Papers. Statistics and Econometrics. WS 21775, Universidad Carlos III de Madrid. Departamento de Estadística.
    15. Benjamin Poignard & Manabu Asai, 2023. "Estimation of high-dimensional vector autoregression via sparse precision matrix," The Econometrics Journal, Royal Economic Society, vol. 26(2), pages 307-326.
    16. Zhou Tang & Zhangsheng Yu & Cheng Wang, 2020. "A fast iterative algorithm for high-dimensional differential network," Computational Statistics, Springer, vol. 35(1), pages 95-109, March.
    17. Zeyu Wu & Cheng Wang & Weidong Liu, 2023. "A unified precision matrix estimation framework via sparse column-wise inverse operator under weak sparsity," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 75(4), pages 619-648, August.
    18. Daniel Felix Ahelegbey & Luis Carvalho & Eric D. Kolaczyk, 2020. "A Bayesian Covariance Graph And Latent Position Model For Multivariate Financial Time Series," DEM Working Papers Series 181, University of Pavia, Department of Economics and Management.
    19. Yang, Guangren & Liu, Yiming & Pan, Guangming, 2019. "Weighted covariance matrix estimation," Computational Statistics & Data Analysis, Elsevier, vol. 139(C), pages 82-98.
    20. Chen, Jia & Li, Degui & Linton, Oliver, 2019. "A new semiparametric estimation approach for large dynamic covariance matrices with multiple conditioning variables," Journal of Econometrics, Elsevier, vol. 212(1), pages 155-176.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:10:y:2022:i:21:p:4069-:d:959980. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.