
Polynomial whitening for high-dimensional data

Author

Listed:
  • Jonathan Gillard (Cardiff University)
  • Emily O’Riordan (Cardiff University)
  • Anatoly Zhigljavsky (Cardiff University)

Abstract

The inverse square root of a covariance matrix is often desirable for performing data whitening in the process of applying many common multivariate data analysis methods. Direct calculation of the inverse square root is not available when the covariance matrix is either singular or nearly singular, as often occurs in high dimensions. We develop new methods, which we broadly call polynomial whitening, to construct a low-degree polynomial in the empirical covariance matrix which has similar properties to the true inverse square root of the covariance matrix (should it exist). Our method does not suffer in singular or near-singular settings, and is computationally tractable in high dimensions. We demonstrate that our construction of low-degree polynomials provides a good substitute for high-dimensional inverse square root covariance matrices.
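The core idea in the abstract can be illustrated with a minimal sketch. This is not the authors' construction — the function name, the least-squares fit on eigenvalues, and the eigenvalue cutoff `eps` are all assumptions for illustration — but it shows the general technique: fit a low-degree polynomial p so that p(S) approximates S^{-1/2} for the empirical covariance S, then whiten using only matrix products, which remains well defined when S is singular or near-singular.

```python
import numpy as np

def polynomial_whitener(X, degree=5, eps=1e-8):
    """Fit a low-degree polynomial p so that p(S) approximates S^{-1/2},
    where S is the sample covariance of X (rows = observations).
    Illustrative sketch only: coefficients are fit by least squares on
    the eigenvalues of S above eps, so (near-)null directions of S
    never need to be inverted."""
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / (len(X) - 1)           # empirical covariance, d x d
    lam = np.linalg.eigvalsh(S)
    lam = lam[lam > eps]                    # ignore (near-)null directions
    # Least squares: find c with sum_k c_k lam^k ~ lam^(-1/2)
    V = np.vander(lam, degree + 1, increasing=True)
    c, *_ = np.linalg.lstsq(V, lam ** -0.5, rcond=None)
    # Evaluate p(S) by Horner's rule: matrix products only, no inversion
    P = np.zeros_like(S)
    for ck in reversed(c):
        P = P @ S + ck * np.eye(S.shape[0])
    return P                                # approximate whitening matrix

# Anisotropic toy data: feature scales between 1 and 3
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10)) * np.linspace(1.0, 3.0, 10)
W = polynomial_whitener(X)
Z = (X - X.mean(axis=0)) @ W                # approximately whitened data
# np.cov(Z, rowvar=False) is now close to the identity matrix
```

Because the polynomial is evaluated at the empirical covariance itself, the whitened covariance W S W is diagonal in the eigenbasis of S with entries p(λ)²λ, so the quality of the whitening is exactly the quality of the scalar fit p(λ) ≈ λ^{-1/2} over the observed eigenvalue range.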

Suggested Citation

  • Jonathan Gillard & Emily O’Riordan & Anatoly Zhigljavsky, 2023. "Polynomial whitening for high-dimensional data," Computational Statistics, Springer, vol. 38(3), pages 1427-1461, September.
  • Handle: RePEc:spr:compst:v:38:y:2023:i:3:d:10.1007_s00180-022-01277-6
    DOI: 10.1007/s00180-022-01277-6

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s00180-022-01277-6
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s00180-022-01277-6?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ledoit, Olivier & Wolf, Michael, 2004. "A well-conditioned estimator for large-dimensional covariance matrices," Journal of Multivariate Analysis, Elsevier, vol. 88(2), pages 365-411, February.
    2. Peter Hall & J. S. Marron & Amnon Neeman, 2005. "Geometric representation of high dimension, low sample size data," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(3), pages 427-444, June.
    3. Yata, Kazuyoshi & Aoshima, Makoto, 2013. "PCA consistency for the power spiked model in high-dimensional settings," Journal of Multivariate Analysis, Elsevier, vol. 122(C), pages 334-354.
    4. Yata, Kazuyoshi & Aoshima, Makoto, 2012. "Effective PCA for high-dimension, low-sample-size data with noise reduction via geometric representations," Journal of Multivariate Analysis, Elsevier, vol. 105(1), pages 193-215.
    5. Jana Janková & Sara Geer, 2017. "Honest confidence regions and optimality in high-dimensional precision matrix estimation," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 26(1), pages 143-162, March.
    6. Bodnar, Taras & Dette, Holger & Parolya, Nestor, 2016. "Spectral analysis of the Moore–Penrose inverse of a large dimensional sample covariance matrix," Journal of Multivariate Analysis, Elsevier, vol. 148(C), pages 160-172.
    7. Claudio Agostinelli & Luca Greco, 2019. "Weighted likelihood estimation of multivariate location and scatter," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 28(3), pages 756-784, September.
    8. Tsubasa Ito & Tatsuya Kubokawa, 2015. "Linear Ridge Estimator of High-Dimensional Precision Matrix Using Random Matrix Theory," CIRJE F-Series CIRJE-F-995, CIRJE, Faculty of Economics, University of Tokyo.
    9. Agnan Kessy & Alex Lewin & Korbinian Strimmer, 2018. "Optimal Whitening and Decorrelation," The American Statistician, Taylor & Francis Journals, vol. 72(4), pages 309-314, October.
    10. Pronzato, Luc & Wynn, Henry P. & Zhigljavsky, Anatoly A., 2018. "Simplicial variances, potentials and Mahalanobis distances," Journal of Multivariate Analysis, Elsevier, vol. 168(C), pages 276-289.
    11. Jushan Bai & Shuzhong Shi, 2011. "Estimating High Dimensional Covariance Matrices and its Applications," Annals of Economics and Finance, Society for AEF, vol. 12(2), pages 199-215, November.
    12. Zhiguo Xiao, 2020. "Efficient GMM estimation with singular system of moment conditions," Statistical Theory and Related Fields, Taylor & Francis Journals, vol. 4(2), pages 172-178, July.
    13. Lawrence Hubert & Phipps Arabie, 1985. "Comparing partitions," Journal of Classification, Springer;The Classification Society, vol. 2(1), pages 193-218, December.
    14. Yata, Kazuyoshi & Aoshima, Makoto, 2010. "Effective PCA for high-dimension, low-sample-size data with singular value decomposition of cross data matrix," Journal of Multivariate Analysis, Elsevier, vol. 101(9), pages 2060-2077, October.
    15. Fisher, Thomas J. & Sun, Xiaoqian, 2011. "Improved Stein-type shrinkage estimators for the high-dimensional multivariate normal covariance matrix," Computational Statistics & Data Analysis, Elsevier, vol. 55(5), pages 1909-1918, May.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Kazuyoshi Yata & Makoto Aoshima, 2020. "Geometric consistency of principal component scores for high‐dimensional mixture models and its application," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 47(3), pages 899-921, September.
    2. Wang, Shao-Hsuan & Huang, Su-Yun & Chen, Ting-Li, 2020. "On asymptotic normality of cross data matrix-based PCA in high dimension low sample size," Journal of Multivariate Analysis, Elsevier, vol. 175(C).
    3. Makoto Aoshima & Kazuyoshi Yata, 2019. "Distance-based classifier by data transformation for high-dimension, strongly spiked eigenvalue models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 71(3), pages 473-503, June.
    4. Yata, Kazuyoshi & Aoshima, Makoto, 2013. "PCA consistency for the power spiked model in high-dimensional settings," Journal of Multivariate Analysis, Elsevier, vol. 122(C), pages 334-354.
    5. Wang, Shao-Hsuan & Huang, Su-Yun, 2022. "Perturbation theory for cross data matrix-based PCA," Journal of Multivariate Analysis, Elsevier, vol. 190(C).
    6. Tatsuya Kubokawa & Muni S. Srivastava, 2013. "Optimal Ridge-type Estimators of Covariance Matrix in High Dimension," CIRJE F-Series CIRJE-F-906, CIRJE, Faculty of Economics, University of Tokyo.
    7. Arnab Chakrabarti & Rituparna Sen, 2018. "Some Statistical Problems with High Dimensional Financial data," Papers 1808.02953, arXiv.org.
    8. Ishii, Aki & Yata, Kazuyoshi & Aoshima, Makoto, 2022. "Geometric classifiers for high-dimensional noisy data," Journal of Multivariate Analysis, Elsevier, vol. 188(C).
    9. Ruili Sun & Tiefeng Ma & Shuangzhe Liu & Milind Sathye, 2019. "Improved Covariance Matrix Estimation for Portfolio Risk Measurement: A Review," JRFM, MDPI, vol. 12(1), pages 1-34, March.
    10. Makoto Aoshima & Kazuyoshi Yata, 2014. "A distance-based, misclassification rate adjusted classifier for multiclass, high-dimensional data," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 66(5), pages 983-1010, October.
    11. Ikeda, Yuki & Kubokawa, Tatsuya & Srivastava, Muni S., 2016. "Comparison of linear shrinkage estimators of a large covariance matrix in normal and non-normal distributions," Computational Statistics & Data Analysis, Elsevier, vol. 95(C), pages 95-108.
    12. Tsubasa Ito & Tatsuya Kubokawa, 2015. "Linear Ridge Estimator of High-Dimensional Precision Matrix Using Random Matrix Theory," CIRJE F-Series CIRJE-F-995, CIRJE, Faculty of Economics, University of Tokyo.
    13. Jianqing Fan & Yuan Liao & Martina Mincheva, 2013. "Large covariance estimation by thresholding principal orthogonal complements," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 75(4), pages 603-680, September.
    14. Tatsuya Kubokawa & Akira Inoue, 2012. "Estimation of Covariance and Precision Matrices in High Dimension," CIRJE F-Series CIRJE-F-855, CIRJE, Faculty of Economics, University of Tokyo.
    15. Yuki Ikeda & Tatsuya Kubokawa & Muni S. Srivastava, 2015. "Comparison of Linear Shrinkage Estimators of a Large Covariance Matrix in Normal and Non-normal Distributions," CIRJE F-Series CIRJE-F-970, CIRJE, Faculty of Economics, University of Tokyo.
    16. Yata, Kazuyoshi & Aoshima, Makoto, 2013. "Correlation tests for high-dimensional data using extended cross-data-matrix methodology," Journal of Multivariate Analysis, Elsevier, vol. 117(C), pages 313-331.
    17. Hannart, Alexis & Naveau, Philippe, 2014. "Estimating high dimensional covariance matrices: A new look at the Gaussian conjugate framework," Journal of Multivariate Analysis, Elsevier, vol. 131(C), pages 149-162.
    18. Jung, Sungkyu & Sen, Arusharka & Marron, J.S., 2012. "Boundary behavior in High Dimension, Low Sample Size asymptotics of PCA," Journal of Multivariate Analysis, Elsevier, vol. 109(C), pages 190-203.
    19. Yang, Guangren & Liu, Yiming & Pan, Guangming, 2019. "Weighted covariance matrix estimation," Computational Statistics & Data Analysis, Elsevier, vol. 139(C), pages 82-98.
    20. Ledoit, Olivier & Wolf, Michael, 2017. "Numerical implementation of the QuEST function," Computational Statistics & Data Analysis, Elsevier, vol. 115(C), pages 199-223.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:compst:v:38:y:2023:i:3:d:10.1007_s00180-022-01277-6. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.