
On the border of extreme and mild spiked models in the HDLSS framework

Author

Listed:
  • Lee, Myung Hee

Abstract

In the spiked covariance model for High Dimension Low Sample Size (HDLSS) asymptotics, where the dimension tends to infinity while the sample size stays fixed, the few largest eigenvalues are assumed to grow as the dimension increases. The rate of growth is crucial: the asymptotic behavior of the sample Principal Component (PC) directions changes dramatically, from consistency to strong inconsistency, at the boundary between the extreme and mild spiked covariance models. Yet the behavior at the boundary spiked model itself is unexplored. We study the HDLSS asymptotic behavior of the eigenvalues and eigenvectors of the sample covariance matrix at the boundary spiked model and observe that they exhibit behavior intermediate between the extreme and mild spiked models.
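
To make the three regimes concrete, the following is a minimal simulation sketch, not taken from the paper: it assumes a single-spike Gaussian model with covariance Sigma = diag(sigma^2 * d^alpha, 1, ..., 1) and fixed sample size n, and measures the angle between the first sample PC direction and the true spike direction e_1. In this literature alpha > 1, alpha = 1, and alpha < 1 correspond, roughly, to the extreme, boundary, and mild spike regimes; the function name pc1_angle and all parameter values are illustrative assumptions rather than the paper's notation.

```python
import numpy as np

def pc1_angle(d, n=20, alpha=1.0, sigma2=25.0, seed=None):
    """Angle (degrees) between the first sample PC direction and the true spike
    direction e_1, for one draw from a single-spike Gaussian model with
    covariance Sigma = diag(sigma2 * d**alpha, 1, ..., 1) and fixed n (HDLSS).
    This is an illustrative sketch, not code from the paper."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n, d))
    X[:, 0] *= np.sqrt(sigma2 * d**alpha)    # inflate the first coordinate (the spike)
    # Leading eigenvector of the d x d matrix X'X/n obtained via the n x n dual
    # matrix XX'/n (the data are mean zero by construction, so no centering).
    G = X @ X.T / n
    _, V = np.linalg.eigh(G)                 # eigenvalues in ascending order
    u = X.T @ V[:, -1]                       # lift the top dual eigenvector to R^d
    u /= np.linalg.norm(u)
    cosine = min(abs(u[0]), 1.0)             # |<u, e_1>|; e_1 is the true direction
    return np.degrees(np.arccos(cosine))

if __name__ == "__main__":
    # alpha > 1: extreme spike   (consistency: angle tends to 0)
    # alpha = 1: boundary spike  (intermediate: a non-degenerate random angle)
    # alpha < 1: mild spike      (strong inconsistency: angle tends to 90)
    for alpha in (1.5, 1.0, 0.5):
        angles = [pc1_angle(d=5000, alpha=alpha, seed=s) for s in range(30)]
        print(f"alpha = {alpha}: mean angle over 30 draws ~ {np.mean(angles):.1f} degrees")
```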

Suggested Citation

  • Lee, Myung Hee, 2012. "On the border of extreme and mild spiked models in the HDLSS framework," Journal of Multivariate Analysis, Elsevier, vol. 107(C), pages 162-168.
  • Handle: RePEc:eee:jmvana:v:107:y:2012:i:c:p:162-168
    DOI: 10.1016/j.jmva.2012.01.003

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0047259X12000048
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jmva.2012.01.003?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jung, Sungkyu & Sen, Arusharka & Marron, J.S., 2012. "Boundary behavior in High Dimension, Low Sample Size asymptotics of PCA," Journal of Multivariate Analysis, Elsevier, vol. 109(C), pages 190-203.
    2. Yugo Nakayama & Kazuyoshi Yata & Makoto Aoshima, 2020. "Bias-corrected support vector machine with Gaussian kernel in high-dimension, low-sample-size settings," Annals of the Institute of Statistical Mathematics, Springer; The Institute of Statistical Mathematics, vol. 72(5), pages 1257-1286, October.
    3. Jianqing Fan & Yuan Liao & Martina Mincheva, 2013. "Large covariance estimation by thresholding principal orthogonal complements," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 75(4), pages 603-680, September.
    4. Jung, Sungkyu, 2018. "Continuum directions for supervised dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 125(C), pages 27-43.
    5. Yata, Kazuyoshi & Aoshima, Makoto, 2013. "PCA consistency for the power spiked model in high-dimensional settings," Journal of Multivariate Analysis, Elsevier, vol. 122(C), pages 334-354.
    6. Patrick K. Kimes & Yufeng Liu & David Neil Hayes & James Stephen Marron, 2017. "Statistical significance for hierarchical clustering," Biometrics, The International Biometric Society, vol. 73(3), pages 811-821, September.
    7. Borysov, Petro & Hannig, Jan & Marron, J.S., 2014. "Asymptotics of hierarchical clustering for growing dimension," Journal of Multivariate Analysis, Elsevier, vol. 124(C), pages 465-479.
    8. Chung, Hee Cheol & Ahn, Jeongyoun, 2021. "Subspace rotations for high-dimensional outlier detection," Journal of Multivariate Analysis, Elsevier, vol. 183(C).
    9. Makoto Aoshima & Kazuyoshi Yata, 2014. "A distance-based, misclassification rate adjusted classifier for multiclass, high-dimensional data," Annals of the Institute of Statistical Mathematics, Springer; The Institute of Statistical Mathematics, vol. 66(5), pages 983-1010, October.
    10. Nakayama, Yugo & Yata, Kazuyoshi & Aoshima, Makoto, 2021. "Clustering by principal component analysis with Gaussian kernel in high-dimension, low-sample-size settings," Journal of Multivariate Analysis, Elsevier, vol. 185(C).
    11. Kazuyoshi Yata & Makoto Aoshima, 2020. "Geometric consistency of principal component scores for high‐dimensional mixture models and its application," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics; Finnish Statistical Society; Norwegian Statistical Association; Swedish Statistical Association, vol. 47(3), pages 899-921, September.
    12. Yata, Kazuyoshi & Aoshima, Makoto, 2012. "Effective PCA for high-dimension, low-sample-size data with noise reduction via geometric representations," Journal of Multivariate Analysis, Elsevier, vol. 105(1), pages 193-215.
    13. Zhao, Junguang & Xu, Xingzhong, 2016. "A generalized likelihood ratio test for normal mean when p is greater than n," Computational Statistics & Data Analysis, Elsevier, vol. 99(C), pages 91-104.
    14. Bolivar-Cime, A. & Marron, J.S., 2013. "Comparison of binary discrimination methods for high dimension low sample size data," Journal of Multivariate Analysis, Elsevier, vol. 115(C), pages 108-121.
    15. Yata, Kazuyoshi & Aoshima, Makoto, 2010. "Effective PCA for high-dimension, low-sample-size data with singular value decomposition of cross data matrix," Journal of Multivariate Analysis, Elsevier, vol. 101(9), pages 2060-2077, October.
    16. Yata, Kazuyoshi & Aoshima, Makoto, 2013. "Correlation tests for high-dimensional data using extended cross-data-matrix methodology," Journal of Multivariate Analysis, Elsevier, vol. 117(C), pages 313-331.
    17. Wang, Shao-Hsuan & Huang, Su-Yun, 2022. "Perturbation theory for cross data matrix-based PCA," Journal of Multivariate Analysis, Elsevier, vol. 190(C).
    18. Kazuyoshi Yata & Makoto Aoshima, 2012. "Inference on High-Dimensional Mean Vectors with Fewer Observations Than the Dimension," Methodology and Computing in Applied Probability, Springer, vol. 14(3), pages 459-476, September.
    19. Joongyeub Yeo & George Papanicolaou, 2016. "Random matrix approach to estimation of high-dimensional factor models," Papers 1611.05571, arXiv.org, revised Nov 2017.
    20. Benaych-Georges, Florent & Nadakuditi, Raj Rao, 2012. "The singular values and vectors of low rank perturbations of large rectangular random matrices," Journal of Multivariate Analysis, Elsevier, vol. 111(C), pages 120-135.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jmvana:v:107:y:2012:i:c:p:162-168. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.