Printed from https://ideas.repec.org/a/spr/advdac/v12y2018i2d10.1007_s11634-016-0272-8.html

D-trace estimation of a precision matrix using adaptive Lasso penalties

Author

Listed:
  • Vahe Avagyan

    (Ghent University
    Universidad Carlos III de Madrid)

  • Andrés M. Alonso

    (Universidad Carlos III de Madrid)

  • Francisco J. Nogales

    (Universidad Carlos III de Madrid)

Abstract

The accurate estimation of a precision matrix plays a crucial role in the current age of high-dimensional data explosion. To deal with this problem, one of the prominent and commonly used techniques is $$\ell _1$$ norm (Lasso) penalization of a given loss function. This approach guarantees the sparsity of the precision matrix estimate for properly selected penalty parameters. However, the $$\ell _1$$ norm penalization often fails to control the bias of the obtained estimator because of its overestimation behavior. In this paper, we introduce two adaptive extensions of the recently proposed $$\ell _1$$ norm penalized D-trace loss minimization method, aimed at reducing the bias of the estimator. Extensive numerical results, using both simulated and real datasets, show the advantage of our proposed estimators.
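The approach described in the abstract, the lasso penalized D-trace loss of Zhang and Zou (2014) with adaptive weights in the spirit of Zou (2006), can be sketched with a simple proximal-gradient solver. This is an illustrative sketch only, not the authors' algorithm: the ridge-based initial estimate, the step size, and the exact weight construction are assumptions made for the example.

```python
import numpy as np

def dtrace_loss(Theta, S):
    # D-trace loss: L(Theta, S) = 0.5 * tr(Theta @ S @ Theta) - tr(Theta)
    return 0.5 * np.trace(Theta @ S @ Theta) - np.trace(Theta)

def adaptive_dtrace(S, lam=0.1, gamma=1.0, step=0.1, iters=500):
    """Proximal-gradient sketch of adaptive-lasso-penalized D-trace estimation."""
    p = S.shape[0]
    # Initial estimate for building the adaptive weights; here simply the
    # inverse of a ridge-regularized sample covariance (an assumed choice).
    Theta0 = np.linalg.inv(S + 0.5 * np.eye(p))
    W = 1.0 / (np.abs(Theta0) ** gamma + 1e-8)  # large entries get small penalty
    np.fill_diagonal(W, 0.0)                    # leave the diagonal unpenalized
    Theta = np.eye(p)
    for _ in range(iters):
        # Gradient of the D-trace loss: 0.5 * (S @ Theta + Theta @ S) - I
        grad = 0.5 * (S @ Theta + Theta @ S) - np.eye(p)
        Z = Theta - step * grad
        # Proximal step: entrywise soft-thresholding with adaptive weights
        Theta = np.sign(Z) * np.maximum(np.abs(Z) - step * lam * W, 0.0)
        Theta = 0.5 * (Theta + Theta.T)         # keep the iterate symmetric
    return Theta
```

The adaptive weights are what reduce the bias the abstract refers to: entries that a pilot estimate deems large are shrunk less than under a flat $$\ell _1$$ penalty, while near-zero entries are thresholded more aggressively.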

Suggested Citation

  • Vahe Avagyan & Andrés M. Alonso & Francisco J. Nogales, 2018. "D-trace estimation of a precision matrix using adaptive Lasso penalties," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 12(2), pages 425-447, June.
  • Handle: RePEc:spr:advdac:v:12:y:2018:i:2:d:10.1007_s11634-016-0272-8
    DOI: 10.1007/s11634-016-0272-8

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11634-016-0272-8
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11634-016-0272-8?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ledoit, Olivier & Wolf, Michael, 2004. "A well-conditioned estimator for large-dimensional covariance matrices," Journal of Multivariate Analysis, Elsevier, vol. 88(2), pages 365-411, February.
    2. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    3. repec:hal:journl:peer-00741629 is not listed on IDEAS
    4. Adam J. Rothman, 2012. "Positive definite estimators of large covariance matrices," Biometrika, Biometrika Trust, vol. 99(3), pages 733-740.
    5. Kourtis, Apostolos & Dotsis, George & Markellos, Raphael N., 2012. "Parameter uncertainty in portfolio selection: Shrinking the inverse covariance matrix," Journal of Banking & Finance, Elsevier, vol. 36(9), pages 2522-2531.
    6. Maurya, Ashwini, 2014. "A joint convex penalty for inverse covariance matrix estimation," Computational Statistics & Data Analysis, Elsevier, vol. 75(C), pages 15-27.
    7. Warton, David I., 2008. "Penalized Normal Likelihood and Ridge Regularization of Correlation and Covariance Matrices," Journal of the American Statistical Association, American Statistical Association, vol. 103, pages 340-349, March.
    8. Banerjee, Sayantan & Ghosal, Subhashis, 2015. "Bayesian structure learning in graphical models," Journal of Multivariate Analysis, Elsevier, vol. 136(C), pages 147-162.
    9. Cui, Ying & Leng, Chenlei & Sun, Defeng, 2016. "Sparse estimation of high-dimensional correlation matrices," Computational Statistics & Data Analysis, Elsevier, vol. 93(C), pages 390-403.
    10. Ming Yuan & Yi Lin, 2007. "Model selection and estimation in the Gaussian graphical model," Biometrika, Biometrika Trust, vol. 94(1), pages 19-35.
    11. Fan J. & Li R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    12. Frahm, Gabriel & Memmel, Christoph, 2010. "Dominating estimators for minimum-variance portfolios," Journal of Econometrics, Elsevier, vol. 159(2), pages 289-302, December.
    13. Lingzhou Xue & Shiqian Ma & Hui Zou, 2012. "Positive-Definite ℓ1-Penalized Estimation of Large Covariance Matrices," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(500), pages 1480-1491, December.
    14. Schäfer Juliane & Strimmer Korbinian, 2005. "A Shrinkage Approach to Large-Scale Covariance Matrix Estimation and Implications for Functional Genomics," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 4(1), pages 1-32, November.
    15. Lam, Clifford & Fan, Jianqing, 2009. "Sparsistency and rates of convergence in large covariance matrix estimation," LSE Research Online Documents on Economics 31540, London School of Economics and Political Science, LSE Library.
    16. Touloumis, Anestis, 2015. "Nonparametric Stein-type shrinkage covariance matrix estimators in high-dimensional settings," Computational Statistics & Data Analysis, Elsevier, vol. 83(C), pages 251-261.
    17. Yin, Jianxin & Li, Hongzhe, 2013. "Adjusting for high-dimensional covariates in sparse precision matrix estimation by ℓ1-penalization," Journal of Multivariate Analysis, Elsevier, vol. 116(C), pages 365-381.
    18. Teng Zhang & Hui Zou, 2014. "Sparse precision matrix estimation via lasso penalized D-trace loss," Biometrika, Biometrika Trust, vol. 101(1), pages 103-120.
    19. Rothman, Adam J. & Levina, Elizaveta & Zhu, Ji, 2009. "Generalized Thresholding of Large Covariance Matrices," Journal of the American Statistical Association, American Statistical Association, vol. 104(485), pages 177-186.
    20. Goto, Shingo & Xu, Yan, 2015. "Improving Mean Variance Optimization through Sparse Hedging Restrictions," Journal of Financial and Quantitative Analysis, Cambridge University Press, vol. 50(6), pages 1415-1441, December.
    21. Tri-Dzung Nguyen & Roy Welsch, 2010. "Outlier detection and robust covariance estimation using mathematical programming," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 4(4), pages 301-334, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Vahe Avagyan, 2022. "Precision matrix estimation using penalized Generalized Sylvester matrix equation," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 31(4), pages 950-967, December.
    2. Fang, Qian & Yu, Chen & Weiping, Zhang, 2020. "Regularized estimation of precision matrix for high-dimensional multivariate longitudinal data," Journal of Multivariate Analysis, Elsevier, vol. 176(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Avagyan, Vahe & Alonso Fernández, Andrés Modesto & Nogales, Francisco J., 2015. "D-trace Precision Matrix Estimation Using Adaptive Lasso Penalties," DES - Working Papers. Statistics and Econometrics. WS 21775, Universidad Carlos III de Madrid. Departamento de Estadística.
    2. Bailey, Natalia & Pesaran, M. Hashem & Smith, L. Vanessa, 2019. "A multiple testing approach to the regularisation of large sample correlation matrices," Journal of Econometrics, Elsevier, vol. 208(2), pages 507-534.
    3. Lam, Clifford, 2020. "High-dimensional covariance matrix estimation," LSE Research Online Documents on Economics 101667, London School of Economics and Political Science, LSE Library.
    4. Ziqi Chen & Chenlei Leng, 2016. "Dynamic Covariance Models," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(515), pages 1196-1207, July.
    5. Avagyan, Vahe & Alonso Fernández, Andrés Modesto & Nogales, Francisco J., 2014. "Improving the graphical lasso estimation for the precision matrix through roots of the sample covariance matrix," DES - Working Papers. Statistics and Econometrics. WS ws141208, Universidad Carlos III de Madrid. Departamento de Estadística.
    6. Avagyan, Vahe, 2016. "D-Trace precision matrix estimator with eigenvalue control," DES - Working Papers. Statistics and Econometrics. WS 23410, Universidad Carlos III de Madrid. Departamento de Estadística.
    7. Cui, Ying & Leng, Chenlei & Sun, Defeng, 2016. "Sparse estimation of high-dimensional correlation matrices," Computational Statistics & Data Analysis, Elsevier, vol. 93(C), pages 390-403.
    8. Yang, Yihe & Zhou, Jie & Pan, Jianxin, 2021. "Estimation and optimal structure selection of high-dimensional Toeplitz covariance matrix," Journal of Multivariate Analysis, Elsevier, vol. 184(C).
    9. Shaoxin Wang & Hu Yang & Chaoli Yao, 2019. "On the penalized maximum likelihood estimation of high-dimensional approximate factor model," Computational Statistics, Springer, vol. 34(2), pages 819-846, June.
    10. Jianqing Fan & Yuan Liao & Han Liu, 2016. "An overview of the estimation of large covariance and precision matrices," Econometrics Journal, Royal Economic Society, vol. 19(1), pages 1-32, February.
    11. Benjamin Poignard & Manabu Asai, 2023. "Estimation of high-dimensional vector autoregression via sparse precision matrix," The Econometrics Journal, Royal Economic Society, vol. 26(2), pages 307-326.
    12. Yan Zhang & Jiyuan Tao & Zhixiang Yin & Guoqiang Wang, 2022. "Improved Large Covariance Matrix Estimation Based on Efficient Convex Combination and Its Application in Portfolio Optimization," Mathematics, MDPI, vol. 10(22), pages 1-15, November.
    13. Bai, Jushan & Liao, Yuan, 2016. "Efficient estimation of approximate factor models via penalized maximum likelihood," Journal of Econometrics, Elsevier, vol. 191(1), pages 1-18.
    14. Gautam Sabnis & Debdeep Pati & Anirban Bhattacharya, 2019. "Compressed Covariance Estimation with Automated Dimension Learning," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 81(2), pages 466-481, December.
    15. Arnab Chakrabarti & Rituparna Sen, 2018. "Some Statistical Problems with High Dimensional Financial data," Papers 1808.02953, arXiv.org.
    16. Lee, Kyoungjae & Jo, Seongil & Lee, Jaeyong, 2022. "The beta-mixture shrinkage prior for sparse covariances with near-minimax posterior convergence rate," Journal of Multivariate Analysis, Elsevier, vol. 192(C).
    17. Chen, Xin & Yang, Dan & Xu, Yan & Xia, Yin & Wang, Dong & Shen, Haipeng, 2023. "Testing and support recovery of correlation structures for matrix-valued observations with an application to stock market data," Journal of Econometrics, Elsevier, vol. 232(2), pages 544-564.
    18. Chen, Shuo & Kang, Jian & Xing, Yishi & Zhao, Yunpeng & Milton, Donald K., 2018. "Estimating large covariance matrix with network topology for high-dimensional biomedical data," Computational Statistics & Data Analysis, Elsevier, vol. 127(C), pages 82-95.
    19. Ding, Wenliang & Shu, Lianjie & Gu, Xinhua, 2023. "A robust Glasso approach to portfolio selection in high dimensions," Journal of Empirical Finance, Elsevier, vol. 70(C), pages 22-37.
    20. Ikeda, Yuki & Kubokawa, Tatsuya & Srivastava, Muni S., 2016. "Comparison of linear shrinkage estimators of a large covariance matrix in normal and non-normal distributions," Computational Statistics & Data Analysis, Elsevier, vol. 95(C), pages 95-108.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:advdac:v:12:y:2018:i:2:d:10.1007_s11634-016-0272-8. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.