Printed from https://ideas.repec.org/p/cte/wsrepe/21775.html

D-trace Precision Matrix Estimation Using Adaptive Lasso Penalties

Author

Listed:
  • Avagyan, Vahe
  • Alonso Fernández, Andrés Modesto
  • Nogales, Francisco J.

Abstract

An accurate estimate of the precision matrix plays a crucial role in the current age of high-dimensional data. A prominent and commonly used technique for this problem is l1-norm (Lasso) penalization of a given loss function, which guarantees sparsity of the precision matrix estimator for properly selected penalty parameters. However, the l1-norm penalty often fails to control the bias of the resulting estimator, because it shrinks large entries as strongly as small ones. In this paper, we introduce two adaptive extensions of the recently proposed l1-norm penalized D-trace loss minimization method, designed to reduce this bias. Extensive numerical results, on both simulated and real datasets, show the advantage of the proposed estimators.
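The approach described in the abstract can be illustrated with a minimal sketch. The code below is not the authors' algorithm: the function name `adaptive_dtrace`, the ridge-regularized initial estimator used to build the adaptive weights, and the plain proximal-gradient solver are all assumptions made for this illustration of the adaptive-Lasso-penalized D-trace loss.

```python
import numpy as np

def adaptive_dtrace(S, lam, gamma=1.0, step=0.1, n_iter=500):
    """Minimize the adaptive-Lasso-penalized D-trace loss
        L(Omega) = 0.5 * tr(Omega^2 S) - tr(Omega) + lam * sum_ij w_ij * |omega_ij|
    by proximal gradient descent (illustrative sketch only)."""
    p = S.shape[0]
    # Initial estimator: inverse of a ridge-regularized covariance, used
    # only to build the adaptive weights w_ij = 1 / |omega_ij_init|^gamma.
    init = np.linalg.inv(S + 0.1 * np.eye(p))
    w = 1.0 / (np.abs(init) ** gamma + 1e-10)
    np.fill_diagonal(w, 0.0)              # leave the diagonal unpenalized
    Omega = np.eye(p)
    for _ in range(n_iter):
        # Gradient of the smooth D-trace part: 0.5 * (Omega S + S Omega) - I
        grad = 0.5 * (Omega @ S + S @ Omega) - np.eye(p)
        Z = Omega - step * grad
        # Elementwise soft-thresholding with entry-specific thresholds
        Omega = np.sign(Z) * np.maximum(np.abs(Z) - step * lam * w, 0.0)
        Omega = 0.5 * (Omega + Omega.T)   # keep the iterate symmetric
    return Omega
```

The key mechanism is that the adaptive weight w_ij is large where the initial estimator is small, so entries the data already deem negligible are penalized heavily, while large entries receive little shrinkage; this is how the adaptive extension reduces the bias of the plain l1 penalty.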

Suggested Citation

  • Avagyan, Vahe & Alonso Fernández, Andrés Modesto & Nogales, Francisco J., 2015. "D-trace Precision Matrix Estimation Using Adaptive Lasso Penalties," DES - Working Papers. Statistics and Econometrics. WS 21775, Universidad Carlos III de Madrid. Departamento de Estadística.
  • Handle: RePEc:cte:wsrepe:21775

    Download full text from publisher

    File URL: https://e-archivo.uc3m.es/bitstream/handle/10016/21775/ws1520.pdf?sequence=1
    Download Restriction: no
    ---><---

    References listed on IDEAS

    1. Frahm, Gabriel & Memmel, Christoph, 2010. "Dominating estimators for minimum-variance portfolios," Journal of Econometrics, Elsevier, vol. 159(2), pages 289-302, December.
    2. Ledoit, Olivier & Wolf, Michael, 2004. "A well-conditioned estimator for large-dimensional covariance matrices," Journal of Multivariate Analysis, Elsevier, vol. 88(2), pages 365-411, February.
    3. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    5. Adam J. Rothman, 2012. "Positive definite estimators of large covariance matrices," Biometrika, Biometrika Trust, vol. 99(3), pages 733-740.
    6. Kourtis, Apostolos & Dotsis, George & Markellos, Raphael N., 2012. "Parameter uncertainty in portfolio selection: Shrinking the inverse covariance matrix," Journal of Banking & Finance, Elsevier, vol. 36(9), pages 2522-2531.
    7. Schäfer Juliane & Strimmer Korbinian, 2005. "A Shrinkage Approach to Large-Scale Covariance Matrix Estimation and Implications for Functional Genomics," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 4(1), pages 1-32, November.
    8. Maurya, Ashwini, 2014. "A joint convex penalty for inverse covariance matrix estimation," Computational Statistics & Data Analysis, Elsevier, vol. 75(C), pages 15-27.
    9. Warton, David I., 2008. "Penalized Normal Likelihood and Ridge Regularization of Correlation and Covariance Matrices," Journal of the American Statistical Association, American Statistical Association, vol. 103, pages 340-349, March.
    10. Banerjee, Sayantan & Ghosal, Subhashis, 2015. "Bayesian structure learning in graphical models," Journal of Multivariate Analysis, Elsevier, vol. 136(C), pages 147-162.
    11. Touloumis, Anestis, 2015. "Nonparametric Stein-type shrinkage covariance matrix estimators in high-dimensional settings," Computational Statistics & Data Analysis, Elsevier, vol. 83(C), pages 251-261.
    12. Yin, Jianxin & Li, Hongzhe, 2013. "Adjusting for high-dimensional covariates in sparse precision matrix estimation by ℓ1-penalization," Journal of Multivariate Analysis, Elsevier, vol. 116(C), pages 365-381.
    13. Teng Zhang & Hui Zou, 2014. "Sparse precision matrix estimation via lasso penalized D-trace loss," Biometrika, Biometrika Trust, vol. 101(1), pages 103-120.
    14. Rothman, Adam J. & Levina, Elizaveta & Zhu, Ji, 2009. "Generalized Thresholding of Large Covariance Matrices," Journal of the American Statistical Association, American Statistical Association, vol. 104(485), pages 177-186.
    15. Ming Yuan & Yi Lin, 2007. "Model selection and estimation in the Gaussian graphical model," Biometrika, Biometrika Trust, vol. 94(1), pages 19-35.
    16. Tri-Dzung Nguyen & Roy Welsch, 2010. "Outlier detection and robust covariance estimation using mathematical programming," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 4(4), pages 301-334, December.
    17. Lingzhou Xue & Shiqian Ma & Hui Zou, 2012. "Positive-Definite ℓ 1 -Penalized Estimation of Large Covariance Matrices," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(500), pages 1480-1491, December.
    18. Cai, Tony & Liu, Weidong & Luo, Xi, 2011. "A Constrained ℓ1 Minimization Approach to Sparse Precision Matrix Estimation," Journal of the American Statistical Association, American Statistical Association, vol. 106(494), pages 594-607.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Vahe Avagyan & Andrés M. Alonso & Francisco J. Nogales, 2018. "D-trace estimation of a precision matrix using adaptive Lasso penalties," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 12(2), pages 425-447, June.
    2. Bailey, Natalia & Pesaran, M. Hashem & Smith, L. Vanessa, 2019. "A multiple testing approach to the regularisation of large sample correlation matrices," Journal of Econometrics, Elsevier, vol. 208(2), pages 507-534.
    3. Avagyan, Vahe & Alonso Fernández, Andrés Modesto & Nogales, Francisco J., 2014. "Improving the graphical lasso estimation for the precision matrix through roots of the sample covariance matrix," DES - Working Papers. Statistics and Econometrics. WS ws141208, Universidad Carlos III de Madrid. Departamento de Estadística.
    4. Avagyan, Vahe, 2016. "D-Trace precision matrix estimator with eigenvalue control," DES - Working Papers. Statistics and Econometrics. WS 23410, Universidad Carlos III de Madrid. Departamento de Estadística.
    5. Lam, Clifford, 2020. "High-dimensional covariance matrix estimation," LSE Research Online Documents on Economics 101667, London School of Economics and Political Science, LSE Library.
    6. Ziqi Chen & Chenlei Leng, 2016. "Dynamic Covariance Models," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(515), pages 1196-1207, July.
    7. Yan Zhang & Jiyuan Tao & Zhixiang Yin & Guoqiang Wang, 2022. "Improved Large Covariance Matrix Estimation Based on Efficient Convex Combination and Its Application in Portfolio Optimization," Mathematics, MDPI, vol. 10(22), pages 1-15, November.
    8. Arnab Chakrabarti & Rituparna Sen, 2018. "Some Statistical Problems with High Dimensional Financial data," Papers 1808.02953, arXiv.org.
    9. Cui, Ying & Leng, Chenlei & Sun, Defeng, 2016. "Sparse estimation of high-dimensional correlation matrices," Computational Statistics & Data Analysis, Elsevier, vol. 93(C), pages 390-403.
    10. Yang, Yihe & Dai, Hongsheng & Pan, Jianxin, 2023. "Block-diagonal precision matrix regularization for ultra-high dimensional data," Computational Statistics & Data Analysis, Elsevier, vol. 179(C).
    11. Choi, Young-Geun & Lim, Johan & Roy, Anindya & Park, Junyong, 2019. "Fixed support positive-definite modification of covariance matrix estimators via linear shrinkage," Journal of Multivariate Analysis, Elsevier, vol. 171(C), pages 234-249.
    12. Jianqing Fan & Yuan Liao & Han Liu, 2016. "An overview of the estimation of large covariance and precision matrices," Econometrics Journal, Royal Economic Society, vol. 19(1), pages 1-32, February.
    13. Gautam Sabnis & Debdeep Pati & Anirban Bhattacharya, 2019. "Compressed Covariance Estimation with Automated Dimension Learning," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 81(2), pages 466-481, December.
    14. Huang, Na & Fryzlewicz, Piotr, 2018. "NOVELIST estimator of large correlation and covariance matrices and their inverses," LSE Research Online Documents on Economics 89055, London School of Economics and Political Science, LSE Library.
    15. Lee, Kyoungjae & Jo, Seongil & Lee, Jaeyong, 2022. "The beta-mixture shrinkage prior for sparse covariances with near-minimax posterior convergence rate," Journal of Multivariate Analysis, Elsevier, vol. 192(C).
    16. Chen, Xin & Yang, Dan & Xu, Yan & Xia, Yin & Wang, Dong & Shen, Haipeng, 2023. "Testing and support recovery of correlation structures for matrix-valued observations with an application to stock market data," Journal of Econometrics, Elsevier, vol. 232(2), pages 544-564.
    17. Azam Kheyri & Andriette Bekker & Mohammad Arashi, 2022. "High-Dimensional Precision Matrix Estimation through GSOS with Application in the Foreign Exchange Market," Mathematics, MDPI, vol. 10(22), pages 1-19, November.
    18. Ruili Sun & Tiefeng Ma & Shuangzhe Liu & Milind Sathye, 2019. "Improved Covariance Matrix Estimation for Portfolio Risk Measurement: A Review," JRFM, MDPI, vol. 12(1), pages 1-34, March.
    19. Chen, Shuo & Kang, Jian & Xing, Yishi & Zhao, Yunpeng & Milton, Donald K., 2018. "Estimating large covariance matrix with network topology for high-dimensional biomedical data," Computational Statistics & Data Analysis, Elsevier, vol. 127(C), pages 82-95.
    20. Pan, Yuqing & Mai, Qing, 2020. "Efficient computation for differential network analysis with applications to quadratic discriminant analysis," Computational Statistics & Data Analysis, Elsevier, vol. 144(C).

    More about this item

    Keywords

    Gaussian Graphical Model;

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:cte:wsrepe:21775. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Ana Poveda (email available below). General contact details of provider: http://portal.uc3m.es/portal/page/portal/dpto_estadistica .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.