Covariance matrix selection and estimation via penalised normal likelihood
Abstract
We propose a nonparametric method for identifying parsimony and for producing a statistically efficient estimator of a large covariance matrix. We reparameterise a covariance matrix through the modified Cholesky decomposition of its inverse or the one-step-ahead predictive representation of the vector of responses and reduce the nonintuitive task of modelling covariance matrices to the familiar task of model selection and estimation for a sequence of regression models. The Cholesky factor containing these regression coefficients is likely to have many off-diagonal elements that are zero or close to zero. Penalised normal likelihoods in this situation with L-sub-1 and L-sub-2 penalties are shown to be closely related to Tibshirani's (1996) LASSO approach and to ridge regression. Adding either penalty to the likelihood helps to produce more stable estimators by introducing shrinkage to the elements in the Cholesky factor, while, because of its singularity, the L-sub-1 penalty will set some elements to zero and produce interpretable models. An algorithm is developed for computing the estimator and selecting the tuning parameter. The proposed maximum penalised likelihood estimator is illustrated using simulation and a real dataset involving estimation of a 102 × 102 covariance matrix. Copyright 2006, Oxford University Press.
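The reduction described in the abstract can be sketched as follows: each variable is regressed on its predecessors, the negatives of the fitted coefficients fill the unit lower-triangular Cholesky factor T of the inverse covariance, and the residual variances form the diagonal matrix D, so that Sigma^{-1} = T' D^{-1} T. A minimal illustration using the L-sub-2 (ridge) penalty, which has a closed-form solution, is given below; this is a sketch of the general idea, not the authors' algorithm, which also covers the L-sub-1 penalty and data-driven tuning-parameter selection. The function name and default penalty value are illustrative choices.

```python
import numpy as np

def ridge_cholesky_cov(X, lam=0.1):
    """Sketch: covariance estimation via L2-penalised regressions
    in the modified Cholesky decomposition of the inverse covariance.
    X   : (n, p) data matrix, assumed centred (rows = observations).
    lam : illustrative ridge tuning parameter (not data-driven here).
    """
    n, p = X.shape
    T = np.eye(p)                  # unit lower-triangular Cholesky factor
    d = np.empty(p)                # innovation variances, D = diag(d)
    d[0] = X[:, 0].var()
    for t in range(1, p):
        Z, y = X[:, :t], X[:, t]
        # Ridge regression of variable t on its predecessors:
        # beta = (Z'Z + lam * I)^{-1} Z'y
        beta = np.linalg.solve(Z.T @ Z + lam * np.eye(t), Z.T @ y)
        T[t, :t] = -beta           # negatives of regression coefficients
        d[t] = (y - Z @ beta).var()
    # Sigma^{-1} = T' D^{-1} T  =>  Sigma = T^{-1} D (T^{-1})'
    Tinv = np.linalg.inv(T)
    return Tinv @ np.diag(d) @ Tinv.T
```

Replacing the closed-form ridge step with an L-sub-1-penalised (LASSO) fit would instead zero out some entries of T, yielding the sparse, interpretable factor the abstract describes; that variant requires an iterative solver rather than a single linear solve.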
Bibliographic Info
Article provided by Biometrika Trust in its journal Biometrika.
Volume (Year): 93 (2006)
Issue (Month): 1 (March)
Contact details of provider:
Postal: Oxford University Press, Great Clarendon Street, Oxford OX2 6DP, UK
Fax: 01865 267 985
Web page: http://biomet.oxfordjournals.org/
Citations of this item, as tracked by the CitEc Project:
- Fan, Jianqing & Fan, Yingying & Lv, Jinchi, 2008. "High dimensional covariance matrix estimation using a factor model," Journal of Econometrics, Elsevier, vol. 147(1), pages 186-197, November.
- Verzelen, N. & Villers, F., 2009. "Tests for Gaussian graphical models," Computational Statistics & Data Analysis, Elsevier, vol. 53(5), pages 1894-1905, March.
- Xue, Lingzhou & Zou, Hui, 2013. "Minimax optimal estimation of general bandable covariance matrices," Journal of Multivariate Analysis, Elsevier, vol. 116(C), pages 45-51.
- Xi Luo, 2011. "Recovering Model Structures from Large Low Rank and Sparse Covariance Matrix Estimation," Papers 1111.1133, arXiv.org, revised Mar 2013.
- Pesaran, M. Hashem & Yamagata, Takashi, 2012. "Testing CAPM with a Large Number of Assets," IZA Discussion Papers 6469, Institute for the Study of Labor (IZA).
- Song Liu & Yuhong Yang, 2012. "Combining models in longitudinal data analysis," Annals of the Institute of Statistical Mathematics, Springer, vol. 64(2), pages 233-254, April.
- Chen, Songxi, 2012. "Two Sample Tests for High Dimensional Covariance Matrices," MPRA Paper 46026, University Library of Munich, Germany.
- Pesaran, M. H. & Yamagata, T., 2012. "Testing CAPM with a Large Number of Assets (Updated 28th March 2012)," Cambridge Working Papers in Economics 1210, Faculty of Economics, University of Cambridge.
- Fisher, Thomas J. & Sun, Xiaoqian, 2011. "Improved Stein-type shrinkage estimators for the high-dimensional multivariate normal covariance matrix," Computational Statistics & Data Analysis, Elsevier, vol. 55(5), pages 1909-1918, May.