Oracle inequalities for high-dimensional panel data models
This paper is concerned with high-dimensional panel data models in which the number of regressors can be much larger than the sample size. Under the assumption that the true parameter vector is sparse, we establish finite-sample upper bounds on the estimation error of the Lasso under two different sets of conditions on the covariates and the error terms. Upper bounds on the estimation error of the unobserved heterogeneity are also provided under the assumption of sparsity. Next, we show that our upper bounds are essentially optimal in the sense that they can only be improved by multiplicative constants. These results are then used to show that the Lasso can be consistent even in very large models where the number of regressors increases at an exponential rate in the sample size. Conditions under which the Lasso asymptotically discards no relevant variables are also provided. In the second part of the paper we give lower bounds on the probability with which the adaptive Lasso selects the correct sparsity pattern in finite samples. These results are then used to give conditions under which the adaptive Lasso can detect the correct sparsity pattern asymptotically. We illustrate our finite-sample results by simulations and apply the methods to search for covariates explaining growth in the G8 countries.
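The estimation strategy the abstract describes can be illustrated with a minimal sketch: a sparse panel data model with unobserved individual heterogeneity, estimated by the Lasso and then by the adaptive Lasso (Lasso with data-driven weights from an initial fit). This is not the authors' implementation; the within transformation used to remove the fixed effects, the tuning parameter `alpha=0.1`, and all dimension choices are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulated panel: N individuals observed over T periods, p regressors.
# The paper's regime allows p to exceed N*T; here p = N*T so the example runs fast.
N, T, p, s = 20, 10, 200, 5      # s = number of non-zero coefficients (sparsity)
n = N * T

beta = np.zeros(p)
beta[:s] = rng.uniform(1.0, 2.0, s)   # sparse true parameter vector
fixed_effects = rng.normal(0.0, 1.0, N)

X = rng.normal(size=(n, p))
y = X @ beta + np.repeat(fixed_effects, T) + rng.normal(0.0, 0.5, n)

# Within transformation: demean each individual's block to sweep out the
# unobserved heterogeneity before penalized estimation (an assumed choice here).
for g in range(N):
    idx = slice(g * T, (g + 1) * T)
    X[idx] -= X[idx].mean(axis=0)
    y[idx] -= y[idx].mean()

# Step 1: plain Lasso as the initial estimator.
init = Lasso(alpha=0.1).fit(X, y)

# Step 2: adaptive Lasso, i.e. a weighted L1 penalty with weights
# w_j = 1 / |initial estimate_j|, implemented by rescaling the columns of X.
w = 1.0 / (np.abs(init.coef_) + 1e-6)
ada = Lasso(alpha=0.1).fit(X / w, y)
ada_coef = ada.coef_ / w

selected = np.flatnonzero(np.abs(ada_coef) > 1e-8)
print("true support:", list(range(s)))
print("adaptive Lasso support:", selected.tolist())
```

With strong signals, the adaptive Lasso's selected support typically contains all relevant variables while discarding most irrelevant ones, mirroring the variable-screening and sparsity-pattern-selection properties the abstract analyzes.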