Regression coefficient and autoregressive order shrinkage and selection via the lasso
Abstract
The "least absolute shrinkage and selection operator" (lasso) has been widely used for regression shrinkage and selection. We extend its application to the regression model with autoregressive errors. Two types of lasso estimator are studied carefully. The first is similar to the traditional lasso estimator, with only two tuning parameters (one for the regression coefficients and the other for the autoregression coefficients). These tuning parameters can be calculated easily via a data-driven method, but the resulting lasso estimator may not be fully efficient. To overcome this limitation, we propose a second lasso estimator that uses a different tuning parameter for each coefficient. We show that this modified lasso can produce estimators that are as efficient as the "oracle". Moreover, we propose an algorithm for estimating the tuning parameters needed to obtain the modified lasso estimator. Simulation studies demonstrate that the modified estimator is superior to the traditional estimator. An empirical example is also presented to illustrate the usefulness of lasso estimators. Finally, the extension of the lasso to the autoregression with exogenous variables model is briefly discussed. Copyright 2007 Royal Statistical Society.
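The first estimator described in the abstract penalizes the regression coefficients and the autoregression coefficient with two separate tuning parameters. The sketch below illustrates that idea for the simple case of AR(1) errors, alternating between a soft-thresholded update of the autoregressive coefficient and coordinate descent on the quasi-differenced data. All function and parameter names here are illustrative; this is a minimal demonstration of the two-tuning-parameter structure, not the authors' algorithm or tuning-parameter selection method.

```python
import numpy as np

def soft_threshold(z, lam):
    """Soft-thresholding operator: sign(z) * max(|z| - lam, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def lasso_ar_sketch(y, X, lambda_beta, lambda_phi, n_iter=200):
    """Illustrative lasso for y_t = x_t' beta + e_t with AR(1) errors
    e_t = phi * e_{t-1} + eps_t.  lambda_beta penalizes the regression
    coefficients, lambda_phi the autoregressive coefficient (the two
    tuning parameters of the first estimator in the abstract)."""
    n, p = X.shape
    beta = np.zeros(p)
    phi = 0.0
    for _ in range(n_iter):
        # Residuals from the current regression fit.
        e = y - X @ beta
        # Penalized least-squares update of phi from lagged residuals.
        num = e[1:] @ e[:-1]
        den = e[:-1] @ e[:-1] + 1e-12
        phi = soft_threshold(num / den, lambda_phi / den)
        # Quasi-difference (whiten) the data with the current phi, then
        # run one sweep of coordinate descent on beta.
        y_t = y[1:] - phi * y[:-1]
        X_t = X[1:] - phi * X[:-1]
        for j in range(p):
            # Partial residual excluding predictor j.
            r_j = y_t - X_t @ beta + X_t[:, j] * beta[j]
            num_j = X_t[:, j] @ r_j
            den_j = X_t[:, j] @ X_t[:, j] + 1e-12
            beta[j] = soft_threshold(num_j / den_j, lambda_beta / den_j)
    return beta, phi
```

The second estimator in the abstract would replace the single `lambda_beta` with a separate tuning parameter per coefficient, which is what allows the oracle-type efficiency the paper establishes.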
Bibliographic Info
Article provided by the Royal Statistical Society in its journal Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Volume (Year): 69 (2007)
Issue (Month): 1
Contact details of provider:
Postal: 12 Errol Street, London EC1Y 8LX, United Kingdom
Web page: http://www.blackwellpublishing.com/journal.asp?ref=1369-7412
Citations
- Anders Bredahl Kock, 2012. "On the Oracle Property of the Adaptive Lasso in Stationary and Nonstationary Autoregressions," CREATES Research Papers 2012-05, School of Economics and Management, University of Aarhus.
- Pötscher, Benedikt M. & Schneider, Ulrike, 2007. "On the distribution of the adaptive LASSO estimator," MPRA Paper 6913, University Library of Munich, Germany.
- Hsu, Nan-Jung & Hung, Hung-Lin & Chang, Ya-Mei, 2008. "Subset selection for vector autoregressive processes using Lasso," Computational Statistics & Data Analysis, Elsevier, vol. 52(7), pages 3645-3657, March.
- Marcelo C. Medeiros & Eduardo F. Mendes, 2012. "Estimating High-Dimensional Time Series Models," CREATES Research Papers 2012-37, School of Economics and Management, University of Aarhus.
- Søren Johansen & Marco Riani & Anthony C. Atkinson, 2012. "The Selection of ARIMA Models with or without Regressors," CREATES Research Papers 2012-46, School of Economics and Management, University of Aarhus.
- Søren Johansen & Marco Riani & Anthony C. Atkinson, 2012. "The Selection of ARIMA Models with or without Regressors," Discussion Papers 12-17, University of Copenhagen. Department of Economics.
- Ding, Huijuan & Claeskens, Gerda, 2008. "Variable selection in partially linear wavelet models," Open Access publications from Katholieke Universiteit Leuven urn:hdl:123456789/209060, Katholieke Universiteit Leuven.
- Anders Bredahl Kock & Laurent A.F. Callot, 2012. "Oracle Efficient Estimation and Forecasting with the Adaptive LASSO and the Adaptive Group LASSO in Vector Autoregressions," CREATES Research Papers 2012-38, School of Economics and Management, University of Aarhus.
- Wang, Hansheng & Leng, Chenlei, 2008. "A note on adaptive group lasso," Computational Statistics & Data Analysis, Elsevier, vol. 52(12), pages 5277-5286, August.
- Leng, Chenlei & Li, Bo, 2010. "Least squares approximation with a diverging number of parameters," Statistics & Probability Letters, Elsevier, vol. 80(3-4), pages 254-261, February.
- Nardi, Y. & Rinaldo, A., 2011. "Autoregressive process modeling via the Lasso procedure," Journal of Multivariate Analysis, Elsevier, vol. 102(3), pages 528-549, March.
- Zheng, Shurong, 2008. "Selection of components and degrees of smoothing via lasso in high dimensional nonparametric additive models," Computational Statistics & Data Analysis, Elsevier, vol. 53(1), pages 164-175, September.
- Pötscher, Benedikt M., 2007. "Confidence Sets Based on Sparse Estimators Are Necessarily Large," MPRA Paper 5677, University Library of Munich, Germany.