A sparse eigen-decomposition estimation in semiparametric regression
Abstract: For semiparametric models, a key issue is reducing the dimension of the predictors so that the regression function can be efficiently estimated from low-dimensional projections of the original predictors. Many sufficient dimension reduction methods seek such principal projections by applying an eigen-decomposition to a method-specific candidate matrix. In this paper, we propose a sparse eigen-decomposition strategy that shrinks small sample eigenvalues to zero. Unlike existing methods, the new method simultaneously estimates the basis directions and the structural dimension of the central (mean) subspace in a data-driven manner. The oracle property of the estimation procedure is also established. Comprehensive simulations and a real data application illustrate the efficacy of the proposed method.
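The core idea in the abstract, shrinking small sample eigenvalues of a candidate matrix to zero so that the surviving eigenvalues determine the structural dimension and the matching eigenvectors span the estimated subspace, can be illustrated with a minimal sketch. This is not the paper's penalized estimator; the candidate matrix `M`, the hard threshold, and the function name are illustrative placeholders.

```python
import numpy as np

def sparse_eigen_basis(M, threshold):
    """Hard-threshold the eigenvalues of a symmetric candidate matrix M.

    Eigenvalues at or below `threshold` are shrunk to zero; the count of
    surviving eigenvalues is a data-driven estimate of the structural
    dimension d, and the matching eigenvectors span the estimated subspace.
    """
    vals, vecs = np.linalg.eigh(M)          # eigenvalues in ascending order
    vals, vecs = vals[::-1], vecs[:, ::-1]  # reorder to descending
    keep = vals > threshold                 # shrink small eigenvalues to zero
    d_hat = int(keep.sum())                 # estimated structural dimension
    return d_hat, vecs[:, :d_hat]           # basis of the estimated subspace

# Toy candidate matrix: a rank-2 signal plus a small ridge, so four of the
# six sample eigenvalues are tiny and should be shrunk to zero.
rng = np.random.default_rng(0)
B = rng.standard_normal((6, 2))
M = B @ B.T + 1e-3 * np.eye(6)
d_hat, basis = sparse_eigen_basis(M, threshold=0.1)
```

In an actual sufficient dimension reduction analysis, `M` would be the method-specific candidate matrix (e.g., from sliced inverse regression) and the shrinkage would be driven by a penalty rather than a fixed cutoff.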
Bibliographic Info: Article provided by Elsevier in its journal Computational Statistics & Data Analysis.
Volume (Year): 54 (2010)
Issue (Month): 4 (April)
Provider web page: http://www.elsevier.com/locate/csda
References:
- Li, Bing & Wang, Shaoli, 2007. "On Directional Regression for Dimension Reduction," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 997-1008, September.
- Peng Zeng, 2008. "Determining the dimension of the central subspace and central mean subspace," Biometrika, Biometrika Trust, vol. 95(2), pages 469-479.
- Zhu, Li-Ping & Zhu, Li-Xing, 2007. "On kernel method for sliced average variance estimation," Journal of Multivariate Analysis, Elsevier, vol. 98(5), pages 970-991, May.
- Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
- Zhu, Yu & Zeng, Peng, 2006. "Fourier Methods for Estimating the Central Subspace and the Central Mean Subspace in Regression," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1638-1651, December.
- Yin, Xiangrong & Li, Bing & Cook, R. Dennis, 2008. "Successive direction extraction for estimating the central subspace in a multiple-index regression," Journal of Multivariate Analysis, Elsevier, vol. 99(8), pages 1733-1757, September.
- Fan, Jianqing & Li, Runze, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
- Cook, R. Dennis & Ni, Liqiang, 2005. "Sufficient Dimension Reduction via Inverse Regression: A Minimum Discrepancy Approach," Journal of the American Statistical Association, American Statistical Association, vol. 100, pages 410-428, June.
- Hansheng Wang & Runze Li & Chih-Ling Tsai, 2007. "Tuning parameter selectors for the smoothly clipped absolute deviation method," Biometrika, Biometrika Trust, vol. 94(3), pages 553-568.