
A sparse eigen-decomposition estimation in semiparametric regression

Author

Listed:
  • Zhu, Li-Ping
  • Yu, Zhou
  • Zhu, Li-Xing

Abstract

For semiparametric models, a key issue is to reduce the dimension of the predictors so that the regression functions can be estimated efficiently from low-dimensional projections of the original predictors. Many sufficient dimension reduction methods seek such principal projections by applying an eigen-decomposition to some method-specific candidate matrix. In this paper, we propose a sparse eigen-decomposition strategy that shrinks small sample eigenvalues to zero. Unlike existing methods, the new method simultaneously estimates the basis directions and the structural dimension of the central (mean) subspace in a data-driven manner. The oracle property of the estimation procedure is also established. Comprehensive simulations and a real data application illustrate the efficacy of the proposed method.
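
To make the abstract's strategy concrete, the sketch below illustrates the general recipe in Python: form a candidate matrix, eigen-decompose it, and shrink small sample eigenvalues to zero, so that the number of surviving eigenvalues estimates the structural dimension and the matching eigenvectors estimate the basis directions. It assumes a sliced-inverse-regression (SIR) candidate matrix and a fixed hard threshold; the function name sparse_sdr, the slice count, and the threshold value are illustrative choices, not the authors' penalized estimator, which selects the shrinkage level in a data-driven way.

    import numpy as np

    def sparse_sdr(X, y, n_slices=10, threshold=0.1):
        """Eigen-decompose a SIR candidate matrix, shrinking small sample
        eigenvalues to zero. Returns estimated basis directions (on the
        original predictor scale) and the structural dimension."""
        n, p = X.shape
        # Standardize predictors: Z = (X - mean) @ A with A' Sigma A = I.
        mu = X.mean(axis=0)
        Sigma = np.cov(X, rowvar=False)
        A = np.linalg.inv(np.linalg.cholesky(Sigma)).T
        Z = (X - mu) @ A
        # SIR candidate matrix M = sum_h p_h * m_h m_h', with m_h the mean
        # of Z within the h-th slice of the ordered response.
        order = np.argsort(y)
        M = np.zeros((p, p))
        for chunk in np.array_split(order, n_slices):
            m = Z[chunk].mean(axis=0)
            M += (len(chunk) / n) * np.outer(m, m)
        # Eigen-decompose; hard-threshold small eigenvalues to zero. The
        # count of surviving eigenvalues estimates the structural dimension.
        eigvals, eigvecs = np.linalg.eigh(M)            # ascending order
        eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
        d = int((eigvals > threshold).sum())
        beta = A @ eigvecs[:, :d]                       # back to X scale
        return beta, d

    # Toy check: a single-index model, so the true structural dimension is 1.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 6))
    y = (X @ np.array([1.0, 1.0, 0.0, 0.0, 0.0, 0.0])) ** 3
    y = y + rng.standard_normal(500)
    beta, d = sparse_sdr(X, y)
    print(d, beta.shape)  # expect d == 1 and beta of shape (6, 1)

In the paper the shrinkage level is chosen adaptively from the data, which is what yields the oracle property; the fixed threshold above is only a placeholder.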

Suggested Citation

  • Zhu, Li-Ping & Yu, Zhou & Zhu, Li-Xing, 2010. "A sparse eigen-decomposition estimation in semiparametric regression," Computational Statistics & Data Analysis, Elsevier, vol. 54(4), pages 976-986, April.
  • Handle: RePEc:eee:csdana:v:54:y:2010:i:4:p:976-986

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167-9473(09)00381-8
    Download Restriction: Full text for ScienceDirect subscribers only.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    2. Cook, R. Dennis & Ni, Liqiang, 2005. "Sufficient Dimension Reduction via Inverse Regression: A Minimum Discrepancy Approach," Journal of the American Statistical Association, American Statistical Association, vol. 100, pages 410-428, June.
    3. Yin, Xiangrong & Li, Bing & Cook, R. Dennis, 2008. "Successive direction extraction for estimating the central subspace in a multiple-index regression," Journal of Multivariate Analysis, Elsevier, vol. 99(8), pages 1733-1757, September.
    4. Zhu, Yu & Zeng, Peng, 2006. "Fourier Methods for Estimating the Central Subspace and the Central Mean Subspace in Regression," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1638-1651, December.
    5. Li, Bing & Wang, Shaoli, 2007. "On Directional Regression for Dimension Reduction," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 997-1008, September.
    6. Zhu, Li-Ping & Zhu, Li-Xing, 2007. "On kernel method for sliced average variance estimation," Journal of Multivariate Analysis, Elsevier, vol. 98(5), pages 970-991, May.
    7. Fan, Jianqing & Li, Runze, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    8. Wang, Hansheng & Li, Runze & Tsai, Chih-Ling, 2007. "Tuning parameter selectors for the smoothly clipped absolute deviation method," Biometrika, Biometrika Trust, vol. 94(3), pages 553-568.
    9. Zeng, Peng, 2008. "Determining the dimension of the central subspace and central mean subspace," Biometrika, Biometrika Trust, vol. 95(2), pages 469-479.
    10. Bura, Efstathia & Cook, R. Dennis, 2001. "Extending Sliced Inverse Regression: the Weighted Chi-Squared Test," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 996-1003, September.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Hilafu, Haileab & Yin, Xiangrong, 2013. "Sufficient dimension reduction in multivariate regressions with categorical predictors," Computational Statistics & Data Analysis, Elsevier, vol. 63(C), pages 139-147.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:54:y:2010:i:4:p:976-986. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Dana Niculescu. General contact details of the provider: http://www.elsevier.com/locate/csda

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.