
A nonlinear multi-dimensional variable selection method for high dimensional data: Sparse MAVE

Author

Listed:
  • Wang, Qin
  • Yin, Xiangrong

Abstract

Traditional variable selection methods are model based and may suffer from possible model misspecification. On the other hand, sufficient dimension reduction provides us with a way to find sufficient dimensions without a parametric model. However, the drawback is that each reduced variable is a linear combination of all the original variables, which may be difficult to interpret. In this paper, focusing on the sufficient dimensions in the regression mean function, we combine the ideas of sufficient dimension reduction and variable selection to propose a shrinkage estimation method, sparse MAVE. The sparse MAVE can exhaustively estimate dimensions in the mean function, while selecting informative covariates simultaneously without assuming any particular model or particular distribution on the predictor variables. Furthermore, we propose a modified BIC criterion for effectively estimating the dimension of the mean function. The efficacy of sparse MAVE is verified through simulation studies and via analysis of a real data set.
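
For intuition only, below is a minimal sketch of the sparse-MAVE idea in the single-index (d = 1) case: MAVE-style alternating local-linear fits are kept, but the update of the index direction is replaced by an L1-penalised (lasso) least-squares step so that coefficients of uninformative predictors are shrunk exactly to zero. The function name sparse_mave_1d, the Gaussian kernel, the rule-of-thumb bandwidth, the tuning value lam, and the use of scikit-learn's Lasso are assumptions made for this sketch, not the authors' implementation; the modified BIC step for estimating the dimension of the mean function is omitted.

```python
# Illustrative sketch only: a penalised MAVE-type iteration for a single index.
# Names and tuning values here are assumptions, not the paper's algorithm.
import numpy as np
from sklearn.linear_model import Lasso


def sparse_mave_1d(X, y, lam=0.05, bandwidth=None, n_iter=10):
    """Alternate between local-linear fits along the current index and an
    L1-penalised update of the index direction, so that coefficients of
    uninformative predictors are shrunk exactly to zero."""
    n, p = X.shape
    # Crude initial direction from an ordinary least-squares fit.
    beta = np.linalg.lstsq(X - X.mean(0), y - y.mean(), rcond=None)[0]
    beta /= np.linalg.norm(beta)
    if bandwidth is None:
        bandwidth = 1.5 * n ** (-1 / 5)  # rough rule-of-thumb scale (assumption)

    for _ in range(n_iter):
        # Gaussian kernel weights based on the current index X @ beta
        # (the "refined" weighting step of MAVE-type algorithms).
        u = X @ beta
        D = (u[:, None] - u[None, :]) / bandwidth
        W = np.exp(-0.5 * D ** 2)
        W /= W.sum(axis=0, keepdims=True)  # weights around each anchor sum to 1

        # Local-linear intercept a_j and slope b_j around each anchor point x_j.
        a, b = np.empty(n), np.empty(n)
        for j in range(n):
            Z = np.column_stack([np.ones(n), u - u[j]])
            WZ = Z * W[:, j][:, None]
            a[j], b[j] = np.linalg.solve(Z.T @ WZ + 1e-8 * np.eye(2), WZ.T @ y)

        # Update the direction by a weighted lasso: rows b_j * (x_i - x_j),
        # responses y_i - a_j, each scaled by sqrt(n * w_ij) so that sklearn's
        # 1/(2*n_samples) loss matches the average weighted local loss per anchor.
        rows, resp = [], []
        for j in range(n):
            s = np.sqrt(n * W[:, j])
            rows.append(s[:, None] * (b[j] * (X - X[j])))
            resp.append(s * (y - a[j]))
        fit = Lasso(alpha=lam, fit_intercept=False, max_iter=5000)
        fit.fit(np.vstack(rows), np.concatenate(resp))
        if np.any(fit.coef_):  # keep the previous direction if all coefficients vanish
            beta = fit.coef_ / np.linalg.norm(fit.coef_)

    return beta  # zero entries flag covariates excluded from the mean function


# Toy check: y depends on x1 - x2 only, so the last six coefficients should be zero.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = np.sin(X[:, 0] - X[:, 1]) + 0.1 * rng.normal(size=200)
print(np.round(sparse_mave_1d(X, y), 3))
```

In practice the penalty level and bandwidth would be chosen in a data-driven way, the direction matrix would have d columns rather than one, and the dimension of the mean function would be selected with the paper's modified BIC criterion rather than fixed in advance.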

Suggested Citation

  • Wang, Qin & Yin, Xiangrong, 2008. "A nonlinear multi-dimensional variable selection method for high dimensional data: Sparse MAVE," Computational Statistics & Data Analysis, Elsevier, vol. 52(9), pages 4512-4520, May.
  • Handle: RePEc:eee:csdana:v:52:y:2008:i:9:p:4512-4520

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167-9473(08)00159-X
    Download Restriction: Full text for ScienceDirect subscribers only.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Liqiang Ni & R. Dennis Cook & Chih-Ling Tsai, 2005. "A note on shrinkage sliced inverse regression," Biometrika, Biometrika Trust, vol. 92(1), pages 242-247, March.
    2. Yingcun Xia & Howell Tong & W. K. Li & Li‐Xing Zhu, 2002. "An adaptive estimation of dimension reduction space," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 64(3), pages 363-410, August.
    3. Zhu, Lixing & Miao, Baiqi & Peng, Heng, 2006. "On Sliced Inverse Regression With High-Dimensional Covariates," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 630-643, June.
    4. Lexin Li & R. Dennis Cook & Christopher J. Nachtsheim, 2005. "Model‐free variable selection," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(2), pages 285-299, April.
    5. D. R. Cox & E. J. Snell, 1974. "The Choice of Variables in Observational Studies," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 23(1), pages 51-59, March.
    6. Lexin Li, 2007. "Sparse sufficient dimension reduction," Biometrika, Biometrika Trust, vol. 94(3), pages 603-613.
    7. Ye Z. & Weiss R.E., 2003. "Using the Bootstrap to Select One of a New Class of Dimension Reduction Methods," Journal of the American Statistical Association, American Statistical Association, vol. 98, pages 968-979, January.
    8. Hui Zou & Trevor Hastie, 2005. "Addendum: Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(5), pages 768-768, November.
    9. Prasad A. Naik & Chih-Ling Tsai, 2005. "Constrained Inverse Regression for Incorporating Prior Information," Journal of the American Statistical Association, American Statistical Association, vol. 100, pages 204-211, March.
    10. Hui Zou & Trevor Hastie, 2005. "Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(2), pages 301-320, April.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Radchenko, Peter, 2015. "High dimensional single index models," Journal of Multivariate Analysis, Elsevier, vol. 139(C), pages 266-282.
    2. Yunquan Song & Zitong Li & Minglu Fang, 2022. "Robust Variable Selection Based on Penalized Composite Quantile Regression for High-Dimensional Single-Index Models," Mathematics, MDPI, vol. 10(12), pages 1-17, June.
    3. Pircalabelu, Eugen & Artemiou, Andreas, 2021. "Graph informed sliced inverse regression," Computational Statistics & Data Analysis, Elsevier, vol. 164(C).
    4. Paris, Quentin, 2014. "Minimax adaptive dimension reduction for regression," Journal of Multivariate Analysis, Elsevier, vol. 128(C), pages 186-202.
    5. Zhang, Hong-Fan, 2021. "Minimum Average Variance Estimation with group Lasso for the multivariate response Central Mean Subspace," Journal of Multivariate Analysis, Elsevier, vol. 184(C).
    6. Zongwu Cai & Ying Fang & Ming Lin & Zixuan Wu, 2023. "A Quasi Synthetic Control Method for Nonlinear Models With High-Dimensional Covariates," WORKING PAPERS SERIES IN THEORETICAL AND APPLIED ECONOMICS 202305, University of Kansas, Department of Economics, revised Aug 2023.
    7. Zifang Guo & Lexin Li & Wenbin Lu & Bing Li, 2015. "Groupwise Dimension Reduction via Envelope Method," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(512), pages 1515-1527, December.
    8. Yang, Jing & Tian, Guoliang & Lu, Fang & Lu, Xuewen, 2020. "Single-index modal regression via outer product gradients," Computational Statistics & Data Analysis, Elsevier, vol. 144(C).
    9. Wang, Pei & Yin, Xiangrong & Yuan, Qingcong & Kryscio, Richard, 2021. "Feature filter for estimating central mean subspace and its sparse solution," Computational Statistics & Data Analysis, Elsevier, vol. 163(C).
    10. Rekabdarkolaee, Hossein Moradi & Boone, Edward & Wang, Qin, 2017. "Robust estimation and variable selection in sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 108(C), pages 146-157.
    11. Wang, Qin & Yin, Xiangrong, 2008. "Sufficient dimension reduction and variable selection for regression mean function with two types of predictors," Statistics & Probability Letters, Elsevier, vol. 78(16), pages 2798-2803, November.
    12. Wang, Qin & Xue, Yuan, 2021. "An ensemble of inverse moment estimators for sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 161(C).
    13. Xu Guo & Tao Wang & Lixing Zhu, 2016. "Model checking for parametric single-index models: a dimension reduction model-adaptive approach," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 78(5), pages 1013-1035, November.
    14. Wang, Qin & Yao, Weixin, 2012. "An adaptive estimation of MAVE," Journal of Multivariate Analysis, Elsevier, vol. 104(1), pages 88-100, February.
    15. Moradi Rekabdarkolaee, Hossein & Wang, Qin, 2017. "Variable selection through adaptive MAVE," Statistics & Probability Letters, Elsevier, vol. 128(C), pages 44-51.
    16. Yuan Xue & Xiangrong Yin, 2015. "Sufficient dimension folding for a functional of conditional distribution of matrix- or array-valued objects," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 27(2), pages 253-269, June.
    17. Changrong Yan & Dixin Zhang, 2013. "Sparse dimension reduction for survival data," Computational Statistics, Springer, vol. 28(4), pages 1835-1852, August.
    18. Yao, Weixin & Wang, Qin, 2013. "Robust variable selection through MAVE," Computational Statistics & Data Analysis, Elsevier, vol. 63(C), pages 42-49.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wang, Tao & Zhu, Lixing, 2013. "Sparse sufficient dimension reduction using optimal scoring," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 223-232.
    2. Yao, Weixin & Wang, Qin, 2013. "Robust variable selection through MAVE," Computational Statistics & Data Analysis, Elsevier, vol. 63(C), pages 42-49.
    3. Bilin Zeng & Xuerong Meggie Wen & Lixing Zhu, 2017. "A link-free sparse group variable selection method for single-index model," Journal of Applied Statistics, Taylor & Francis Journals, vol. 44(13), pages 2388-2400, October.
    4. Weng, Jiaying, 2022. "Fourier transform sparse inverse regression estimators for sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    5. Moradi Rekabdarkolaee, Hossein & Wang, Qin, 2017. "Variable selection through adaptive MAVE," Statistics & Probability Letters, Elsevier, vol. 128(C), pages 44-51.
    6. Changrong Yan & Dixin Zhang, 2013. "Sparse dimension reduction for survival data," Computational Statistics, Springer, vol. 28(4), pages 1835-1852, August.
    7. Hilafu, Haileab & Yin, Xiangrong, 2013. "Sufficient dimension reduction in multivariate regressions with categorical predictors," Computational Statistics & Data Analysis, Elsevier, vol. 63(C), pages 139-147.
    8. Wang, Qin & Yin, Xiangrong, 2008. "Sufficient dimension reduction and variable selection for regression mean function with two types of predictors," Statistics & Probability Letters, Elsevier, vol. 78(16), pages 2798-2803, November.
    9. Zifang Guo & Lexin Li & Wenbin Lu & Bing Li, 2015. "Groupwise Dimension Reduction via Envelope Method," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(512), pages 1515-1527, December.
    10. Heng-Hui Lue & Bing-Ran You, 2013. "High-dimensional regression analysis with treatment comparisons," Computational Statistics, Springer, vol. 28(3), pages 1299-1317, June.
    11. Yang Liu & Francesca Chiaromonte & Bing Li, 2017. "Structured Ordinary Least Squares: A Sufficient Dimension Reduction approach for regressions with partitioned predictors and heterogeneous units," Biometrics, The International Biometric Society, vol. 73(2), pages 529-539, June.
    12. Zhou, Jingke & Zhu, Lixing, 2016. "Principal minimax support vector machine for sufficient dimension reduction with contaminated data," Computational Statistics & Data Analysis, Elsevier, vol. 94(C), pages 33-48.
    13. Kim, Kyongwon, 2022. "On principal graphical models with application to gene network," Computational Statistics & Data Analysis, Elsevier, vol. 166(C).
    14. Hojin Yang & Hongtu Zhu & Joseph G. Ibrahim, 2018. "MILFM: Multiple index latent factor model based on high‐dimensional features," Biometrics, The International Biometric Society, vol. 74(3), pages 834-844, September.
    15. Wang, Pei & Yin, Xiangrong & Yuan, Qingcong & Kryscio, Richard, 2021. "Feature filter for estimating central mean subspace and its sparse solution," Computational Statistics & Data Analysis, Elsevier, vol. 163(C).
    16. Wang, Qin & Xue, Yuan, 2021. "An ensemble of inverse moment estimators for sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 161(C).
    17. Yunquan Song & Zitong Li & Minglu Fang, 2022. "Robust Variable Selection Based on Penalized Composite Quantile Regression for High-Dimensional Single-Index Models," Mathematics, MDPI, vol. 10(12), pages 1-17, June.
    18. Fang, Fang & Yu, Zhou, 2020. "Model averaging assisted sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 152(C).
    19. Yazhao Lv & Riquan Zhang & Weihua Zhao & Jicai Liu, 2014. "Quantile regression and variable selection for the single-index model," Journal of Applied Statistics, Taylor & Francis Journals, vol. 41(7), pages 1565-1577, July.
    20. Felix Pretis & Lea Schneider & Jason E. Smerdon & David F. Hendry, 2016. "Detecting Volcanic Eruptions In Temperature Reconstructions By Designed Break-Indicator Saturation," Journal of Economic Surveys, Wiley Blackwell, vol. 30(3), pages 403-429, July.

    More about this item


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:52:y:2008:i:9:p:4512-4520. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.