Printed from https://ideas.repec.org/a/eee/csdana/v55y2011i4p1656-1664.html

Estimation of inverse mean: An orthogonal series approach

Author

Listed:
  • Wang, Qin
  • Yin, Xiangrong

Abstract

In this article, we propose the use of orthogonal series to estimate the inverse mean space. Compared with the original slicing scheme, the new estimator significantly improves estimation accuracy without sacrificing computational efficiency, especially for heteroscedastic models; compared with local smoothing approaches, it is computationally more efficient. The new approach is also robust to the choice of tuning parameter. A permutation test is used to determine the structural dimension. Moreover, a variable selection procedure is incorporated into the new approach, which is particularly useful when the model is sparse. The efficacy of the proposed method is demonstrated through simulations and a real data analysis.
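As an illustration only (this is not the authors' code), the idea behind a series-based inverse-mean estimator can be sketched in a few lines of numpy: replace the slice indicators of sliced inverse regression with an orthonormal basis in the response (here Legendre polynomials, an assumed choice), and take the leading eigenvectors of the resulting kernel matrix of inverse means. The function name, basis, and tuning values below are all assumptions for the sketch.

```python
# Hypothetical sketch of an orthogonal-series inverse-mean (SIR-type)
# estimator; basis choice and tuning values are illustrative assumptions.
import numpy as np

def series_inverse_mean(X, y, d=1, n_basis=5):
    """Estimate d directions of the inverse-mean subspace using a
    Legendre series in the response instead of slicing."""
    n, p = X.shape
    # Standardize predictors: Z = Sigma^{-1/2} (X - mean)
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(Sigma)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Rescale the response to [-1, 1], the Legendre domain
    t = 2.0 * (y - y.min()) / (y.max() - y.min()) - 1.0
    M = np.zeros((p, p))
    for j in range(1, n_basis + 1):
        f = np.polynomial.legendre.Legendre.basis(j)(t)
        f = (f - f.mean()) / f.std()   # centered, scaled basis scores
        m = Z.T @ f / n                # sample analogue of E[Z f_j(Y)]
        M += np.outer(m, m)            # kernel matrix of inverse means
    # Leading eigenvectors, mapped back to the original predictor scale
    _, vecs = np.linalg.eigh(M)
    return inv_sqrt @ vecs[:, -d:]

# Toy single-index model: y depends on X only through X @ beta
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 4))
beta = np.array([1.0, 1.0, 0.0, 0.0]) / np.sqrt(2.0)
y = X @ beta + 0.25 * rng.standard_normal(500)

b_hat = series_inverse_mean(X, y, d=1).ravel()
cosine = abs(b_hat @ beta) / np.linalg.norm(b_hat)  # alignment with truth
```

In this sketch the number of basis functions plays the role that the number of slices plays in SIR; the abstract's point is that a series estimator is less sensitive to this tuning parameter than slicing is.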

Suggested Citation

  • Wang, Qin & Yin, Xiangrong, 2011. "Estimation of inverse mean: An orthogonal series approach," Computational Statistics & Data Analysis, Elsevier, vol. 55(4), pages 1656-1664, April.
  • Handle: RePEc:eee:csdana:v:55:y:2011:i:4:p:1656-1664

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167-9473(10)00410-X
    Download Restriction: Full text for ScienceDirect subscribers only.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ye Z. & Weiss R.E., 2003. "Using the Bootstrap to Select One of a New Class of Dimension Reduction Methods," Journal of the American Statistical Association, American Statistical Association, vol. 98, pages 968-979, January.
    2. Zhu, Lixing & Miao, Baiqi & Peng, Heng, 2006. "On Sliced Inverse Regression With High-Dimensional Covariates," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 630-643, June.
    3. Amato, U. & Antoniadis, A. & De Feis, I., 2006. "Dimension reduction in functional regression with applications," Computational Statistics & Data Analysis, Elsevier, vol. 50(9), pages 2422-2446, May.
    4. Zhu, Yu & Zeng, Peng, 2006. "Fourier Methods for Estimating the Central Subspace and the Central Mean Subspace in Regression," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1638-1651, December.
    5. Yingcun Xia & Howell Tong & W. K. Li & Li‐Xing Zhu, 2002. "An adaptive estimation of dimension reduction space," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 64(3), pages 363-410, August.
    6. Bura E. & Cook R.D., 2001. "Extending Sliced Inverse Regression: the Weighted Chi-Squared Test," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 996-1003, September.
    7. Efstathia Bura & R. Dennis Cook, 2001. "Estimating the structural dimension of regressions via parametric inverse regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 63(2), pages 393-410.
    8. Yin, Xiangrong & Li, Bing & Cook, R. Dennis, 2008. "Successive direction extraction for estimating the central subspace in a multiple-index regression," Journal of Multivariate Analysis, Elsevier, vol. 99(8), pages 1733-1757, September.
    9. Wang, Hansheng & Xia, Yingcun, 2008. "Sliced Regression for Dimension Reduction," Journal of the American Statistical Association, American Statistical Association, vol. 103, pages 811-821, June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Wang, Tao & Zhu, Lixing, 2013. "Sparse sufficient dimension reduction using optimal scoring," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 223-232.
    2. Scrucca, Luca, 2011. "Model-based SIR for dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 55(11), pages 3010-3026, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wang, Pei & Yin, Xiangrong & Yuan, Qingcong & Kryscio, Richard, 2021. "Feature filter for estimating central mean subspace and its sparse solution," Computational Statistics & Data Analysis, Elsevier, vol. 163(C).
    2. Kim, Kyongwon, 2022. "On principal graphical models with application to gene network," Computational Statistics & Data Analysis, Elsevier, vol. 166(C).
    3. Wang, Qin & Xue, Yuan, 2021. "An ensemble of inverse moment estimators for sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 161(C).
    4. Xue, Yuan & Zhang, Nan & Yin, Xiangrong & Zheng, Haitao, 2017. "Sufficient dimension reduction using Hilbert–Schmidt independence criterion," Computational Statistics & Data Analysis, Elsevier, vol. 115(C), pages 67-78.
    5. Zeng, Bilin & Yu, Zhou & Wen, Xuerong Meggie, 2015. "A note on cumulative mean estimation," Statistics & Probability Letters, Elsevier, vol. 96(C), pages 322-327.
    6. Weng, Jiaying, 2022. "Fourier transform sparse inverse regression estimators for sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    7. Sheng, Wenhui & Yin, Xiangrong, 2013. "Direction estimation in single-index models via distance covariance," Journal of Multivariate Analysis, Elsevier, vol. 122(C), pages 148-161.
    8. Ming-Yueh Huang & Chin-Tsang Chiang, 2017. "An Effective Semiparametric Estimation Approach for the Sufficient Dimension Reduction Model," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 1296-1310, July.
    9. Wang, Qin & Yao, Weixin, 2012. "An adaptive estimation of MAVE," Journal of Multivariate Analysis, Elsevier, vol. 104(1), pages 88-100, February.
    10. Tao, Chenyang & Feng, Jianfeng, 2017. "Canonical kernel dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 107(C), pages 131-148.
    11. Wenjuan Li & Wenying Wang & Jingsi Chen & Weidong Rao, 2023. "Aggregate Kernel Inverse Regression Estimation," Mathematics, MDPI, vol. 11(12), pages 1-10, June.
    12. Rekabdarkolaee, Hossein Moradi & Boone, Edward & Wang, Qin, 2017. "Robust estimation and variable selection in sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 108(C), pages 146-157.
    13. Forzani, Liliana & García Arancibia, Rodrigo & Llop, Pamela & Tomassi, Diego, 2018. "Supervised dimension reduction for ordinal predictors," Computational Statistics & Data Analysis, Elsevier, vol. 125(C), pages 136-155.
    14. Zhang, Jing & Wang, Qin & Mays, D'Arcy, 2021. "Robust MAVE through nonconvex penalized regression," Computational Statistics & Data Analysis, Elsevier, vol. 160(C).
    15. Eliana Christou, 2020. "Robust dimension reduction using sliced inverse median regression," Statistical Papers, Springer, vol. 61(5), pages 1799-1818, October.
    16. Zifang Guo & Lexin Li & Wenbin Lu & Bing Li, 2015. "Groupwise Dimension Reduction via Envelope Method," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(512), pages 1515-1527, December.
    17. Iaci, Ross & Yin, Xiangrong & Zhu, Lixing, 2016. "The Dual Central Subspaces in dimension reduction," Journal of Multivariate Analysis, Elsevier, vol. 145(C), pages 178-189.
    18. Zhu, Li-Ping & Yu, Zhou & Zhu, Li-Xing, 2010. "A sparse eigen-decomposition estimation in semiparametric regression," Computational Statistics & Data Analysis, Elsevier, vol. 54(4), pages 976-986, April.
    19. Wu, Runxiong & Chen, Xin, 2021. "MM algorithms for distance covariance based sufficient dimension reduction and sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 155(C).
    20. Coudret, R. & Girard, S. & Saracco, J., 2014. "A new sliced inverse regression method for multivariate response," Computational Statistics & Data Analysis, Elsevier, vol. 77(C), pages 285-299.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:55:y:2011:i:4:p:1656-1664. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.