
Ensemble sufficient dimension folding methods for analyzing matrix-valued data

Author

Listed:
  • Xue, Yuan
  • Yin, Xiangrong
  • Jiang, Xiaolin

Abstract

The construction of novel sufficient dimension folding methods for analyzing matrix-valued data is considered. For a matrix-valued predictor, traditional dimension reduction methods fail to preserve the matrix structure, whereas dimension folding methods preserve that structure and improve estimation accuracy. A folded outer-product-of-gradients (folded-OPG) ensemble estimator and two refined estimators, a folded minimum average variance estimation (folded-MAVE) ensemble and a folded sliced regression (folded-SR) ensemble, are proposed to recover the central dimension folding subspace (CDFS). Owing to the ensemble idea, finite-sample estimation accuracy is improved by repeatedly using the data. A modified cross-validation method is used to determine the structural dimensions of the CDFS. Simulated examples demonstrate the performance of the folded ensemble methods in comparison with existing inverse dimension folding methods. The efficacy of the folded-MAVE ensemble method is also evaluated against inverse dimension folding methods on the Standard & Poor’s 500 stock data set.
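
For readers unfamiliar with dimension folding, the sketch below is a simplified Python illustration (not from the article) of the bilinear structure the abstract refers to: a matrix-valued predictor X in R^{p x q} and folding matrices alpha (p x d1) and beta (q x d2) such that Y depends on X only through alpha' X beta. It uses a crude OPG-style step, in which local-linear gradients of the regression function are folded back into p x q matrices and their dominant left/right eigenspaces estimate the column spaces of alpha and beta. All tuning constants and the toy model are assumptions for illustration; this is not the authors' folded-OPG/MAVE/SR ensemble estimator.

    # Illustrative sketch of sufficient dimension folding for a matrix-valued
    # predictor.  NOT the article's estimators; a crude OPG-style approximation
    # with hand-picked tuning constants (assumptions).
    import numpy as np

    rng = np.random.default_rng(0)

    # --- simulate matrix-valued data with a known folding structure ---
    n, p, q, d1, d2 = 500, 4, 3, 1, 1
    alpha = np.zeros((p, d1)); alpha[0, 0] = 1.0      # true left folding direction
    beta  = np.zeros((q, d2)); beta[1, 0] = 1.0       # true right folding direction
    X = rng.standard_normal((n, p, q))
    core = np.einsum('pi,npq,qj->nij', alpha, X, beta)   # n x d1 x d2 reduced predictor
    Y = core[:, 0, 0] + 0.5 * core[:, 0, 0] ** 3 + 0.1 * rng.standard_normal(n)

    # --- OPG-style step: local-linear gradients of E[Y | vec(X)] ---
    Xv = X.reshape(n, p * q)
    h = 2.0                                           # generous toy bandwidth (assumption)
    grads = np.empty((n, p, q))
    for i in range(n):
        diff = Xv - Xv[i]                             # n x (p*q) local differences
        w = np.exp(-0.5 * (diff ** 2).sum(axis=1) / h ** 2)
        Z = np.hstack([np.ones((n, 1)), diff])        # local linear design matrix
        WZ = Z * w[:, None]
        coef, *_ = np.linalg.lstsq(WZ.T @ Z, WZ.T @ Y, rcond=None)
        grads[i] = coef[1:].reshape(p, q)             # fold the gradient back to p x q

    # --- recover the left/right folding subspaces from the folded gradients ---
    M_left  = sum(G @ G.T for G in grads)             # concentrates on span(alpha)
    M_right = sum(G.T @ G for G in grads)             # concentrates on span(beta)
    alpha_hat = np.linalg.eigh(M_left)[1][:, -d1:]    # top eigenvector(s)
    beta_hat  = np.linalg.eigh(M_right)[1][:, -d2:]

    print("estimated left direction :", np.round(alpha_hat.ravel(), 2))
    print("estimated right direction:", np.round(beta_hat.ravel(), 2))

Roughly speaking, the ensemble estimators in the article apply such folding steps across a family of transformations of the response and combine the resulting estimates, which is how the data are used repeatedly to improve finite-sample accuracy.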

Suggested Citation

  • Xue, Yuan & Yin, Xiangrong & Jiang, Xiaolin, 2016. "Ensemble sufficient dimension folding methods for analyzing matrix-valued data," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 193-205.
  • Handle: RePEc:eee:csdana:v:103:y:2016:i:c:p:193-205
    DOI: 10.1016/j.csda.2016.05.001

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947316301037
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2016.05.001?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. R. Dennis Cook & Xin Zhang, 2014. "Fused Estimators of the Central Subspace in Sufficient Dimension Reduction," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 109(506), pages 815-827, June.
    2. Jianqing Fan & Qiwei Yao & Zongwu Cai, 2003. "Adaptive varying‐coefficient linear models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 65(1), pages 57-80, February.
    3. Li, Bing & Wen, Songqiao & Zhu, Lixing, 2008. "On a Projective Resampling Method for Dimension Reduction With Multivariate Responses," Journal of the American Statistical Association, American Statistical Association, vol. 103(483), pages 1177-1186.
    4. Zhu, Yu & Zeng, Peng, 2006. "Fourier Methods for Estimating the Central Subspace and the Central Mean Subspace in Regression," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1638-1651, December.
    5. Yingcun Xia & Howell Tong & W. K. Li & Li‐Xing Zhu, 2002. "An adaptive estimation of dimension reduction space," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 64(3), pages 363-410, August.
    6. S. N. Wood, 2000. "Modelling and smoothing parameter estimation with multiple quadratic penalties," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 62(2), pages 413-428.
    7. Wang, Hansheng & Xia, Yingcun, 2008. "Sliced Regression for Dimension Reduction," Journal of the American Statistical Association, American Statistical Association, vol. 103, pages 811-821, June.
    8. Simon N. Wood, 2008. "Fast stable direct fitting and smoothness selection for generalized additive models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(3), pages 495-518, July.
    9. Yuan Xue & Xiangrong Yin, 2015. "Sufficient dimension folding for a functional of conditional distribution of matrix- or array-valued objects," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 27(2), pages 253-269, June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Abpeykar, Shadi & Ghatee, Mehdi & Zare, Hadi, 2019. "Ensemble decision forest of RBF networks via hybrid feature clustering approach for high-dimensional data classification," Computational Statistics & Data Analysis, Elsevier, vol. 131(C), pages 12-36.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Weng, Jiaying, 2022. "Fourier transform sparse inverse regression estimators for sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    2. Wang, Pei & Yin, Xiangrong & Yuan, Qingcong & Kryscio, Richard, 2021. "Feature filter for estimating central mean subspace and its sparse solution," Computational Statistics & Data Analysis, Elsevier, vol. 163(C).
    3. Sheng, Wenhui & Yin, Xiangrong, 2013. "Direction estimation in single-index models via distance covariance," Journal of Multivariate Analysis, Elsevier, vol. 122(C), pages 148-161.
    4. Ming-Yueh Huang & Chin-Tsang Chiang, 2017. "An Effective Semiparametric Estimation Approach for the Sufficient Dimension Reduction Model," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 1296-1310, July.
    5. Zeng, Bilin & Yu, Zhou & Wen, Xuerong Meggie, 2015. "A note on cumulative mean estimation," Statistics & Probability Letters, Elsevier, vol. 96(C), pages 322-327.
    6. Tao, Chenyang & Feng, Jianfeng, 2017. "Canonical kernel dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 107(C), pages 131-148.
    7. Xue, Yuan & Zhang, Nan & Yin, Xiangrong & Zheng, Haitao, 2017. "Sufficient dimension reduction using Hilbert–Schmidt independence criterion," Computational Statistics & Data Analysis, Elsevier, vol. 115(C), pages 67-78.
    8. Zhao, Xiaobing & Zhou, Xian, 2014. "Sufficient dimension reduction on marginal regression for gaps of recurrent events," Journal of Multivariate Analysis, Elsevier, vol. 127(C), pages 56-71.
    9. Zhang, Jing & Wang, Qin & Mays, D'Arcy, 2021. "Robust MAVE through nonconvex penalized regression," Computational Statistics & Data Analysis, Elsevier, vol. 160(C).
    10. Eliana Christou, 2020. "Robust dimension reduction using sliced inverse median regression," Statistical Papers, Springer, vol. 61(5), pages 1799-1818, October.
    11. Zhang, Hong-Fan, 2021. "Minimum Average Variance Estimation with group Lasso for the multivariate response Central Mean Subspace," Journal of Multivariate Analysis, Elsevier, vol. 184(C).
    12. Iaci, Ross & Yin, Xiangrong & Zhu, Lixing, 2016. "The Dual Central Subspaces in dimension reduction," Journal of Multivariate Analysis, Elsevier, vol. 145(C), pages 178-189.
    13. Wu, Runxiong & Chen, Xin, 2021. "MM algorithms for distance covariance based sufficient dimension reduction and sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 155(C).
    14. Wang, Qin & Yin, Xiangrong, 2011. "Estimation of inverse mean: An orthogonal series approach," Computational Statistics & Data Analysis, Elsevier, vol. 55(4), pages 1656-1664, April.
    15. Yin, Xiangrong & Li, Bing & Cook, R. Dennis, 2008. "Successive direction extraction for estimating the central subspace in a multiple-index regression," Journal of Multivariate Analysis, Elsevier, vol. 99(8), pages 1733-1757, September.
    16. Longhi, Christian & Musolesi, Antonio & Baumont, Catherine, 2014. "Modeling structural change in the European metropolitan areas during the process of economic integration," Economic Modelling, Elsevier, vol. 37(C), pages 395-407.
    17. Strasak, Alexander M. & Umlauf, Nikolaus & Pfeiffer, Ruth M. & Lang, Stefan, 2011. "Comparing penalized splines and fractional polynomials for flexible modelling of the effects of continuous predictor variables," Computational Statistics & Data Analysis, Elsevier, vol. 55(4), pages 1540-1551, April.
    18. Zhu, Xuehu & Chen, Fei & Guo, Xu & Zhu, Lixing, 2016. "Heteroscedasticity testing for regression models: A dimension reduction-based model adaptive approach," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 263-283.
    19. Shujie Ma & Peter X.-K. Song, 2015. "Varying Index Coefficient Models," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(509), pages 341-356, March.
    20. Kapla, Daniel & Fertl, Lukas & Bura, Efstathia, 2022. "Fusing sufficient dimension reduction with neural networks," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:103:y:2016:i:c:p:193-205. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.