
Sufficient dimension reduction based on distance‐weighted discrimination

Author

Listed:
  • Hayley Randall
  • Andreas Artemiou
  • Xingye Qiao

Abstract

In this paper, we introduce a sufficient dimension reduction (SDR) algorithm based on distance‐weighted discrimination (DWD). Our method is shown to be robust with respect to the dimension p of the predictors, and it leverages recent computational results from the DWD literature to produce an algorithm that is computationally faster than previous classification‐based algorithms in the SDR literature. In addition to theoretical results analogous to those for similar methods, we prove the consistency of our estimator for fixed p. Finally, we demonstrate the advantages of our algorithm using simulated and real datasets.
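The abstract describes a classification-based SDR recipe: slice the response into binary labels, fit a linear classifier per slice, and recover the reduction directions from the classifiers' normal vectors. The sketch below illustrates only that generic recipe, not the paper's actual DWD algorithm: no DWD solver ships with scikit-learn, so a linear SVM stands in for DWD, and the function name `classification_sdr` and all parameter defaults are hypothetical choices for illustration.

```python
# Minimal sketch of classification-based sufficient dimension reduction (SDR).
# A linear SVM stands in for the DWD classifier used in the paper.
import numpy as np
from sklearn.svm import LinearSVC

def classification_sdr(X, y, n_slices=5, n_directions=1, C=1.0):
    """Estimate SDR directions by aggregating per-slice classifier normals."""
    n, p = X.shape
    # Standardize predictors; directions are estimated on the standardized scale.
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    # Dichotomize y at interior quantile cut points ("left vs. right" slicing).
    cuts = np.quantile(y, np.linspace(0, 1, n_slices + 1)[1:-1])
    M = np.zeros((p, p))
    for c in cuts:
        labels = (y > c).astype(int)
        if labels.min() == labels.max():
            continue  # degenerate slice: all labels identical, skip it
        clf = LinearSVC(C=C, dual=False).fit(Z, labels)
        w = clf.coef_.ravel()
        M += np.outer(w, w)  # accumulate the candidate matrix
    # Leading eigenvectors of M span the estimated central subspace.
    eigvals, eigvecs = np.linalg.eigh(M)
    return eigvecs[:, ::-1][:, :n_directions]

# Toy example: y depends on X only through its first coordinate.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X[:, 0] + 0.1 * rng.normal(size=500)
beta = classification_sdr(X, y, n_slices=4)
# The estimated direction should align with e_1 (up to sign).
```

Aggregating the outer products of the normal vectors before an eigendecomposition is the standard device for combining per-slice directions; the paper's contribution is the choice of DWD as the base classifier and the associated theory.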

Suggested Citation

  • Hayley Randall & Andreas Artemiou & Xingye Qiao, 2021. "Sufficient dimension reduction based on distance‐weighted discrimination," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 48(4), pages 1186-1211, December.
  • Handle: RePEc:bla:scjsta:v:48:y:2021:i:4:p:1186-1211
    DOI: 10.1111/sjos.12484

    Download full text from publisher

    File URL: https://doi.org/10.1111/sjos.12484
    Download Restriction: no


    References listed on IDEAS

    1. Ye Z. & Weiss R.E., 2003. "Using the Bootstrap to Select One of a New Class of Dimension Reduction Methods," Journal of the American Statistical Association, American Statistical Association, vol. 98, pages 968-979, January.
    2. Marron, J.S. & Todd, Michael J. & Ahn, Jeongyoun, 2007. "Distance-Weighted Discrimination," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 1267-1271, December.
    3. Boxiang Wang & Hui Zou, 2018. "Another look at distance‐weighted discrimination," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 80(1), pages 177-198, January.
    4. Luke Smallman & Andreas Artemiou, 2017. "A study on imbalance support vector machine algorithms for sufficient dimension reduction," Communications in Statistics - Theory and Methods, Taylor & Francis Journals, vol. 46(6), pages 2751-2763, March.
    5. Li, Bing & Wang, Shaoli, 2007. "On Directional Regression for Dimension Reduction," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 997-1008, September.
    6. Shin, Seung Jun & Artemiou, Andreas, 2017. "Penalized principal logistic regression for sparse sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 111(C), pages 48-58.
    7. Seung Jun Shin & Yichao Wu & Hao Helen Zhang & Yufeng Liu, 2017. "Principal weighted support vector machines for sufficient dimension reduction in binary classification," Biometrika, Biometrika Trust, vol. 104(1), pages 67-81.
    8. Yin, Xiangrong & Li, Bing & Cook, R. Dennis, 2008. "Successive direction extraction for estimating the central subspace in a multiple-index regression," Journal of Multivariate Analysis, Elsevier, vol. 99(8), pages 1733-1757, September.
    9. Zhou, Jingke & Zhu, Lixing, 2016. "Principal minimax support vector machine for sufficient dimension reduction with contaminated data," Computational Statistics & Data Analysis, Elsevier, vol. 94(C), pages 33-48.
    10. Qiao, Xingye & Zhang, Hao Helen & Liu, Yufeng & Todd, Michael J. & Marron, J. S., 2010. "Weighted Distance Weighted Discrimination and Its Asymptotic Properties," Journal of the American Statistical Association, American Statistical Association, vol. 105(489), pages 401-414.
    11. Wei Luo & Bing Li, 2016. "Combining eigenvalues and variation of eigenvectors for order determination," Biometrika, Biometrika Trust, vol. 103(4), pages 875-887.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jang, Hyun Jung & Shin, Seung Jun & Artemiou, Andreas, 2023. "Principal weighted least square support vector machine: An online dimension-reduction tool for binary classification," Computational Statistics & Data Analysis, Elsevier, vol. 187(C).
    2. Wang, Pei & Yin, Xiangrong & Yuan, Qingcong & Kryscio, Richard, 2021. "Feature filter for estimating central mean subspace and its sparse solution," Computational Statistics & Data Analysis, Elsevier, vol. 163(C).
    3. Wang, Qin & Xue, Yuan, 2021. "An ensemble of inverse moment estimators for sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 161(C).
    4. Pircalabelu, Eugen & Artemiou, Andreas, 2021. "Graph informed sliced inverse regression," Computational Statistics & Data Analysis, Elsevier, vol. 164(C).
    5. Wei Luo, 2022. "On efficient dimension reduction with respect to the interaction between two response variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(2), pages 269-294, April.
    6. Pircalabelu, Eugen & Artemiou, Andreas, 2020. "The LassoPSVM approach for sufficient dimension reduction using principal projections," LIDAM Discussion Papers ISBA 2020008, Université catholique de Louvain, Institute of Statistics, Biostatistics and Actuarial Sciences (ISBA).
    7. Kim, Kyongwon, 2022. "On principal graphical models with application to gene network," Computational Statistics & Data Analysis, Elsevier, vol. 166(C).
    8. Fang, Fang & Yu, Zhou, 2020. "Model averaging assisted sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 152(C).
    9. Yu, Zhou & Dong, Yuexiao & Huang, Mian, 2014. "General directional regression," Journal of Multivariate Analysis, Elsevier, vol. 124(C), pages 94-104.
    10. Yu, Zhou & Dong, Yuexiao & Guo, Ranwei, 2013. "On determining the structural dimension via directional regression," Statistics & Probability Letters, Elsevier, vol. 83(4), pages 987-992.
    11. Wang, Tao & Zhu, Lixing, 2013. "Sparse sufficient dimension reduction using optimal scoring," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 223-232.
    12. Zhu, Li-Ping & Yu, Zhou & Zhu, Li-Xing, 2010. "A sparse eigen-decomposition estimation in semiparametric regression," Computational Statistics & Data Analysis, Elsevier, vol. 54(4), pages 976-986, April.
    13. Qin Wang & Yuan Xue, 2023. "A structured covariance ensemble for sufficient dimension reduction," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 17(3), pages 777-800, September.
    14. Park, Yujin & Kim, Kyongwon & Yoo, Jae Keun, 2022. "On cross-distance selection algorithm for hybrid sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 176(C).
    15. Xie, Chuanlong & Zhu, Lixing, 2020. "Generalized kernel-based inverse regression methods for sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 150(C).
    16. Stephen Babos & Andreas Artemiou, 2020. "Sliced inverse median difference regression," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 29(4), pages 937-954, December.
    17. Li, Junlan & Wang, Tao, 2021. "Dimension reduction in binary response regression: A joint modeling approach," Computational Statistics & Data Analysis, Elsevier, vol. 156(C).
    18. Stephen Babos & Andreas Artemiou, 2021. "Cumulative Median Estimation for Sufficient Dimension Reduction," Stats, MDPI, vol. 4(1), pages 1-8, February.
    19. Weng, Jiaying, 2022. "Fourier transform sparse inverse regression estimators for sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    20. Yugo Nakayama & Kazuyoshi Yata & Makoto Aoshima, 2020. "Bias-corrected support vector machine with Gaussian kernel in high-dimension, low-sample-size settings," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(5), pages 1257-1286, October.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:scjsta:v:48:y:2021:i:4:p:1186-1211. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: http://www.blackwellpublishing.com/journal.asp?ref=0303-6898 .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.