Printed from https://ideas.repec.org/a/eee/jmvana/v184y2021ics0047259x21000348.html

Kick-one-out-based variable selection method for Euclidean distance-based classifier in high-dimensional settings

Author

Listed:
  • Nakagawa, Tomoyuki
  • Watanabe, Hiroki
  • Hyodo, Masashi

Abstract

This paper presents a variable selection method for the Euclidean distance-based classifier in high-dimensional settings. Our concern is that the expected probability of misclassification (EPMC) of the Euclidean distance-based classifier may increase with the dimension when redundant variables are included among the feature variables. First, we show that the Euclidean distance-based classifier built on only the non-redundant variables attains a smaller asymptotic EPMC than the classifier built on all variables. Next, we propose a kick-one-out-based variable selection method that reduces EPMC and prove its selection consistency in the high-dimensional setting. Finally, we conduct a Monte Carlo simulation study to examine the finite-sample performance of the proposed selection method. Our simulation results show that the method frequently selects the set containing the non-redundant variables, and that the discrimination rules constructed from the selected variables reduce EPMC more than those constructed from all variables.
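The kick-one-out idea described in the abstract can be sketched in code. The following is a minimal, illustrative Python sketch and not the authors' actual criterion: as a hypothetical stand-in, it measures group separation by the sum of squared sample-mean differences and retains a variable when kicking it out reduces that criterion by more than a user-chosen threshold. The criterion and the threshold are assumptions made purely for illustration.

```python
import numpy as np

def euclidean_classify(x, mean1, mean2):
    """Euclidean distance-based rule: assign x to the group with the nearer sample mean."""
    return 1 if np.sum((x - mean1) ** 2) <= np.sum((x - mean2) ** 2) else 2

def koo_select(X1, X2, threshold):
    """Kick-one-out screening (illustrative stand-in for the paper's criterion).

    For each variable j, compare the full separation criterion
        T = sum_k (xbar1_k - xbar2_k)^2
    with the criterion T_{-j} computed after kicking variable j out.
    Variable j is retained when T - T_{-j} = (xbar1_j - xbar2_j)^2 exceeds
    `threshold`, i.e. when removing it noticeably weakens group separation.
    """
    diff2 = (X1.mean(axis=0) - X2.mean(axis=0)) ** 2  # per-variable contribution
    T = diff2.sum()
    return [j for j in range(diff2.size) if T - (T - diff2[j]) > threshold]

# Example: two groups whose means differ only in the first two coordinates.
rng = np.random.default_rng(0)
p = 10
mu1 = np.zeros(p)
mu1[:2] = 3.0
X1 = rng.normal(mu1, 1.0, size=(50, p))
X2 = rng.normal(np.zeros(p), 1.0, size=(50, p))
keep = koo_select(X1, X2, threshold=2.0)  # expected to recover [0, 1]
```

A classifier then built only on the `keep` coordinates ignores the noise variables, which is the mechanism by which the paper's selection step lowers EPMC relative to using all variables.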

Suggested Citation

  • Nakagawa, Tomoyuki & Watanabe, Hiroki & Hyodo, Masashi, 2021. "Kick-one-out-based variable selection method for Euclidean distance-based classifier in high-dimensional settings," Journal of Multivariate Analysis, Elsevier, vol. 184(C).
  • Handle: RePEc:eee:jmvana:v:184:y:2021:i:c:s0047259x21000348
    DOI: 10.1016/j.jmva.2021.104756

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0047259X21000348
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jmva.2021.104756?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where your library subscription gives you access to this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Makoto Aoshima & Kazuyoshi Yata, 2019. "Distance-based classifier by data transformation for high-dimension, strongly spiked eigenvalue models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 71(3), pages 473-503, June.
    2. Oda, Ryoya & Suzuki, Yuya & Yanagihara, Hirokazu & Fujikoshi, Yasunori, 2020. "A consistent variable selection method in high-dimensional canonical discriminant analysis," Journal of Multivariate Analysis, Elsevier, vol. 175(C).
    3. Hyodo, Masashi & Kubokawa, Tatsuya, 2014. "A variable selection criterion for linear discriminant rule and its optimality in high dimensional and large sample data," Journal of Multivariate Analysis, Elsevier, vol. 123(C), pages 364-379.
    4. Makoto Aoshima & Kazuyoshi Yata, 2014. "A distance-based, misclassification rate adjusted classifier for multiclass, high-dimensional data," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 66(5), pages 983-1010, October.
    5. Dudoit, S. & Fridlyand, J. & Speed, T.P., 2002. "Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data," Journal of the American Statistical Association, American Statistical Association, vol. 97, pages 77-87, March.
    6. Zhao, L. C. & Krishnaiah, P. R. & Bai, Z. D., 1986. "On detection of the number of signals in presence of white noise," Journal of Multivariate Analysis, Elsevier, vol. 20(1), pages 1-25, October.
    7. Watanabe, Hiroki & Hyodo, Masashi & Seo, Takashi & Pavlenko, Tatjana, 2015. "Asymptotic properties of the misclassification rates for Euclidean Distance Discriminant rule in high-dimensional data," Journal of Multivariate Analysis, Elsevier, vol. 140(C), pages 234-244.
    8. Fujikoshi, Yasunori, 1985. "Selection of variables in two-group discriminant analysis by error rate and Akaike's information criteria," Journal of Multivariate Analysis, Elsevier, vol. 17(1), pages 27-37, August.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ishii, Aki & Yata, Kazuyoshi & Aoshima, Makoto, 2022. "Geometric classifiers for high-dimensional noisy data," Journal of Multivariate Analysis, Elsevier, vol. 188(C).
    2. Makoto Aoshima & Kazuyoshi Yata, 2019. "Distance-based classifier by data transformation for high-dimension, strongly spiked eigenvalue models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 71(3), pages 473-503, June.
    3. Yugo Nakayama & Kazuyoshi Yata & Makoto Aoshima, 2020. "Bias-corrected support vector machine with Gaussian kernel in high-dimension, low-sample-size settings," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(5), pages 1257-1286, October.
    4. Yasunori Fujikoshi & Tetsuro Sakurai, 2023. "High-Dimensional Consistencies of KOO Methods for the Selection of Variables in Multivariate Linear Regression Models with Covariance Structures," Mathematics, MDPI, vol. 11(3), pages 1-15, January.
    5. Nakayama, Yugo & Yata, Kazuyoshi & Aoshima, Makoto, 2021. "Clustering by principal component analysis with Gaussian kernel in high-dimension, low-sample-size settings," Journal of Multivariate Analysis, Elsevier, vol. 185(C).
    6. Rauf Ahmad, M. & Pavlenko, Tatjana, 2018. "A U-classifier for high-dimensional data under non-normality," Journal of Multivariate Analysis, Elsevier, vol. 167(C), pages 269-283.
    7. Makoto Aoshima & Kazuyoshi Yata, 2019. "High-Dimensional Quadratic Classifiers in Non-sparse Settings," Methodology and Computing in Applied Probability, Springer, vol. 21(3), pages 663-682, September.
    8. Aleksey I. Shinkevich & Alsu R. Akhmetshina & Ruslan R. Khalilov, 2022. "Development of a Methodology for Forecasting the Sustainable Development of Industry in Russia Based on the Tools of Factor and Discriminant Analysis," Mathematics, MDPI, vol. 10(6), pages 1-16, March.
    9. Kubokawa, Tatsuya & Srivastava, Muni S., 2008. "Estimation of the precision matrix of a singular Wishart distribution and its application in high-dimensional data," Journal of Multivariate Analysis, Elsevier, vol. 99(9), pages 1906-1928, October.
    10. Hossain, Ahmed & Beyene, Joseph & Willan, Andrew R. & Hu, Pingzhao, 2009. "A flexible approximate likelihood ratio test for detecting differential expression in microarray data," Computational Statistics & Data Analysis, Elsevier, vol. 53(10), pages 3685-3695, August.
    11. Luca Scrucca, 2014. "Graphical tools for model-based mixture discriminant analysis," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 8(2), pages 147-165, June.
    12. Bilin Zeng & Xuerong Meggie Wen & Lixing Zhu, 2017. "A link-free sparse group variable selection method for single-index model," Journal of Applied Statistics, Taylor & Francis Journals, vol. 44(13), pages 2388-2400, October.
    13. Watanabe, Hiroki & Hyodo, Masashi & Seo, Takashi & Pavlenko, Tatjana, 2015. "Asymptotic properties of the misclassification rates for Euclidean Distance Discriminant rule in high-dimensional data," Journal of Multivariate Analysis, Elsevier, vol. 140(C), pages 234-244.
    14. Oda, Ryoya & Suzuki, Yuya & Yanagihara, Hirokazu & Fujikoshi, Yasunori, 2020. "A consistent variable selection method in high-dimensional canonical discriminant analysis," Journal of Multivariate Analysis, Elsevier, vol. 175(C).
    15. J. Burez & D. Van Den Poel, 2005. "CRM at a Pay-TV Company: Using Analytical Models to Reduce Customer Attrition by Targeted Marketing for Subscription Services," Working Papers of Faculty of Economics and Business Administration, Ghent University, Belgium 05/348, Ghent University, Faculty of Economics and Business Administration.
    16. Won, Joong-Ho & Lim, Johan & Yu, Donghyeon & Kim, Byung Soo & Kim, Kyunga, 2014. "Monotone false discovery rate," Statistics & Probability Letters, Elsevier, vol. 87(C), pages 86-93.
    17. Jan, Budczies & Kosztyla, Daniel & von Törne, Christian & Stenzinger, Albrecht & Darb-Esfahani, Silvia & Dietel, Manfred & Denkert, Carsten, 2014. "cancerclass: An R Package for Development and Validation of Diagnostic Tests from High-Dimensional Molecular Data," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 59(i01).
    18. Jianqing Fan & Yang Feng & Jiancheng Jiang & Xin Tong, 2016. "Feature Augmentation via Nonparametrics and Selection (FANS) in High-Dimensional Classification," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(513), pages 275-287, March.
    19. Márton Gosztonyi & Csákné Filep Judit, 2022. "Profiling (Non-)Nascent Entrepreneurs in Hungary Based on Machine Learning Approaches," Sustainability, MDPI, vol. 14(6), pages 1-20, March.
    20. Wang, Tao & Xu, Pei-Rong & Zhu, Li-Xing, 2012. "Non-convex penalized estimation in high-dimensional models with single-index structure," Journal of Multivariate Analysis, Elsevier, vol. 109(C), pages 221-235.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jmvana:v:184:y:2021:i:c:s0047259x21000348. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.