
Online kernel sliced inverse regression

Author

Listed:
  • Xu, Jianjun
  • Zhao, Yue
  • Cheng, Haoyang

Abstract

Online dimension reduction techniques are widely utilized for handling high-dimensional streaming data. Extensive research has been conducted on various methods, including Online Principal Component Analysis, Online Sliced Inverse Regression (OSIR), and Online Kernel Principal Component Analysis (OKPCA). However, it is important to note that the exploration of online supervised nonlinear dimension reduction techniques is still limited. This article presents a novel approach called Online Kernel Sliced Inverse Regression (OKSIR), which specifically tackles the challenge of dealing with the increasing dimension of the kernel matrix as the sample size grows. The proposed method incorporates two key components: the approximate linear dependence condition and dictionary variable sets. These components enable a reduced-order approach for online variable updates, improving the efficiency of the process. To solve the OKSIR problem, we formulate it as an online generalized eigen-decomposition problem and employ stochastic optimization techniques to update the dimension reduction directions. Theoretical properties of this online learner are established, providing a solid foundation for its application. Through extensive simulations and real data analysis, we demonstrate that the proposed OKSIR method achieves performance comparable to that of batch processing kernel sliced inverse regression. This research significantly contributes to the advancement of online dimension reduction techniques, enhancing their effectiveness in practical applications.
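The abstract points to two computational ingredients: an approximate linear dependence (ALD) test that keeps the kernel matrix at the size of a small dictionary, and a stochastic update of the dimension-reduction directions phrased as an online generalized eigen-decomposition. The Python sketch below illustrates how such a learner could be organized. It is not the authors' implementation: all names (OnlineKSIRSketch, partial_fit, ald_tol, and so on) are hypothetical, the dictionary growth uses the standard KRLS-style ALD test, and the direction update is a generic stochastic ascent on the generalized Rayleigh quotient of a between-slice covariance against a regularized total covariance, not necessarily the paper's exact recursion.

# Minimal sketch (not the authors' code) of an online kernel-SIR-style learner:
# (1) an ALD test grows a dictionary so the kernel matrix stays m x m, and
# (2) one dimension-reduction direction is updated by a stochastic generalized
#     Rayleigh-quotient step. All names and defaults here are hypothetical.
import numpy as np

def rbf(x, y, gamma=1.0):
    # Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2)
    return float(np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2)))

class OnlineKSIRSketch:
    def __init__(self, n_slices=5, ald_tol=1e-2, ridge=1e-3, step=0.05, gamma=1.0, seed=0):
        self.H, self.nu, self.lam, self.eta, self.gamma = n_slices, ald_tol, ridge, step, gamma
        self.rng = np.random.default_rng(seed)
        self.dict_pts = []       # dictionary inputs d_1, ..., d_m
        self.Kinv = None         # inverse of the m x m dictionary kernel matrix
        self.slice_means = None  # running mean of k(D, x) within each slice of y
        self.slice_counts = None
        self.total_mean = None   # running overall mean of k(D, x)
        self.beta = None         # direction coefficients in the dictionary basis
        self.n = 0

    def _kvec(self, x):
        return np.array([rbf(d, x, self.gamma) for d in self.dict_pts])

    def _grow(self):
        # Zero-pad all running statistics to the new dictionary size m.
        m = len(self.dict_pts)
        self.total_mean = np.zeros(m) if self.total_mean is None else np.pad(self.total_mean, (0, m - self.total_mean.size))
        self.beta = self.rng.standard_normal(m) if self.beta is None else np.pad(self.beta, (0, m - self.beta.size))
        self.slice_means = (np.zeros((self.H, m)) if self.slice_means is None
                            else np.pad(self.slice_means, ((0, 0), (0, m - self.slice_means.shape[1]))))
        if self.slice_counts is None:
            self.slice_counts = np.zeros(self.H, dtype=int)

    def partial_fit(self, x, y_slice):
        # One streaming observation; y_slice is the slice index of the response, in {0, ..., H-1}.
        kxx = rbf(x, x, self.gamma)
        if not self.dict_pts:
            self.dict_pts.append(x)
            self.Kinv = np.array([[1.0 / (kxx + 1e-12)]])
            self._grow()
        kx = self._kvec(x)
        a = self.Kinv @ kx            # projection coefficients onto the dictionary span
        delta = kxx - kx @ a          # ALD residual: how "new" the feature map of x is
        if delta > self.nu:           # grow the dictionary; update K^{-1} by block inversion
            self.dict_pts.append(x)
            m = len(self.dict_pts) - 1
            Kinv_new = np.zeros((m + 1, m + 1))
            Kinv_new[:m, :m] = self.Kinv + np.outer(a, a) / delta
            Kinv_new[:m, m] = Kinv_new[m, :m] = -a / delta
            Kinv_new[m, m] = 1.0 / delta
            self.Kinv = Kinv_new
            self._grow()
            kx = self._kvec(x)
        # Running slice-wise and overall means of the dictionary kernel features.
        self.n += 1
        self.slice_counts[y_slice] += 1
        self.slice_means[y_slice] += (kx - self.slice_means[y_slice]) / self.slice_counts[y_slice]
        self.total_mean += (kx - self.total_mean) / self.n
        # Stochastic generalized eigen-step: between-slice covariance (A) against a
        # regularized instantaneous total covariance (B); beta moves toward the top
        # generalized eigenvector of the pencil (A, B).
        centered = kx - self.total_mean
        B_beta = centered * (centered @ self.beta) + self.lam * self.beta
        M = self.slice_means - self.total_mean
        A_beta = M.T @ (M @ self.beta) / self.H
        rho = (self.beta @ A_beta) / max(self.beta @ B_beta, 1e-12)
        self.beta = self.beta + self.eta * (A_beta - rho * B_beta)
        self.beta /= np.linalg.norm(self.beta) + 1e-12

    def transform(self, x):
        # Project a new input onto the learned nonlinear direction.
        return float(self._kvec(x) @ self.beta)

# Toy stream: y depends on a single nonlinear index of x, sliced into 5 bins.
rng = np.random.default_rng(1)
model = OnlineKSIRSketch(n_slices=5, gamma=0.5)
for _ in range(2000):
    x = rng.standard_normal(4)
    y = np.sin(x[0] + x[1]) + 0.1 * rng.standard_normal()
    model.partial_fit(x, y_slice=int(max(0, min(4, (y + 1.5) // 0.6))))  # crude fixed slicing of y
print(len(model.dict_pts), model.transform(rng.standard_normal(4)))

The reduced-order idea is visible in the sketch: all running statistics live in the dictionary coordinates, so the per-observation cost depends on the dictionary size m rather than on the number of samples processed so far, and once the ALD test stops admitting new points the cost per update is fixed.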

Suggested Citation

  • Xu, Jianjun & Zhao, Yue & Cheng, Haoyang, 2025. "Online kernel sliced inverse regression," Computational Statistics & Data Analysis, Elsevier, vol. 203(C).
  • Handle: RePEc:eee:csdana:v:203:y:2025:i:c:s0167947324001555
    DOI: 10.1016/j.csda.2024.108071

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947324001555
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2024.108071?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy that you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lu Li & Xuerong Meggie Wen & Zhou Yu, 2020. "A selective overview of sparse sufficient dimension reduction," Statistical Theory and Related Fields, Taylor & Francis Journals, vol. 4(2), pages 121-133, July.
    2. Lu Li & Xuerong Meggie Wen & Zhou Yu, 2020. "Rejoinder on ‘A selective overview of sparse sufficient dimension reduction’," Statistical Theory and Related Fields, Taylor & Francis Journals, vol. 4(2), pages 151-151, July.
    3. Qiang Wu & Feng Liang & Sayan Mukherjee, 2013. "Kernel Sliced Inverse Regression: Regularization and Consistency," Abstract and Applied Analysis, John Wiley & Sons, vol. 2013(1).
    4. Andreas Alfons & Christophe Croux & Peter Filzmoser, 2017. "Robust Maximum Association Estimators," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(517), pages 436-445, January.
    5. Feng Zhao & Islem Rekik & Seong-Whan Lee & Jing Liu & Junying Zhang & Dinggang Shen, 2019. "Two-Phase Incremental Kernel PCA for Learning Massive or Online Datasets," Complexity, Hindawi, vol. 2019, pages 1-17, February.
    6. Wenquan Cui & Jianjun Xu & Yuehua Wu, 2023. "A new reproducing kernel‐based nonlinear dimension reduction method for survival data," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics; Finnish Statistical Society; Norwegian Statistical Association; Swedish Statistical Association, vol. 50(3), pages 1365-1390, September.
    7. Hervé Cardot & David Degras, 2018. "Online Principal Component Analysis in High Dimension: Which Algorithm to Choose?," International Statistical Review, International Statistical Institute, vol. 86(1), pages 29-50, April.
    8. Yanyuan Ma & Liping Zhu, 2013. "A Review on Dimension Reduction," International Statistical Review, International Statistical Institute, vol. 81(1), pages 134-150, April.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wenquan Cui & Jianjun Xu & Yuehua Wu, 2023. "A new reproducing kernel‐based nonlinear dimension reduction method for survival data," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics; Finnish Statistical Society; Norwegian Statistical Association; Swedish Statistical Association, vol. 50(3), pages 1365-1390, September.
    2. Chen, Canyi & Xu, Wangli & Zhu, Liping, 2022. "Distributed estimation in heterogeneous reduced rank regression: With application to order determination in sufficient dimension reduction," Journal of Multivariate Analysis, Elsevier, vol. 190(C).
    3. Hung Hung & Su‐Yun Huang, 2019. "Sufficient dimension reduction via random‐partitions for the large‐p‐small‐n problem," Biometrics, The International Biometric Society, vol. 75(1), pages 245-255, March.
    4. Hojin Yang & Hongtu Zhu & Joseph G. Ibrahim, 2018. "MILFM: Multiple index latent factor model based on high‐dimensional features," Biometrics, The International Biometric Society, vol. 74(3), pages 834-844, September.
    5. Cheng, Qing & Zhu, Liping, 2017. "On relative efficiency of principal Hessian directions," Statistics & Probability Letters, Elsevier, vol. 126(C), pages 108-113.
    6. Alvarez, Agustín & Boente, Graciela & Kudraszow, Nadia, 2019. "Robust sieve estimators for functional canonical correlation analysis," Journal of Multivariate Analysis, Elsevier, vol. 170(C), pages 46-62.
    7. Kapla, Daniel & Fertl, Lukas & Bura, Efstathia, 2022. "Fusing sufficient dimension reduction with neural networks," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    8. Dong, Yushen & Wu, Yichao, 2022. "Fréchet kernel sliced inverse regression," Journal of Multivariate Analysis, Elsevier, vol. 191(C).
    9. Liebscher, Eckhard, 2021. "Kendall regression coefficient," Computational Statistics & Data Analysis, Elsevier, vol. 157(C).
    10. Jarek Duda, 2023. "Adaptive Student's t-distribution with method of moments moving estimator for nonstationary time series," Papers 2304.03069, arXiv.org, revised Apr 2025.
    11. Nathan Uyttendaele, 2018. "On the estimation of nested Archimedean copulas: a theoretical and an experimental comparison," Computational Statistics, Springer, vol. 33(2), pages 1047-1070, June.
    12. Baek, Seungchul & Park, Hoyoung & Park, Junyong, 2024. "Variable selection using data splitting and projection for principal fitted component models in high dimension," Computational Statistics & Data Analysis, Elsevier, vol. 196(C).
    13. Lu Li & Kai Tan & Xuerong Meggie Wen & Zhou Yu, 2023. "Variable-dependent partial dimension reduction," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer; Sociedad de Estadística e Investigación Operativa, vol. 32(2), pages 521-541, June.
    14. Lei Wang, 2019. "Dimension reduction for kernel-assisted M-estimators with missing response at random," Annals of the Institute of Statistical Mathematics, Springer; The Institute of Statistical Mathematics, vol. 71(4), pages 889-910, August.
    15. Barbarino, Alessandro & Bura, Efstathia, 2024. "Forecasting Near-equivalence of Linear Dimension Reduction Methods in Large Panels of Macro-variables," Econometrics and Statistics, Elsevier, vol. 31(C), pages 1-18.
    16. Dong, Yuexiao & Li, Zeda, 2024. "A note on marginal coordinate test in sufficient dimension reduction," Statistics & Probability Letters, Elsevier, vol. 204(C).
    17. Nordhausen, Klaus & Ruiz-Gazen, Anne, 2022. "On the usage of joint diagonalization in multivariate statistics," Journal of Multivariate Analysis, Elsevier, vol. 188(C).
    18. Monnez, Jean-Marie & Skiredj, Abderrahman, 2021. "Widening the scope of an eigenvector stochastic approximation process and application to streaming PCA and related methods," Journal of Multivariate Analysis, Elsevier, vol. 182(C).
    19. Pircalabelu, Eugen & Artemiou, Andreas, 2021. "Graph informed sliced inverse regression," Computational Statistics & Data Analysis, Elsevier, vol. 164(C).
    20. A. Iodice D’Enza & A. Markos & F. Palumbo, 2022. "Chunk-wise regularised PCA-based imputation of missing data," Statistical Methods & Applications, Springer; Società Italiana di Statistica, vol. 31(2), pages 365-386, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:203:y:2025:i:c:s0167947324001555. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.