
Sparse probabilistic K-means

Author

Listed:
  • Jung, Yoon Mo
  • Whang, Joyce Jiyoung
  • Yun, Sangwoon

Abstract

The goal of clustering is to partition a set of data points into groups of similar data points, called clusters. Clustering algorithms can be classified into two categories: hard and soft clustering. Hard clustering assigns each data point to exactly one cluster, whereas soft clustering allows probabilistic assignments to clusters. In this paper, we propose a new model that combines the benefits of these two approaches: the clarity of hard clustering and the probabilistic assignments of soft clustering. Since most data points usually have a clear cluster association, only a few points may require a probabilistic interpretation. Thus, we apply an ℓ1-norm constraint to impose sparsity on the probabilistic assignments. Moreover, we incorporate outlier detection into our clustering model to simultaneously detect outliers, which can cause serious problems in statistical analyses. To optimize the model, we introduce an alternating minimization method and prove its convergence. Numerical experiments and comparisons with existing models show the soundness and effectiveness of the proposed model.
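To make the alternating structure concrete, the sketch below is a minimal, hypothetical illustration in the spirit of the abstract: soft assignments are computed from distances to the current centers, small probabilities are zeroed out so that most assignment rows become effectively hard, and centers are then re-estimated as weighted means. The function name sparse_prob_kmeans and the parameters beta (softness) and tau (sparsity threshold) are illustrative assumptions; the authors' actual objective, ℓ1-based constraint, outlier-detection term, and convergence-guaranteed updates are given in the paper itself and are not reproduced here.

    # Illustrative sketch only, not the model of Jung, Whang & Yun (2020):
    # a generic "probabilistic K-means with sparse assignments" loop.
    import numpy as np

    def sparse_prob_kmeans(X, k, beta=5.0, tau=0.05, n_iter=50, seed=0):
        """X: (n, d) data; k: clusters; beta: softness; tau: sparsity threshold (< 1/k)."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(n_iter):
            # Centers fixed: soft assignments from squared distances,
            # then small probabilities zeroed and rows re-projected onto
            # the simplex, so most points get a single (hard) cluster.
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # (n, k)
            logits = -beta * d2
            U = np.exp(logits - logits.max(axis=1, keepdims=True))
            U /= U.sum(axis=1, keepdims=True)
            U[U < tau] = 0.0                       # sparsify assignment rows
            U /= U.sum(axis=1, keepdims=True)      # renormalize to probabilities
            # Assignments fixed: weighted-mean update of the centers.
            centers = (U.T @ X) / np.maximum(U.sum(axis=0)[:, None], 1e-12)
        return U, centers

    # Tiny usage example on two well-separated Gaussian blobs.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
    U, C = sparse_prob_kmeans(X, k=2)
    print("fraction of effectively hard assignments:",
          np.mean((U > 0).sum(axis=1) == 1))

In this toy setting most rows of U collapse to a single nonzero entry, while ambiguous points near a cluster boundary retain a genuinely probabilistic assignment, which is the behavior the abstract motivates.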

Suggested Citation

  • Jung, Yoon Mo & Whang, Joyce Jiyoung & Yun, Sangwoon, 2020. "Sparse probabilistic K-means," Applied Mathematics and Computation, Elsevier, vol. 382(C).
  • Handle: RePEc:eee:apmaco:v:382:y:2020:i:c:s0096300320302940
    DOI: 10.1016/j.amc.2020.125328

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0096300320302940
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.amc.2020.125328?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. P. Tseng, 2001. "Convergence of a Block Coordinate Descent Method for Nondifferentiable Minimization," Journal of Optimization Theory and Applications, Springer, vol. 109(3), pages 475-494, June.
    2. Hui Zou & Trevor Hastie, 2005. "Addendum: Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(5), pages 768-768, November.
    3. Hui Zou & Trevor Hastie, 2005. "Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(2), pages 301-320, April.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. David Degras, 2021. "Sparse group fused lasso for model segmentation: a hybrid approach," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 15(3), pages 625-671, September.
    2. Yanming Li & Bin Nan & Ji Zhu, 2015. "Multivariate sparse group lasso for the multivariate multiple linear regression with an arbitrary group structure," Biometrics, The International Biometric Society, vol. 71(2), pages 354-363, June.
    3. Murat Genç, 2022. "A new double-regularized regression using Liu and lasso regularization," Computational Statistics, Springer, vol. 37(1), pages 159-227, March.
    4. Davood Hajinezhad & Qingjiang Shi, 2018. "Alternating direction method of multipliers for a class of nonconvex bilinear optimization: convergence analysis and applications," Journal of Global Optimization, Springer, vol. 70(1), pages 261-288, January.
    5. Yen, Yu-Min & Yen, Tso-Jung, 2014. "Solving norm constrained portfolio optimization via coordinate-wise descent algorithms," Computational Statistics & Data Analysis, Elsevier, vol. 76(C), pages 737-759.
    6. Yunfeng Zhang & Irina Gaynanova, 2022. "Joint association and classification analysis of multi‐view data," Biometrics, The International Biometric Society, vol. 78(4), pages 1614-1625, December.
    7. Baiguo An & Beibei Zhang, 2020. "Logistic regression with image covariates via the combination of L1 and Sobolev regularizations," PLOS ONE, Public Library of Science, vol. 15(6), pages 1-18, June.
    8. Zhigeng Geng & Sijian Wang & Menggang Yu & Patrick O. Monahan & Victoria Champion & Grace Wahba, 2015. "Group variable selection via convex log-exp-sum penalty with application to a breast cancer survivor study," Biometrics, The International Biometric Society, vol. 71(1), pages 53-62, March.
    9. Kaida Cai & Hua Shen & Xuewen Lu, 2022. "Adaptive bi-level variable selection for multivariate failure time model with a diverging number of covariates," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 31(4), pages 968-993, December.
    10. Nicholson, William B. & Matteson, David S. & Bien, Jacob, 2017. "VARX-L: Structured regularization for large vector autoregressions with exogenous variables," International Journal of Forecasting, Elsevier, vol. 33(3), pages 627-651.
    11. Petra P. Šimović & Claire Y. T. Chen & Edward W. Sun, 2023. "Classifying the Variety of Customers’ Online Engagement for Churn Prediction with a Mixed-Penalty Logistic Regression," Computational Economics, Springer;Society for Computational Economics, vol. 61(1), pages 451-485, January.
    12. Petra Posedel Šimović & Davor Horvatic & Edward W. Sun, 2021. "Classifying variety of customer's online engagement for churn prediction with mixed-penalty logistic regression," Papers 2105.07671, arXiv.org, revised Jul 2021.
    13. Michoel, Tom, 2016. "Natural coordinate descent algorithm for L1-penalised regression in generalised linear models," Computational Statistics & Data Analysis, Elsevier, vol. 97(C), pages 60-70.
    14. Mingrui Zhong & Zanhua Yin & Zhichao Wang, 2023. "Variable Selection for Sparse Logistic Regression with Grouped Variables," Mathematics, MDPI, vol. 11(24), pages 1-21, December.
    15. Rosember Guerra-Urzola & Niek C. Schipper & Anya Tonne & Klaas Sijtsma & Juan C. Vera & Katrijn Deun, 2023. "Sparsifying the least-squares approach to PCA: comparison of lasso and cardinality constraint," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 17(1), pages 269-286, March.
    16. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    17. Carstensen, Kai & Heinrich, Markus & Reif, Magnus & Wolters, Maik H., 2020. "Predicting ordinary and severe recessions with a three-state Markov-switching dynamic factor model," International Journal of Forecasting, Elsevier, vol. 36(3), pages 829-850.
    18. Hou-Tai Chang & Ping-Huai Wang & Wei-Fang Chen & Chen-Ju Lin, 2022. "Risk Assessment of Early Lung Cancer with LDCT and Health Examinations," IJERPH, MDPI, vol. 19(8), pages 1-12, April.
    19. Wang, Qiao & Zhou, Wei & Cheng, Yonggang & Ma, Gang & Chang, Xiaolin & Miao, Yu & Chen, E, 2018. "Regularized moving least-square method and regularized improved interpolating moving least-square method with nonsingular moment matrices," Applied Mathematics and Computation, Elsevier, vol. 325(C), pages 120-145.
    20. Mkhadri, Abdallah & Ouhourane, Mohamed, 2013. "An extended variable inclusion and shrinkage algorithm for correlated variables," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 631-644.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:apmaco:v:382:y:2020:i:c:s0096300320302940. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: https://www.journals.elsevier.com/applied-mathematics-and-computation .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.