Printed from https://ideas.repec.org/a/bla/istatr/v91y2023i2p218-242.html

Path algorithms for fused lasso signal approximator with application to COVID‐19 spread in Korea

Author

Listed:
  • Won Son
  • Johan Lim
  • Donghyeon Yu

Abstract

The fused lasso signal approximator (FLSA) is a smoothing procedure for noisy observations that places a fused lasso penalty on the unobserved mean levels to recover sparse signal blocks. Several path algorithms have been developed to obtain the whole solution path of the FLSA. However, the FLSA is known to be model-selection inconsistent when the underlying signal contains a stair-case block, that is, three consecutive signal blocks that are strictly increasing or strictly decreasing. Modified path algorithms for the FLSA have been proposed to guarantee model selection consistency regardless of stair-case blocks. In this paper, we provide a comprehensive review of path algorithms for the FLSA and prove properties of the hitting times of the recently modified path algorithms. Specifically, we reinterpret the modified path algorithm as a path algorithm for local FLSA problems and identify the condition under which its hitting time for fusion is not monotone in the tuning parameter. To recover the monotonicity of the solution path, we propose a pathwise adaptive FLSA that is monotone and performs similarly to the modified path algorithm. Finally, we apply the proposed method to the daily confirmed COVID-19 cases in Korea to identify change points in the spread of the disease.
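
As a point of reference (not taken from the article itself), the FLSA is commonly written as the minimizer, over mean levels theta_1, ..., theta_n, of

    (1/2) * sum_i (y_i - theta_i)^2  +  lambda1 * sum_i |theta_i|  +  lambda2 * sum_{i>=2} |theta_i - theta_{i-1}|,

where lambda1 controls sparsity of the levels and lambda2 controls the fusion of adjacent levels into blocks (the fused lasso of Tibshirani et al., 2005, listed in the references below). The sketch below is a minimal illustration that solves this objective for a single fixed pair (lambda1, lambda2) with the generic convex solver CVXPY; it is not the paper's path algorithm, the modified path algorithm, or the proposed pathwise adaptive FLSA, and the function name flsa and the toy data are assumptions for illustration only.

    import numpy as np
    import cvxpy as cp

    def flsa(y, lam1, lam2):
        # Fused lasso signal approximator objective: squared-error fit plus a
        # lasso penalty on the levels and a fusion penalty on adjacent differences.
        theta = cp.Variable(len(y))
        objective = (0.5 * cp.sum_squares(y - theta)
                     + lam1 * cp.norm1(theta)
                     + lam2 * cp.norm1(cp.diff(theta)))
        cp.Problem(cp.Minimize(objective)).solve()
        return theta.value

    # Toy piecewise-constant signal with two change points plus Gaussian noise.
    rng = np.random.default_rng(0)
    y = np.concatenate([np.zeros(30), 2.0 * np.ones(30), 5.0 * np.ones(30)])
    y = y + rng.normal(scale=0.5, size=y.size)
    theta_hat = flsa(y, lam1=0.1, lam2=2.0)  # nearly piecewise-constant estimate

A path algorithm, by contrast, tracks the solution of this problem over a continuum of tuning-parameter values rather than re-solving it from scratch at each value, which is what the article reviews and refines.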

Suggested Citation

  • Won Son & Johan Lim & Donghyeon Yu, 2023. "Path algorithms for fused lasso signal approximator with application to COVID‐19 spread in Korea," International Statistical Review, International Statistical Institute, vol. 91(2), pages 218-242, August.
  • Handle: RePEc:bla:istatr:v:91:y:2023:i:2:p:218-242
    DOI: 10.1111/insr.12521

    Download full text from publisher

    File URL: https://doi.org/10.1111/insr.12521
    Download Restriction: no

    File URL: https://libkey.io/10.1111/insr.12521?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Robert Tibshirani & Michael Saunders & Saharon Rosset & Ji Zhu & Keith Knight, 2005. "Sparsity and smoothness via the fused lasso," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(1), pages 91-108, February.
    2. Jiahua Chen & Zehua Chen, 2008. "Extended Bayesian information criteria for model selection with large model spaces," Biometrika, Biometrika Trust, vol. 95(3), pages 759-771.
    3. Yao, Yi-Ching, 1988. "Estimating the number of change-points via Schwarz' criterion," Statistics & Probability Letters, Elsevier, vol. 6(3), pages 181-189, February.
    4. Fryzlewicz, Piotr, 2014. "Wild binary segmentation for multiple change-point detection," LSE Research Online Documents on Economics 57146, London School of Economics and Political Science, LSE Library.
    5. Qian, Junyang & Jia, Jinzhu, 2016. "On stepwise pattern recovery of the fused Lasso," Computational Statistics & Data Analysis, Elsevier, vol. 94(C), pages 221-237.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Fryzlewicz, Piotr, 2020. "Detecting possibly frequent change-points: Wild Binary Segmentation 2 and steepest-drop model selection," LSE Research Online Documents on Economics 103430, London School of Economics and Political Science, LSE Library.
    2. David Degras, 2021. "Sparse group fused lasso for model segmentation: a hybrid approach," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 15(3), pages 625-671, September.
    3. Haeran Cho & Claudia Kirch, 2022. "Two-stage data segmentation permitting multiscale change points, heavy tails and dependence," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 74(4), pages 653-684, August.
    4. McGonigle, Euan T. & Cho, Haeran, 2023. "Robust multiscale estimation of time-average variance for time series segmentation," Computational Statistics & Data Analysis, Elsevier, vol. 179(C).
    5. Lu Tang & Ling Zhou & Peter X. K. Song, 2019. "Fusion learning algorithm to combine partially heterogeneous Cox models," Computational Statistics, Springer, vol. 34(1), pages 395-414, March.
    6. Molly C. Klanderman & Kathryn B. Newhart & Tzahi Y. Cath & Amanda S. Hering, 2020. "Fault isolation for a complex decentralized waste water treatment facility," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 69(4), pages 931-951, August.
    7. Shi, Xuesheng & Gallagher, Colin & Lund, Robert & Killick, Rebecca, 2022. "A comparison of single and multiple changepoint techniques for time series data," Computational Statistics & Data Analysis, Elsevier, vol. 170(C).
    8. Qifan Song & Guang Cheng, 2020. "Bayesian Fusion Estimation via t Shrinkage," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 82(2), pages 353-385, August.
    9. Benjamin G. Stokell & Rajen D. Shah & Ryan J. Tibshirani, 2021. "Modelling high‐dimensional categorical data using nonconvex fusion penalties," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(3), pages 579-611, July.
    10. Wu Wang & Xuming He & Zhongyi Zhu, 2020. "Statistical inference for multiple change‐point models," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 47(4), pages 1149-1170, December.
    11. Davis, Richard A. & Hancock, Stacey A. & Yao, Yi-Ching, 2016. "On consistency of minimum description length model selection for piecewise autoregressions," Journal of Econometrics, Elsevier, vol. 194(2), pages 360-368.
    12. Sakyajit Bhattacharya & Paul McNicholas, 2014. "A LASSO-penalized BIC for mixture model selection," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 8(1), pages 45-61, March.
    13. Venkata Jandhyala & Stergios Fotopoulos & Ian MacNeill & Pengyu Liu, 2013. "Inference for single and multiple change-points in time series," Journal of Time Series Analysis, Wiley Blackwell, vol. 34(4), pages 423-446, July.
    14. Florian Pein & Hannes Sieling & Axel Munk, 2017. "Heterogeneous change point inference," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 79(4), pages 1207-1227, September.
    15. Zhao, Xin & Zhang, Jingru & Lin, Wei, 2023. "Clustering multivariate count data via Dirichlet-multinomial network fusion," Computational Statistics & Data Analysis, Elsevier, vol. 179(C).
    16. Jiang, He & Luo, Shihua & Dong, Yao, 2021. "Simultaneous feature selection and clustering based on square root optimization," European Journal of Operational Research, Elsevier, vol. 289(1), pages 214-231.
    17. Holger Dette & Theresa Eckle & Mathias Vetter, 2020. "Multiscale change point detection for dependent data," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 47(4), pages 1243-1274, December.
    18. Canhong Wen & Xueqin Wang & Aijun Zhang, 2023. "ℓ0 Trend Filtering," INFORMS Journal on Computing, INFORMS, vol. 35(6), pages 1491-1510, November.
    19. Kang-Ping Lu & Shao-Tung Chang, 2023. "An Advanced Segmentation Approach to Piecewise Regression Models," Mathematics, MDPI, vol. 11(24), pages 1-23, December.
    20. Trevor Harris & Bo Li & J. Derek Tucker, 2022. "Scalable multiple changepoint detection for functional data sequences," Environmetrics, John Wiley & Sons, Ltd., vol. 33(2), March.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.