
SinGAN-Seg: Synthetic training data generation for medical image segmentation

Author

Listed:
  • Vajira Thambawita
  • Pegah Salehi
  • Sajad Amouei Sheshkal
  • Steven A Hicks
  • Hugo L Hammer
  • Sravanthi Parasa
  • Thomas de Lange
  • Pål Halvorsen
  • Michael A Riegler

Abstract

Analyzing medical data to find abnormalities is a time-consuming and costly task, particularly for rare abnormalities, and it requires tremendous effort from medical experts. Artificial intelligence has therefore become a popular supportive tool for doctors in the automatic processing of medical data. However, the machine learning models behind these tools depend heavily on the data used to train them, and large medical datasets are difficult to obtain due to privacy concerns, expensive and time-consuming annotation, and a general scarcity of samples for infrequent lesions. In this study, we present a novel synthetic data generation pipeline, called SinGAN-Seg, which produces synthetic medical images with corresponding segmentation masks from a single training image. Unlike traditional generative adversarial networks (GANs), our model needs only one image and its ground-truth mask to train. We also show that the pipeline can produce alternative artificial segmentation datasets with corresponding ground-truth masks when real datasets cannot be shared. Qualitative and quantitative comparisons between real and synthetic data show that the style-transfer step in our pipeline significantly improves the quality of the generated data and that our method outperforms other state-of-the-art GANs at preparing synthetic images when the training dataset is small. By training UNet++ on both real data and synthetic data generated by the SinGAN-Seg pipeline, we show that models trained on synthetic data perform very close to models trained on real data when both datasets contain a considerable amount of training data.
In contrast, when training datasets are small, synthetic data generated by the SinGAN-Seg pipeline improves the performance of segmentation models. All experiments were performed on an open dataset, and the code is publicly available on GitHub.
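The core idea of generating an image and its segmentation mask jointly from a single training example can be sketched by stacking the RGB image and the binary mask into one four-channel array, training a single-image GAN (such as SinGAN) on that array, and splitting generated samples back apart. The sketch below is a minimal, hypothetical illustration of that pack/unpack step only, not the authors' published implementation; the helper names are made up for this example.

```python
import numpy as np

def pack_image_and_mask(image, mask):
    """Stack an RGB image (H, W, 3) and a binary mask (H, W) into a
    single (H, W, 4) array, so a single-image GAN can model image and
    mask jointly (hypothetical helper)."""
    return np.concatenate([image, mask[..., None]], axis=-1)

def unpack_generated(sample, threshold=0.5):
    """Split a generated (H, W, 4) sample back into an RGB image and a
    thresholded binary mask."""
    image = sample[..., :3]
    mask = (sample[..., 3] > threshold).astype(np.uint8)
    return image, mask

# Toy example: a 4x4 image with a centered square mask.
image = np.random.rand(4, 4, 3)
mask = np.zeros((4, 4))
mask[1:3, 1:3] = 1.0

packed = pack_image_and_mask(image, mask)
out_img, out_mask = unpack_generated(packed)
print(packed.shape)    # (4, 4, 4)
print(out_mask.sum())  # 4
```

In a full pipeline, the GAN would be trained on `packed` and its generated samples would pass through `unpack_generated`; thresholding the fourth channel keeps the mask binary even though the generator outputs continuous values.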

Suggested Citation

  • Vajira Thambawita & Pegah Salehi & Sajad Amouei Sheshkal & Steven A Hicks & Hugo L Hammer & Sravanthi Parasa & Thomas de Lange & Pål Halvorsen & Michael A Riegler, 2022. "SinGAN-Seg: Synthetic training data generation for medical image segmentation," PLOS ONE, Public Library of Science, vol. 17(5), pages 1-24, May.
  • Handle: RePEc:plo:pone00:0267976
    DOI: 10.1371/journal.pone.0267976

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0267976
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0267976&type=printable
    Download Restriction: no





      Corrections

      All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0267976. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

      For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

      Please note that corrections may take a couple of weeks to filter through the various RePEc services.

      IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.