
Multimodal feature fusion-based graph convolutional networks for Alzheimer’s disease stage classification using F-18 florbetaben brain PET images and clinical indicators

Authors

Listed:
  • Gyu-Bin Lee
  • Young-Jin Jeong
  • Do-Young Kang
  • Hyun-Jin Yun
  • Min Yoon

Abstract

Alzheimer’s disease (AD), the most prevalent degenerative brain disease associated with dementia, requires early diagnosis so that appropriate management and treatment can slow the worsening of symptoms. Recent studies on AD stage classification increasingly use multimodal data. However, few studies have applied graph neural networks to multimodal data comprising F-18 florbetaben (FBB) amyloid brain positron emission tomography (PET) images and clinical indicators. The objective of this study was to demonstrate the effectiveness of the graph convolutional network (GCN) for AD stage classification using multimodal data, specifically FBB PET images and clinical indicators, collected from Dong-A University Hospital (DAUH) and the Alzheimer’s Disease Neuroimaging Initiative (ADNI). The effectiveness of the GCN was demonstrated through comparisons with the support vector machine, random forest, and multilayer perceptron across four classification tasks (normal control (NC) vs. AD, NC vs. mild cognitive impairment (MCI), MCI vs. AD, and NC vs. MCI vs. AD). As input, all models received the same combined feature vectors, created by concatenating the PET imaging feature vectors extracted by a 3D dense convolutional network with non-imaging feature vectors consisting of clinical indicators, using a multimodal feature fusion method. An adjacency matrix for the population graph was constructed using cosine similarity or the Euclidean distance between subjects’ PET imaging feature vectors and/or non-imaging feature vectors. The usage ratio of these different data modalities and the edge-assignment threshold were tuned as hyperparameters. In this study, GCN-CS-com and GCN-ED-com were the GCN models that received the adjacency matrix constructed using cosine similarity (CS) and the Euclidean distance (ED), respectively, between the subjects’ PET imaging feature vectors and non-imaging feature vectors. Under modified nested cross-validation, GCN-CS-com and GCN-ED-com achieved average test accuracies of 98.40%, 94.58%, 94.01%, 82.63% and 99.68%, 93.82%, 93.88%, 90.43%, respectively, on the four classification tasks using the DAUH dataset, outperforming the other models. Furthermore, GCN-CS-com and GCN-ED-com achieved average test accuracies of 76.16% and 90.11%, respectively, for NC vs. MCI vs. AD classification on the ADNI dataset, again outperforming the other models. These results demonstrate that the GCN can be an effective model for AD stage classification using multimodal data.
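The graph-construction step described in the abstract (feature fusion by concatenation, then thresholded cosine-similarity or Euclidean-distance edge assignment) is concise enough to sketch. The following Python/NumPy sketch is not the authors' code: the function names, the default threshold value, and the feature shapes are illustrative assumptions, and the paper's modality usage-ratio hyperparameter is omitted for brevity. It also includes the standard symmetric GCN adjacency normalization for completeness.

    import numpy as np

    def fuse_features(pet_feats, clinical_feats):
        # Concatenate per-subject PET imaging features (e.g., from a 3D
        # dense convolutional network encoder) with clinical-indicator
        # features; both arrays have shape (n_subjects, dim).
        return np.concatenate([pet_feats, clinical_feats], axis=1)

    def build_adjacency(feats, metric="cosine", threshold=0.5):
        # Population graph: connect subjects i and j when their feature
        # vectors are similar enough. The threshold is treated as a
        # tunable hyperparameter, mirroring the paper's edge-assignment
        # threshold; 0.5 here is an arbitrary placeholder.
        if metric == "cosine":
            norms = np.linalg.norm(feats, axis=1, keepdims=True)
            unit = feats / np.clip(norms, 1e-12, None)
            sim = unit @ unit.T
            adj = (sim >= threshold).astype(float)
        elif metric == "euclidean":
            dist = np.linalg.norm(feats[:, None, :] - feats[None, :, :], axis=2)
            adj = (dist <= threshold).astype(float)
        else:
            raise ValueError(f"unknown metric: {metric}")
        np.fill_diagonal(adj, 0.0)  # drop self-edges; normalization re-adds them
        return adj

    def normalize_adjacency(adj):
        # Symmetric GCN normalization D^{-1/2} (A + I) D^{-1/2},
        # adding self-loops before computing node degrees.
        a_hat = adj + np.eye(adj.shape[0])
        d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
        return a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    # Usage with random stand-in features (shapes are illustrative):
    pet = np.random.randn(8, 16)
    clin = np.random.randn(8, 4)
    A = normalize_adjacency(build_adjacency(fuse_features(pet, clin)))

Thresholding the similarity matrix yields a sparse, unweighted population graph in which GCN message passing only mixes information between sufficiently similar subjects, which is the design choice the abstract attributes to the GCN-CS and GCN-ED variants.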

Suggested Citation

  • Gyu-Bin Lee & Young-Jin Jeong & Do-Young Kang & Hyun-Jin Yun & Min Yoon, 2024. "Multimodal feature fusion-based graph convolutional networks for Alzheimer’s disease stage classification using F-18 florbetaben brain PET images and clinical indicators," PLOS ONE, Public Library of Science, vol. 19(12), pages 1-23, December.
  • Handle: RePEc:plo:pone00:0315809
    DOI: 10.1371/journal.pone.0315809

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0315809
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0315809&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0315809?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0315809. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.