
Deciphering the Efficacy of No-Attention Architectures in Computed Tomography Image Classification: A Paradigm Shift

Author

Listed:
  • Salha M. Alzahrani

    (Department of Computer Science, College of Computers and Information Technology, Taif University, Taif 21944, Saudi Arabia)

Abstract

The burgeoning domain of medical imaging has witnessed a paradigm shift with the integration of AI, particularly deep learning, enhancing diagnostic precision and expediting the analysis of Computed Tomography (CT) images. This study introduces an innovative Multilayer Perceptron-driven model, DiagnosticMLP, which sidesteps the computational intensity of attention-based mechanisms, favoring a no-attention architecture that leverages Fourier Transforms for global information capture and spatial gating units for local feature emphasis. The study's methodology encompasses a sophisticated augmentation and patching strategy at the input level, followed by a series of MLP blocks designed to extract hierarchical features and spatial relationships, culminating in a global average pooling layer before classification. Evaluated against state-of-the-art MLP-based models, including MLP-Mixer, FNet, gMLP, and ResMLP, across diverse and extensive CT datasets comprising abdominal and chest scans, DiagnosticMLP demonstrated a remarkable ability to converge efficiently, with competitive accuracy, F1 scores, and AUC metrics. Notably, on datasets featuring kidney and abdomen disorders, the model showcased superior generalization capabilities, underpinned by a design that addresses the complexity inherent in CT imaging. The findings on accuracy and precision-recall balance position DiagnosticMLP as an alternative that outperforms attention-reliant models, paving the way for streamlined, efficient, and scalable AI tools in medical diagnostics and reinforcing the potential of AI-augmented precision medicine without dependency on attention-based architectures.
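
The abstract outlines the model's pipeline: patch-based input, MLP blocks that pair Fourier-Transform token mixing (for global context) with spatial gating units (for local emphasis), and global average pooling before classification. The PyTorch sketch below is a hypothetical, minimal reading of that design; class names such as DiagnosticMLPSketch, the layer dimensions, and the hyperparameters are illustrative assumptions and do not reproduce the authors' implementation.

```python
# Minimal sketch of a no-attention MLP architecture in the spirit of the abstract:
# an FNet-style Fourier token-mixing step plus a gMLP-style spatial gating unit,
# ending in global average pooling and a linear classifier. All names and sizes
# are illustrative assumptions, not the authors' released code.
import torch
import torch.nn as nn


class FourierMixing(nn.Module):
    """Mix information across patches with a 2D FFT (real part only), as in FNet."""
    def forward(self, x):                      # x: (batch, num_patches, dim)
        return torch.fft.fft2(x, dim=(-2, -1)).real


class SpatialGatingUnit(nn.Module):
    """Gate half of the channels with a learned projection over the patch axis (gMLP-style)."""
    def __init__(self, dim, num_patches):
        super().__init__()
        self.norm = nn.LayerNorm(dim // 2)
        self.proj = nn.Linear(num_patches, num_patches)

    def forward(self, x):                      # x: (batch, num_patches, dim)
        u, v = x.chunk(2, dim=-1)              # split channels into content and gate
        v = self.norm(v)
        v = self.proj(v.transpose(1, 2)).transpose(1, 2)  # mix along the patch axis
        return u * v                           # element-wise gating


class NoAttentionBlock(nn.Module):
    """One residual block: Fourier mixing for global context, gated MLP for local emphasis."""
    def __init__(self, dim, num_patches, expansion=4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.fourier = FourierMixing()
        self.norm2 = nn.LayerNorm(dim)
        self.channel_mlp = nn.Sequential(
            nn.Linear(dim, dim * expansion),
            nn.GELU(),
            SpatialGatingUnit(dim * expansion, num_patches),
            nn.Linear(dim * expansion // 2, dim),
        )

    def forward(self, x):
        x = x + self.fourier(self.norm1(x))      # global information capture
        x = x + self.channel_mlp(self.norm2(x))  # local feature emphasis
        return x


class DiagnosticMLPSketch(nn.Module):
    """Patch embedding -> stacked no-attention blocks -> global average pooling -> classifier."""
    def __init__(self, image_size=224, patch_size=16, in_chans=1,
                 dim=256, depth=8, num_classes=4):
        super().__init__()
        num_patches = (image_size // patch_size) ** 2
        self.patch_embed = nn.Conv2d(in_chans, dim, kernel_size=patch_size, stride=patch_size)
        self.blocks = nn.Sequential(*[NoAttentionBlock(dim, num_patches) for _ in range(depth)])
        self.norm = nn.LayerNorm(dim)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):                      # x: (batch, channels, H, W) CT slice
        x = self.patch_embed(x).flatten(2).transpose(1, 2)  # (batch, num_patches, dim)
        x = self.blocks(x)
        x = self.norm(x).mean(dim=1)           # global average pooling over patches
        return self.head(x)


if __name__ == "__main__":
    model = DiagnosticMLPSketch()
    logits = model(torch.randn(2, 1, 224, 224))  # two grayscale CT slices
    print(logits.shape)                          # torch.Size([2, 4])
```

The choice of a parameter-free FFT for token mixing and a learned patch-axis gate is what lets such a block capture both global and local structure without any attention layers, which is the efficiency argument the abstract makes.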

Suggested Citation

  • Salha M. Alzahrani, 2024. "Deciphering the Efficacy of No-Attention Architectures in Computed Tomography Image Classification: A Paradigm Shift," Mathematics, MDPI, vol. 12(5), pages 1-27, February.
  • Handle: RePEc:gam:jmathe:v:12:y:2024:i:5:p:689-:d:1346985
    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/12/5/689/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/12/5/689/
    Download Restriction: no

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:12:y:2024:i:5:p:689-:d:1346985. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.