Abstract
Precise modern clinical decision-making requires visual evidence that shows anatomical structures and functional systems simultaneously. Fusing CT, MRI, PET, ultrasound, and related modality variants yields a diagnostic assessment deeper than any standalone scan can provide. Recent advances in machine learning show how convolutional, encoder–decoder, and transformer architectures combine to produce high-quality multichannel images that benefit cancer margin identification, brain injury diagnosis, cardiovascular evaluation, and urgent treatment applications. Critical preprocessing stages (registration, intensity normalization, and noise suppression) standardize the input data, while geometric, elastic, and GAN-based augmentations address data scarcity. Results on BraTS, CHAOS, and other benchmarks demonstrate improved boundary identification and lesion recognition, as measured by IoU, Dice, and Hausdorff distance. A comprehensive framework unites CNN, U-Net, GAN, and Vision Transformer modules through systematic attention mechanisms and synthetic data generation pathways, targeting secure real-time medical navigation systems and precise radiotherapy tools. Benchmarking showed average Dice-score improvements of 7–12% over traditional wavelet methods, and radiologists rated the system's lesion boundary delineation favorably. Together, the survey analysis and proposed innovations position deep-learning-driven fusion at the core of next-generation diagnostic imaging.
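The segmentation metrics the abstract names (Dice, IoU, Hausdorff distance) can be sketched for binary masks as below. This is an illustrative NumPy implementation under the standard definitions of these metrics, not code from the chapter itself:

```python
import numpy as np

def dice_coefficient(pred, target):
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    denom = pred.sum() + target.sum()
    return 2.0 * np.logical_and(pred, target).sum() / denom if denom else 1.0

def iou(pred, target):
    """IoU (Jaccard index) = |A ∩ B| / |A ∪ B|."""
    pred, target = pred.astype(bool), target.astype(bool)
    union = np.logical_or(pred, target).sum()
    return np.logical_and(pred, target).sum() / union if union else 1.0

def hausdorff_distance(pred, target):
    """Symmetric Hausdorff distance between the two foreground point sets."""
    a, b = np.argwhere(pred), np.argwhere(target)
    # Pairwise Euclidean distances between all foreground coordinates.
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Toy 2D masks: the prediction covers 2 of the target's 3 foreground pixels.
target = np.zeros((4, 4), dtype=np.uint8)
target[1, 1:4] = 1
pred = np.zeros((4, 4), dtype=np.uint8)
pred[1, 1:3] = 1

print(dice_coefficient(pred, target))  # 0.8
print(iou(pred, target))               # 0.666...
print(hausdorff_distance(pred, target))  # 1.0
```

In practice, 3D medical benchmarks such as BraTS typically report the 95th-percentile Hausdorff distance (HD95) rather than the maximum, since the maximum is sensitive to single outlier voxels.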
Suggested Citation
Chaitanya Krishna Kasaraneni & Keerthi Guttikonda & Revanth Madamala, 2025.
"Multi-modality Medical (CT, MRI, Ultrasound Etc.) Image Fusion Using Machine Learning/Deep Learning,"
Springer Series in Reliability Engineering,
Springer.
Handle:
RePEc:spr:ssrchp:978-3-031-98728-1_16
DOI: 10.1007/978-3-031-98728-1_16