Authors:
- Jamal Zraqou
- Riyad Alrousan
- Bilal Sowan
- Jawad Alkhatib
Abstract
This research addresses the information-bottleneck challenge in vision-transformer-based image super-resolution, where feature-map information attenuates in deeper network layers, degrading model performance. LITRL, the Layer-Interconnected Transformer with Residual Links, stabilizes information flow through dense residual connections between layers, preventing the loss of spatial information. The methodology integrates the Swin Transformer architecture with new interconnection schemes to preserve vital spatial features throughout the network. Experimental results show that the LITRL-based method outperforms prior approaches on standard benchmark datasets (Set5, Set14, BSD100, Urban100, Manga109) in both quantitative (PSNR, SSIM) and qualitative evaluation. At 4× scale, LITRL attains PSNR/SSIM of 40.37/0.9628 on Set5 and 35.70/0.9408 on Urban100, far exceeding comparable methods. The proposed LITRL model substantially reduces the information bottleneck in transformer-based super-resolution: its dense residual connections retain fundamental spatial information, yielding sharper images with more natural textures and fewer artefacts. Practical implications: LITRL's strength in reconstructing complex textures and structures makes it particularly useful for tasks where high-fidelity image enhancement is imperative, such as medical imaging, satellite-image analysis, and digital-content creation, while maintaining reasonable computational efficiency.
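The paper's implementation is not reproduced here, but the core idea of the abstract, dense residual links that feed every layer the aggregate of all earlier feature maps so shallow spatial information survives with depth, can be contrasted with a plain sequential stack in a minimal NumPy sketch. All names, and the toy linear "block" standing in for a full Swin transformer layer, are assumptions for illustration, not the authors' LITRL code:

```python
import numpy as np

def block(x, w):
    """Toy stand-in for one transformer layer: a linear map plus ReLU."""
    return np.maximum(w @ x, 0.0)

def forward_sequential(x, weights):
    """Plain stack: each layer sees only its predecessor's output,
    so shallow-layer detail can fade with depth (the bottleneck)."""
    for w in weights:
        x = block(x, w)
    return x

def forward_dense_residual(x, weights):
    """Dense residual links: every layer receives the sum of ALL
    earlier feature maps, so early spatial information persists."""
    feats = [x]
    for w in weights:
        feats.append(block(sum(feats), w))
    # The final feature aggregates everything produced along the way.
    return sum(feats)

rng = np.random.default_rng(0)
d = 8  # toy feature dimension
weights = [0.1 * rng.standard_normal((d, d)) for _ in range(4)]
x = rng.standard_normal(d)
y_plain = forward_sequential(x, weights)
y_dense = forward_dense_residual(x, weights)
```

In the dense variant the input `x` itself is a term of the final sum, so the original signal reaches the output unattenuated regardless of network depth, which is the mechanism the abstract credits for sharper reconstructions.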
Suggested Citation
Jamal Zraqou & Riyad Alrousan & Bilal Sowan & Jawad Alkhatib, 2025.
"Utilizing image super-resolution to overcome information bottlenecks in vision transformers,"
International Journal of Innovative Research and Scientific Studies, Innovative Research Publishing, vol. 8(3), pages 3734-3749.
Handle:
RePEc:aac:ijirss:v:8:y:2025:i:3:p:3734-3749:id:7383
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:aac:ijirss:v:8:y:2025:i:3:p:3734-3749:id:7383. See general information about how to correct material in RePEc.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Natalie Jean. General contact details of provider: https://ijirss.com/index.php/ijirss/ .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.