Authors
Listed:
- Zhangjing Zheng
- Xixia Huang
- Le Wang
Abstract
Underwater low-light images suffer from low contrast, blurred details, and color distortion, and low-illumination enhancement often introduces further problems such as artifacts, loss of edge detail, and amplified noise. In this paper, we propose an underwater low-light enhancement algorithm based on a U-shaped generative adversarial network, combined with a bright channel prior and an attention mechanism, to address these problems. To counter the uneven edges and detail loss seen in traditionally enhanced images, we propose a two-channel fusion technique for the input channel. To address brightness, texture, and color distortion in enhanced images, we propose a feature extraction technique based on the attention mechanism. To suppress noise in the enhanced output, we propose a multi-loss function to constrain the network. The method applies broadly to deep underwater scenes and can support target detection or biological species identification in underwater low-light environments. Enhancement experiments on underwater low-light images show that the proposed method effectively mitigates their low contrast, blurred details, and color distortion. Finally, we performed extensive comparison experiments and ablation studies of the proposed method. The experimental results show that the proposed method is best both in terms of human visual experience and on underwater image quality evaluation indices.
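The bright channel prior mentioned in the abstract is the dual of the well-known dark channel prior: for each pixel, take the maximum intensity over all color channels within a local patch. The sketch below illustrates that standard definition only; it is an assumption for illustration, not the authors' implementation, and the function name, patch size, and toy image are all hypothetical.

```python
# Minimal sketch of a bright channel computation, assuming an
# HxWx3 float image with values in [0, 1].
import numpy as np

def bright_channel(img: np.ndarray, patch: int = 15) -> np.ndarray:
    """Per-pixel max over the color channels, followed by a local
    max filter over a `patch` x `patch` window."""
    # Maximum across the three color channels at each pixel.
    max_rgb = img.max(axis=2)
    h, w = max_rgb.shape
    pad = patch // 2
    # Edge-pad so the window is defined at the image borders.
    padded = np.pad(max_rgb, pad, mode="edge")
    out = np.empty_like(max_rgb)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].max()
    return out

# Toy usage: a single bright pixel in a dark image spreads its value
# over the surrounding patch in the bright channel.
img = np.zeros((8, 8, 3))
img[4, 4, 1] = 0.9          # one bright green pixel
bc = bright_channel(img, patch=3)
```

In practice the inner loop would be replaced by a vectorized maximum filter (e.g. a morphological dilation); the explicit loop here just keeps the definition readable.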
Suggested Citation
Zhangjing Zheng & Xixia Huang & Le Wang, 2023.
"Underwater low-light enhancement network based on bright channel prior and attention mechanism,"
PLOS ONE, Public Library of Science, vol. 18(2), pages 1-18, February.
Handle:
RePEc:plo:pone00:0281093
DOI: 10.1371/journal.pone.0281093