Printed from https://ideas.repec.org/a/plo/pone00/0323536.html

Image based fog density estimation

Authors

  • Mingrui Dai
  • Weifeng Shi
  • Guohua Li

Abstract

Although image-based fog density estimation is convenient and low-cost, the accuracy of such methods still needs improvement, and further research on accuracy evaluation is encouraged. To improve both the accuracy and the computational efficiency of fog density estimation, we first construct three image features, based respectively on the image's dark channel information, its saturation information, and the proportion of gray noise points. We then fuse these features to estimate the fog density of an image. In addition, we construct two indicators to evaluate the accuracy of fog density estimation methods: the sequential error indicator and the proportional error indicator, both calculated from fog image sequences with known density values. These indicators make it possible to evaluate any fog density estimation method in terms of its ability to preserve the order and the ratios of density values. Experimental results show that the proposed method estimates fog density effectively and performs best among eight recent image-based fog density estimation methods, and that each of the three features contributes significantly to the estimation. The method is demonstrated on fog density analysis of indoor and outdoor surveillance videos. The source code is available at https://github.com/Dai-MR/ImageFogDensityEsitmation.
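The paper's exact feature definitions and fusion weights are in the full text; as a rough sketch of the intuition behind two of the three features, the snippet below computes a dark-channel feature (foggy images have a bright dark channel) and a saturation feature (fog washes out color), then combines them with illustrative weights. The gray-noise-point feature and the paper's actual fusion scheme are omitted; all function names and weights here are assumptions, not the authors' implementation.

```python
import numpy as np

def dark_channel(img, patch=15):
    """Per-pixel min over RGB, then a min-filter over a local patch.
    img: H x W x 3 float array with values in [0, 1]."""
    m = img.min(axis=2)
    pad = patch // 2
    padded = np.pad(m, pad, mode='edge')
    out = np.empty_like(m)
    for i in range(m.shape[0]):
        for j in range(m.shape[1]):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

def saturation(img):
    """HSV-style saturation: 1 - min/max per pixel (0 where the pixel is black)."""
    mx = img.max(axis=2)
    mn = img.min(axis=2)
    return np.where(mx > 0, 1.0 - mn / np.maximum(mx, 1e-6), 0.0)

def fog_density_score(img, w=(0.5, 0.5)):
    """Heuristic fusion (weights are illustrative, not the paper's):
    a high score means a brighter dark channel and lower saturation,
    both of which indicate denser fog."""
    f_dark = dark_channel(img).mean()
    f_sat = 1.0 - saturation(img).mean()
    return w[0] * f_dark + w[1] * f_sat
```

On a uniformly gray, low-contrast frame this score comes out high, while a well-saturated clear frame scores low, matching the qualitative behavior the abstract describes.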

Suggested Citation

  • Mingrui Dai & Weifeng Shi & Guohua Li, 2025. "Image based fog density estimation," PLOS ONE, Public Library of Science, vol. 20(6), pages 1-21, June.
  • Handle: RePEc:plo:pone00:0323536
    DOI: 10.1371/journal.pone.0323536

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0323536
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0323536&type=printable
    Download Restriction: no




    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.