
Multimodal Human Recognition in Significantly Low Illumination Environment Using Modified EnlightenGAN

Author

Listed:
  • Ja Hyung Koo

    (Division of Electronics and Electrical Engineering, Dongguk University, 30 Pildong-ro 1-gil, Jung-gu, Seoul 04620, Korea)

  • Se Woon Cho

    (Division of Electronics and Electrical Engineering, Dongguk University, 30 Pildong-ro 1-gil, Jung-gu, Seoul 04620, Korea)

  • Na Rae Baek

    (Division of Electronics and Electrical Engineering, Dongguk University, 30 Pildong-ro 1-gil, Jung-gu, Seoul 04620, Korea)

  • Kang Ryoung Park

    (Division of Electronics and Electrical Engineering, Dongguk University, 30 Pildong-ro 1-gil, Jung-gu, Seoul 04620, Korea)

Abstract

Human recognition in indoor environments occurs both during the day and at night. During the day, human recognition suffers performance degradation owing to blur generated when a camera captures a person’s image. At night, however, it is difficult to obtain clear images of a person without light, and the input images are very noisy owing to the properties of camera sensors in low-illumination environments. Studies have been conducted in the past on face recognition in low-illumination environments; however, there is a lack of research on face- and body-based human recognition in very low illumination environments. To solve these problems, this study proposes a modified enlighten generative adversarial network (modified EnlightenGAN) in which a very low illumination image is converted to a normal-illumination image, and the matching scores of deep convolutional neural network (CNN) features of the face and body in the converted image are combined with score-level fusion for recognition. The two databases used in this study are the Dongguk face and body database version 3 (DFB-DB3) and the ChokePoint open dataset. The results of the experiments conducted using the two databases show that the human verification accuracy (equal error rate (EER)) and identification accuracy (rank-1 genuine acceptance rate (GAR)) of the proposed method were 7.291% and 92.67% for DFB-DB3 and 10.59% and 87.78% for the ChokePoint dataset, respectively. Accordingly, the performance of the proposed method was better than that of previous methods.
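
The abstract describes a two-stage pipeline: brighten the very low illumination image with the modified EnlightenGAN, then extract deep CNN features from the face and body regions of the enhanced image and combine their matching scores by score-level fusion. The Python sketch below illustrates only the score-level fusion and verification step, under stated assumptions: the cosine similarity measure, the equal fusion weight, the acceptance threshold, and the placeholder feature vectors are illustrative choices, not the authors' actual networks or parameters.

    # Minimal sketch of the matching step described in the abstract:
    # features are assumed to come from CNNs applied to the face and body
    # regions of the enhanced (brightened) image. The fusion weight and
    # threshold are hypothetical placeholders, not values from the paper.
    import numpy as np

    def cosine_score(a: np.ndarray, b: np.ndarray) -> float:
        """Cosine similarity between two feature vectors (higher = more similar)."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def fuse_scores(face_score: float, body_score: float, w: float = 0.5) -> float:
        """Weighted-sum score-level fusion; w would be tuned on validation data."""
        return w * face_score + (1.0 - w) * body_score

    def verify(enroll: dict, probe: dict, threshold: float = 0.7, w: float = 0.5) -> bool:
        """Accept the probe identity if the fused score exceeds the threshold."""
        s_face = cosine_score(enroll["face"], probe["face"])
        s_body = cosine_score(enroll["body"], probe["body"])
        return fuse_scores(s_face, s_body, w) >= threshold

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        gallery = {"face": rng.normal(size=512), "body": rng.normal(size=512)}
        # A genuine probe would yield features correlated with the gallery;
        # here we simply add small noise to the gallery vectors for illustration.
        probe = {k: v + 0.05 * rng.normal(size=512) for k, v in gallery.items()}
        print("verified:", verify(gallery, probe))

A weighted sum is one common form of score-level fusion; the paper's exact fusion rule, weights, and score normalization are not reproduced here.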

Suggested Citation

  • Ja Hyung Koo & Se Woon Cho & Na Rae Baek & Kang Ryoung Park, 2021. "Multimodal Human Recognition in Significantly Low Illumination Environment Using Modified EnlightenGAN," Mathematics, MDPI, vol. 9(16), pages 1-43, August.
  • Handle: RePEc:gam:jmathe:v:9:y:2021:i:16:p:1934-:d:614052

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/9/16/1934/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/9/16/1934/
    Download Restriction: no

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Dat Tien Nguyen & Se Hyun Nam & Ganbayar Batchuluun & Muhammad Owais & Kang Ryoung Park, 2022. "An Ensemble Classification Method for Brain Tumor Images Using Small Training Data," Mathematics, MDPI, vol. 10(23), pages 1-30, December.
    2. Ja Hyung Koo & Se Woon Cho & Na Rae Baek & Young Won Lee & Kang Ryoung Park, 2022. "A Survey on Face and Body Based Human Recognition Robust to Image Blurring and Low Illumination," Mathematics, MDPI, vol. 10(9), pages 1-15, May.
    3. Tuyen Danh Pham & Young Won Lee & Chanhum Park & Kang Ryoung Park, 2022. "Deep Learning-Based Detection of Fake Multinational Banknotes in a Cross-Dataset Environment Utilizing Smartphone Cameras for Assisting Visually Impaired Individuals," Mathematics, MDPI, vol. 10(9), pages 1-27, May.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:9:y:2021:i:16:p:1934-:d:614052. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.