
Gender Recognition Based on Hand Thermal Characteristic

Author

Listed:
  • Katerina Prihodova

Abstract

Automatic gender recognition is one of the most frequently addressed tasks in computer vision. It is useful for analysing human behaviour, intelligent monitoring and security. In this article, gender is recognized from multispectral images of the hand. Images of the hand (palm and back) are acquired in the visible and thermal spectra, and a fusion of the two is then performed. Some studies report that male and female hands can be distinguished by certain geometric features. The aim of this article is to determine whether gender can be recognized from the thermal characteristics of the hand and, at the same time, to find the best architecture for this recognition. The article compares several algorithms suitable for the task. The convolutional neural network (CNN) AlexNet is used for feature extraction; a support vector machine, linear discriminant analysis, a naive Bayes classifier and neural networks are used for the subsequent classification. CNNs alone are also used for both feature extraction and classification. All of these methods achieve high gender-recognition accuracy, but the most accurate are the convolutional neural networks VGG-16 and VGG-19, which reach a test-set accuracy of 94.9% for the palm and 89.9% for the back of the hand. The comparative experiments have yielded promising results and show that multispectral (thermal and visible) hand images can be useful for gender recognition.
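The article does not include code, but the pipeline the abstract describes (fuse a visible and a thermal hand image, extract features with a pretrained AlexNet, classify with a support vector machine) can be sketched as follows. This is a minimal illustration, not the author's implementation: it assumes PyTorch/torchvision and scikit-learn, the per-pixel weighted-average fusion is a stand-in for the paper's unspecified fusion method, and the file names, labels and fusion weight are hypothetical.

```python
# Minimal sketch of the described pipeline (not the author's code):
# fuse visible + thermal hand images, extract AlexNet features, train an SVM.
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image
from sklearn.svm import SVC

# Pretrained AlexNet with the final 1000-way layer removed, so the network
# yields a 4096-dimensional feature vector per image (fc7 output).
alexnet = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
feature_extractor = torch.nn.Sequential(
    alexnet.features,
    alexnet.avgpool,
    torch.nn.Flatten(),
    *list(alexnet.classifier.children())[:-1],  # keep fc6/fc7, drop fc8
).eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def fuse(visible_path, thermal_path, alpha=0.5):
    """Blend visible and thermal images pixel-wise (placeholder fusion)."""
    vis = Image.open(visible_path).convert("RGB")
    thr = Image.open(thermal_path).convert("RGB").resize(vis.size)
    blended = (alpha * np.asarray(vis, np.float32)
               + (1 - alpha) * np.asarray(thr, np.float32))
    return Image.fromarray(blended.astype(np.uint8))

def extract_features(image):
    """Run one fused image through AlexNet and return its feature vector."""
    with torch.no_grad():
        return feature_extractor(preprocess(image).unsqueeze(0)).squeeze(0).numpy()

# Hypothetical training data: (visible, thermal) pairs; 0 = female, 1 = male.
pairs = [("palm_vis_01.png", "palm_thr_01.png"),
         ("palm_vis_02.png", "palm_thr_02.png")]
labels = [0, 1]
X = [extract_features(fuse(v, t)) for v, t in pairs]

clf = SVC(kernel="linear")  # one of the classifiers compared in the article
clf.fit(X, labels)
```

In the study, VGG-16 and VGG-19 used end to end gave the best accuracy; swapping models.alexnet for models.vgg16 in the extractor above would follow the same pattern.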

Suggested Citation

  • Katerina Prihodova, 2022. "Gender Recognition Based on Hand Thermal Characteristic," Acta Informatica Pragensia, Prague University of Economics and Business, vol. 2022(2), pages 205-217.
  • Handle: RePEc:prg:jnlaip:v:2022:y:2022:i:2:id:180:p:205-217
    DOI: 10.18267/j.aip.180

    Download full text from publisher

    File URL: http://aip.vse.cz/doi/10.18267/j.aip.180.html
    Download Restriction: free of charge

    File URL: http://aip.vse.cz/doi/10.18267/j.aip.180.pdf
    Download Restriction: free of charge

    File URL: https://libkey.io/10.18267/j.aip.180?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:prg:jnlaip:v:2022:y:2022:i:2:id:180:p:205-217. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Stanislav Vojir (email available below). General contact details of provider: https://edirc.repec.org/data/uevsecz.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.