
A comprehensive assessment of deep learning techniques for eye gaze estimation: A comparative performance analysis

Author

Listed:
  • Hao Xu

  • Masitah Ghazali

  • Nur Zuraifah Syazrah Othman

Abstract

The study adapts Convolutional Neural Networks (CNNs) to eye gaze estimation and prediction, addressing the limitations of traditional gaze-tracking systems, which are often confined to controlled environments and dependent on expensive equipment. The authors propose a dual-task approach in which gaze estimation and gaze prediction are treated as separate tasks, so that each process can be improved in a more granular way. Using the MPII Gaze dataset, collected under real-life conditions, CNN architectures such as YOLOv3, SSD, and Mask R-CNN are evaluated and compared on accuracy, precision, recall, and F1-measure. Spatiotemporal sequences of eye images are used to enhance the predictive power of individual frames, allowing the model to identify temporal patterns and improve estimation through gaze continuity. Additional measures to increase model robustness and responsiveness include image normalization and region-of-interest extraction during preprocessing, as well as a blink detection mechanism based on geometric features. The results demonstrate that deep learning models can effectively improve gaze estimation accuracy under varying lighting conditions, head movements, and user diversity, making the technology applicable in fields such as education, medicine, automotive safety, adaptive and assistive technologies, and human-computer interaction. Overall, this research contributes to the development of scalable, adaptable, and precise gaze-tracking algorithms built on state-of-the-art deep learning methods, offering valuable insights for researchers in the field.
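
As a concrete illustration of the comparison criteria named in the abstract, the short Python sketch below computes accuracy, precision, recall, and F1-measure from binary confusion counts. The function name and the example counts are hypothetical and are not taken from the paper.

    def classification_metrics(tp, fp, fn, tn):
        """Return accuracy, precision, recall and F1 from confusion counts."""
        total = tp + fp + fn + tn
        accuracy = (tp + tn) / total if total else 0.0
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if (precision + recall) else 0.0)
        return accuracy, precision, recall, f1

    # Example: hypothetical detection counts for one model on a validation split.
    acc, prec, rec, f1 = classification_metrics(tp=850, fp=60, fn=90, tn=1000)
    print(f"accuracy={acc:.3f} precision={prec:.3f} recall={rec:.3f} F1={f1:.3f}")

The abstract also mentions a blink detection mechanism based on geometric features without specifying it. One common geometric formulation is the eye aspect ratio (EAR) computed over six eye-contour landmarks; the sketch below assumes that formulation and an illustrative threshold of 0.2, which may differ from the authors' actual mechanism.

    import math

    def eye_aspect_ratio(landmarks):
        """landmarks: six (x, y) eye-contour points p1..p6 in the usual EAR order."""
        p1, p2, p3, p4, p5, p6 = landmarks

        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])

        # EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|): vertical eye openings over eye width.
        return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

    def is_blinking(landmarks, threshold=0.2):
        # A frame is flagged as a blink when the eye aspect ratio collapses.
        return eye_aspect_ratio(landmarks) < threshold

    # Example with made-up landmark coordinates for an open eye.
    open_eye = [(0, 5), (3, 8), (7, 8), (10, 5), (7, 2), (3, 2)]
    print(eye_aspect_ratio(open_eye), is_blinking(open_eye))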

Suggested Citation

  • Hao Xu & Masitah Ghazali & Nur Zuraifah Syazrah Othman, 2025. "A comprehensive assessment of deep learning techniques for eye gaze estimation: A comparative performance analysis," Journal of Asian Scientific Research, Asian Economic and Social Society, vol. 15(3), pages 510-524.
  • Handle: RePEc:asi:joasrj:v:15:y:2025:i:3:p:510-524:id:5596

    Download full text from publisher

    File URL: https://archive.aessweb.com/index.php/5003/article/view/5596/8434
    Download Restriction: no



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:asi:joasrj:v:15:y:2025:i:3:p:510-524:id:5596. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Robert Allen (email available below). General contact details of provider: https://archive.aessweb.com/index.php/5003/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.