
Detection of eye contact with deep neural networks is as accurate as human experts

Authors

  • Eunji Chong (Georgia Institute of Technology)
  • Elysha Clark-Whitney (Weill Cornell Medicine)
  • Audrey Southerland (Georgia Institute of Technology)
  • Elizabeth Stubbs (Georgia Institute of Technology)
  • Chanel Miller (Georgia Institute of Technology)
  • Eliana L. Ajodan (Weill Cornell Medicine)
  • Melanie R. Silverman (Weill Cornell Medicine)
  • Catherine Lord (University of California)
  • Agata Rozga (Georgia Institute of Technology)
  • Rebecca M. Jones (Weill Cornell Medicine)
  • James M. Rehg (Georgia Institute of Technology)

Abstract

Eye contact is among the most fundamental means of social communication used by humans. Quantification of eye contact is valuable as a part of the analysis of social roles and communication skills, and for clinical screening. Estimating a subject’s looking direction is a challenging task, but eye contact can be effectively captured by a wearable point-of-view camera, which provides a unique viewpoint. While moments of eye contact from this viewpoint can be hand-coded, such a process tends to be laborious and subjective. In this work, we develop a deep neural network model to automatically detect eye contact in egocentric video. It is the first to achieve accuracy equivalent to that of human experts. We train a deep convolutional network using a dataset of 4,339,879 annotated images, consisting of 103 subjects with diverse demographic backgrounds, 57 of whom have a diagnosis of Autism Spectrum Disorder. The network achieves an overall precision of 0.936 and recall of 0.943 on 18 validation subjects, and its performance is on par with that of 10 trained human coders, who have a mean precision of 0.918 and a mean recall of 0.946. Our method will be instrumental in gaze behavior analysis by serving as a scalable, objective, and accessible tool for clinicians and researchers.
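The abstract frames eye contact detection as per-frame binary classification of faces in egocentric video, evaluated by precision and recall against expert human coders. As a rough, hypothetical sketch of that setup (not the authors' published implementation; the class name, ResNet-50 backbone, and 224x224 crop size are assumptions), in Python/PyTorch:

    # Hypothetical sketch: a binary eye-contact classifier over face crops,
    # plus the frame-level precision/recall quoted in the abstract.
    import torch
    import torch.nn as nn
    from torchvision import models

    class EyeContactNet(nn.Module):
        """Does this face crop show eye contact with the camera wearer?"""
        def __init__(self):
            super().__init__()
            backbone = models.resnet50(weights=None)  # assumed backbone choice
            backbone.fc = nn.Linear(backbone.fc.in_features, 1)
            self.backbone = backbone

        def forward(self, face_crops):                # (N, 3, 224, 224) crops
            # Sigmoid score per frame: estimated probability of eye contact.
            return torch.sigmoid(self.backbone(face_crops)).squeeze(1)

    def precision_recall(pred, truth):
        """Frame-level metrics; pred and truth are 0/1 tensors."""
        tp = ((pred == 1) & (truth == 1)).sum().float()
        fp = ((pred == 1) & (truth == 0)).sum().float()
        fn = ((pred == 0) & (truth == 1)).sum().float()
        return tp / (tp + fp), tp / (tp + fn)

    # Example use: threshold scores at 0.5, then score against coder labels.
    # scores = EyeContactNet()(face_crops); pred = (scores > 0.5).long()

Aggregating such per-frame predictions over the 18 validation subjects is what yields precision/recall figures of the kind quoted above; the published system must additionally locate each subject's face in every egocentric frame before classification.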

Suggested Citation

  • Eunji Chong & Elysha Clark-Whitney & Audrey Southerland & Elizabeth Stubbs & Chanel Miller & Eliana L. Ajodan & Melanie R. Silverman & Catherine Lord & Agata Rozga & Rebecca M. Jones & James M. Rehg, 2020. "Detection of eye contact with deep neural networks is as accurate as human experts," Nature Communications, Nature, vol. 11(1), pages 1-10, December.
  • Handle: RePEc:nat:natcom:v:11:y:2020:i:1:d:10.1038_s41467-020-19712-x
    DOI: 10.1038/s41467-020-19712-x

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-020-19712-x
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-020-19712-x?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Zijing Wu & Ce Zhang & Xiaowei Gu & Isla Duporge & Lacey F. Hughey & Jared A. Stabach & Andrew K. Skidmore & J. Grant C. Hopcraft & Stephen J. Lee & Peter M. Atkinson & Douglas J. McCauley & Richard L, 2023. "Deep learning enables satellite-based monitoring of large populations of terrestrial mammals across heterogeneous landscape," Nature Communications, Nature, vol. 14(1), pages 1-15, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:11:y:2020:i:1:d:10.1038_s41467-020-19712-x. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.