Printed from https://ideas.repec.org/a/nat/natcom/v11y2020i1d10.1038_s41467-020-18360-5.html

Accelerating eye movement research via accurate and affordable smartphone eye tracking

Authors

  • Nachiappan Valliappan

    (Google Research)

  • Na Dai

    (Google Research)

  • Ethan Steinberg

    (Google Research
    Stanford University)

  • Junfeng He

    (Google Research)

  • Kantwon Rogers

    (Google Research
    Georgia Institute of Technology)

  • Venky Ramachandran

    (Google Research)

  • Pingmei Xu

    (Google Research)

  • Mina Shojaeizadeh

    (Google Research)

  • Li Guo

    (Google Research
    Johns Hopkins University)

  • Kai Kohlhoff

    (Google Research)

  • Vidhya Navalpakkam

    (Google Research)

Abstract

Eye tracking has been widely used for decades in vision research, language and usability. However, most prior research has focused on large desktop displays using specialized eye trackers that are expensive and cannot scale. Little is known about eye movement behavior on phones, despite their pervasiveness and the large amount of time spent on them. We leverage machine learning to demonstrate accurate smartphone-based eye tracking without any additional hardware. We show that the accuracy of our method is comparable to that of state-of-the-art mobile eye trackers that are 100x more expensive. Using data from over 100 opted-in users, we replicate key findings from previous eye movement research on oculomotor tasks and saliency analyses during natural image viewing. In addition, we demonstrate the utility of smartphone-based gaze for detecting reading comprehension difficulty. Our results show the potential for scaling eye movement research by orders of magnitude to thousands of participants (with explicit consent), enabling advances in vision research, accessibility and healthcare.
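To give a concrete sense of the kind of model the abstract alludes to, the sketch below shows a minimal convolutional gaze regressor that maps a front-camera eye crop to an on-screen (x, y) gaze location. This is an illustrative assumption, not the authors' published architecture: the class name GazeRegressor, the 128x128 input size, and the random placeholder data are all hypothetical, chosen only to show the general regression setup (image in, 2-D gaze coordinate out, Euclidean-style loss).

# Illustrative sketch only; the paper's actual model and training pipeline are not reproduced here.
import torch
import torch.nn as nn

class GazeRegressor(nn.Module):
    """Tiny CNN that regresses a 2-D on-screen gaze point from an eye crop."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # collapse spatial dims to a 64-d descriptor
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 2),          # predicted (x, y) gaze location on the screen
        )

    def forward(self, eye_crop):
        return self.head(self.features(eye_crop))

# Toy training step on random placeholder data, just to show the regression setup.
model = GazeRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()                 # squared-error loss on gaze coordinates

eyes = torch.randn(8, 3, 128, 128)     # batch of hypothetical front-camera eye crops
gaze = torch.randn(8, 2)               # ground-truth gaze points from a calibration task

optimizer.zero_grad()
loss = loss_fn(model(eyes), gaze)
loss.backward()
optimizer.step()
print(f"toy training loss: {loss.item():.4f}")

In practice, such a regressor would be trained on calibration data (users fixating known on-screen targets) and evaluated in centimeters of error on the display, which is how accuracy comparisons against dedicated mobile eye trackers are typically reported.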

Suggested Citation

  • Nachiappan Valliappan & Na Dai & Ethan Steinberg & Junfeng He & Kantwon Rogers & Venky Ramachandran & Pingmei Xu & Mina Shojaeizadeh & Li Guo & Kai Kohlhoff & Vidhya Navalpakkam, 2020. "Accelerating eye movement research via accurate and affordable smartphone eye tracking," Nature Communications, Nature, vol. 11(1), pages 1-12, December.
  • Handle: RePEc:nat:natcom:v:11:y:2020:i:1:d:10.1038_s41467-020-18360-5
    DOI: 10.1038/s41467-020-18360-5

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-020-18360-5
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-020-18360-5?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Xiaozhi Yang & Ian Krajbich, 2021. "Webcam-based online eye-tracking for behavioral research," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 16(6), pages 1485-1505, November.
    2. Borozan, Miloš & Cannito, Loreta & Palumbo, Riccardo, 2022. "Eye-tracking for the study of financial decision-making: A systematic review of the literature," Journal of Behavioral and Experimental Finance, Elsevier, vol. 35(C).
    3. Fischbacher, Urs & Hausfeld, Jan & Renerte, Baiba, 2022. "Strategic incentives undermine gaze as a signal of prosocial motives," Games and Economic Behavior, Elsevier, vol. 136(C), pages 63-91.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:11:y:2020:i:1:d:10.1038_s41467-020-18360-5. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.nature.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.