Eye Tracking: How to Capture and Interpret User’s Point of View and Operator’s Gaze Strategies
Posted June 09, 2025
The Perception and Performance Technical Group (PPTG) is recognized as TG of the Month for June!
To stimulate knowledge exchange and discussion among peers, the Perception and Performance Technical Group hosted a webinar featuring Dr. Sampath Jayarathna of Old Dominion University and Ir. Rutger Stuut of the Ministry of Infrastructure and Water Management. The speakers discussed how eye tracking can be used across domains, with an emphasis on how to capture and interpret the user's point of view and the operator's gaze strategies.
As eye tracking technologies become more accessible and provide higher-quality data, they are increasingly being used to identify new nuances in human performance. These technologies record gaze patterns and dwell times, which give insight into how people allocate their attention. This has many uses: for example, identifying how experts take in information differently from novices, which products consumers are drawn to on a cluttered store shelf, or where drivers look for safety-relevant information. Eye tracking thus provides a useful window into how human observers process visual information in applied scenarios and tasks, using relevant ocular metrics. However, the proper use of eye tracking techniques requires an appreciation of the benefits and constraints of collecting and interpreting the oculophysiological data produced by modern eye tracking technologies. The webinar discussed the potential value of eye tracking research from two use-case perspectives. Ir. Stuut discussed how eye tracking technologies can be used in controlled simulated environments to better understand gaze behavior strategies and how they relate to operator task performance. Dr. Jayarathna discussed the use case of advancing AR and AI by leveraging egocentric data collection through smart glasses.
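To make the dwell-time idea concrete, the following is a minimal sketch of how time spent looking at each area of interest (AOI) might be estimated from timestamped gaze samples. The sample data, AOI names, and rectangular-AOI assumption are illustrative only and are not tied to any particular eye tracker or to the methods discussed in the webinar.

```python
# Minimal sketch: per-AOI dwell time from timestamped gaze samples.
# All data and AOI definitions below are hypothetical.

def dwell_times(samples, aois):
    """Sum the time gaze spends inside each rectangular area of interest.

    samples: list of (timestamp_s, x, y) gaze points, in time order.
    aois:    dict mapping AOI name -> (x_min, y_min, x_max, y_max).
    """
    totals = {name: 0.0 for name in aois}
    # Attribute the interval to the earlier sample's gaze position.
    for (t0, x, y), (t1, _, _) in zip(samples, samples[1:]):
        dt = t1 - t0
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dt
                break
    return totals

# Hypothetical gaze samples (seconds, screen x, screen y)
samples = [(0.000, 100, 120), (0.004, 102, 118), (0.008, 400, 300),
           (0.012, 405, 305), (0.016, 101, 119)]
aois = {"shelf_left": (0, 0, 200, 200), "shelf_right": (300, 250, 500, 350)}
print(dwell_times(samples, aois))
```

In practice, researchers typically work with fixations classified from raw samples (e.g. by dispersion- or velocity-based algorithms) rather than attributing every raw sample to an AOI as this sketch does.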
About the speakers:
Dr. Sampath Jayarathna is an Associate Professor of Computer Science at Old Dominion University. He introduced Meta’s Project Aria, an ambitious initiative aimed at advancing AR and AI by leveraging egocentric data collection through smart glasses. These glasses are designed to capture and analyze data from the wearer’s perspective, enabling researchers to develop context-aware AR experiences and refine AI algorithms for spatial awareness, object recognition, and real-world interactions.
Ir. Rutger Stuut is a senior human factors advisor at the executive agency of the Ministry of Infrastructure and Water Management. He discussed an ongoing project aimed at improving bridge operators’ error management by identifying optimal gaze strategies for monitoring the closed-circuit television streams used in remote operation.
Perception and Performance Technical Group (PPTG)
The Perception and Performance Technical Group (PPTG) is committed to continuously supporting knowledge exchange related to various aspects of perception and performance research, bringing together insights from a broad range of domains. Topics of interest to our members include (i) perception research technologies, such as eye tracking, (ii) perception support technologies, such as haptic feedback, (iii) human perception characteristics, such as visual impairment and auditory aspects of perception, (iv) types of information presentation that improve the understandability of information and/or improve task performance, and (v) perception and performance research related to remote control in domains such as transportation, process control, healthcare, and space mission control.
The PPTG aims to support its student members through initiatives such as best student paper awards, student research proposal grants, and financial assistance for students to attend the annual meeting. As part of its commitment to knowledge sharing, the group hosts webinars and aims to provide opportunities that foster a vibrant community of peers in the field.
More information about PPTG can be found on our website.
TG Chair: Dr. ir. Ellemieke van Doorn (LinkedIn)