EToS-1: Eye Tracking on Shopfloors for User Engagement with Automation
In
Workshop on Engaging with Automation co-located with CHI 2022
Date
April 27, 2022
Authors
Kenan Bektas, Jannis Strecker, Simon Mayer, and Markus Stolze
Abstract
Mixed Reality (MR) is becoming an integral part of many context-aware industrial applications. In maintenance and remote support operations, the individual steps of computer-supported (cooperative) work can be defined and presented to human operators through MR headsets. Tracking of eye movements can provide valuable insights into a user’s decision-making and interaction processes. Thus, our overarching goal is to better understand the visual inspection behavior of machine operators on shopfloors and to find ways to provide them with attention-aware and context-aware assistance through MR headsets that increasingly come with eye tracking (ET) as a default feature. Toward this goal, in two industrial scenarios, we used two mobile eye tracking devices and systematically compared the visual inspection behavior of novice and expert operators. In this paper, we present our preliminary findings and lessons learned.
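The abstract mentions a systematic comparison of novice and expert inspection behavior recorded with mobile eye trackers. The paper does not specify the analysis pipeline; the sketch below is only an illustrative assumption of how such a comparison might look, using a hypothetical fixation export (fixations.csv) with made-up column names (operator_group, aoi, fixation_duration_ms).

```python
import pandas as pd
from scipy import stats

# Hypothetical export of fixation events from a mobile eye tracker.
# Assumed columns: operator_group ('novice'/'expert'), aoi (inspected
# machine area of interest), fixation_duration_ms.
fixations = pd.read_csv("fixations.csv")

# Mean fixation duration and fixation count per group and area of interest.
summary = (
    fixations
    .groupby(["operator_group", "aoi"])["fixation_duration_ms"]
    .agg(["mean", "count"])
)
print(summary)

# Simple two-sample comparison of fixation durations between the groups
# (Welch's t-test, i.e. without assuming equal variances).
novice = fixations.loc[fixations["operator_group"] == "novice",
                       "fixation_duration_ms"]
expert = fixations.loc[fixations["operator_group"] == "expert",
                       "fixation_duration_ms"]
t_stat, p_value = stats.ttest_ind(novice, expert, equal_var=False)
print(f"Welch t-test: t={t_stat:.2f}, p={p_value:.3f}")
```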
Text Reference
Kenan Bektas, Jannis Strecker, Simon Mayer, and Markus Stolze. 2022. EToS-1: Eye Tracking on Shopfloors for User Engagement with Automation. In Proceedings of the Workshop on Engaging with Automation co-located with the ACM Conference on Human Factors in Computing Systems (CHI 2022), April 30, 2022, New Orleans, LA, USA. https://www.alexandria.unisg.ch/266339