GEAR: Gaze-enabled augmented reality for human activity recognition

In

2023 Symposium on Eye Tracking Research and Applications (ETRA ’23)

Type

Conference

Date

May 30, 2023

Authors

Kenan Bektaş, Jannis Strecker, Simon Mayer, Kimberly Garcia, Jonas Hermann, Kay Erik Jenss, Yasmine Sheila Antille, and Marc Elias Solèr

Abstract

Head-mounted Augmented Reality (AR) displays overlay digital information on physical objects. Through eye tracking, they allow novel interaction methods and provide insights into user attention, intentions, and activities. However, only a few studies have used gaze-enabled AR displays for human activity recognition (HAR). In an experimental study, we collected gaze data from 10 users on a HoloLens 2 (HL2) while they performed three activities (i.e., read, inspect, search). We trained machine learning models (SVM, Random Forest, Extremely Randomized Trees) with extracted features and achieved an activity-recognition accuracy of up to 98.7%. On the HL2, we provided users with AR feedback that is relevant to their current activity. We present the components of our system (GEAR), including a novel solution to enable the controlled sharing of collected data. We provide the scripts and anonymized datasets, which can be used as teaching material in graduate courses or for reproducing our findings.
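
For illustration, below is a minimal sketch of the classification step described above: training SVM, Random Forest, and Extremely Randomized Trees classifiers on extracted gaze features with scikit-learn. The file name, feature layout, and hyperparameters are hypothetical placeholders, not the authors' released code; the scripts published with the paper define the actual pipeline.

import pandas as pd
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical input: one row per gaze window, numeric gaze features
# (e.g., fixation durations, saccade amplitudes), and an "activity"
# column holding one of the three labels: read, inspect, search.
data = pd.read_csv("gaze_features.csv")  # hypothetical file name
X = data.drop(columns=["activity"])
y = data["activity"]

models = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "Random Forest": RandomForestClassifier(n_estimators=100),
    "Extremely Randomized Trees": ExtraTreesClassifier(n_estimators=100),
}

# Compare the three classifiers with 5-fold cross-validation.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")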

Text Reference

Kenan Bektaş, Jannis Strecker, Simon Mayer, Kimberly Garcia, Jonas Hermann, Kay Erik Jenss, Yasmine Sheila Antille, and Marc Elias Solèr. 2023. GEAR: Gaze-enabled augmented reality for human activity recognition. In 2023 Symposium on Eye Tracking Research and Applications (ETRA ’23), May 30–June 02, 2023, Tübingen, Germany. ACM, New York, NY, USA, 9 pages. https://doi.org/10.1145/3588015.3588402

BibTeX Reference

@inproceedings{10.1145/3588015.3588402,
author = {Bekta\c{s}, Kenan and Strecker, Jannis and Mayer, Simon and Garcia, Kimberly and Hermann, Jonas and Jen\ss{}, Kay Erik and Antille, Yasmine Sheila and Sol\`{e}r, Marc Elias},
title = {GEAR: Gaze-enabled augmented reality for human activity recognition},
year = {2023},
isbn = {9798400701504},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3588015.3588402},
doi = {10.1145/3588015.3588402},
abstract = {Head-mounted Augmented Reality (AR) displays overlay digital information on physical objects. Through eye tracking, they allow novel interaction methods and provide insights into user attention, intentions, and activities. However, only a few studies have used gaze-enabled AR displays for human activity recognition (HAR). In an experimental study, we collected gaze data from 10 users on a HoloLens 2 (HL2) while they performed three activities (i.e., read, inspect, search). We trained machine learning models (SVM, Random Forest, Extremely Randomized Trees) with extracted features and achieved an activity-recognition accuracy of up to 98.7\%. On the HL2, we provided users with AR feedback that is relevant to their current activity. We present the components of our system (GEAR), including a novel solution to enable the controlled sharing of collected data. We provide the scripts and anonymized datasets, which can be used as teaching material in graduate courses or for reproducing our findings.},
booktitle = {Proceedings of the 2023 Symposium on Eye Tracking Research and Applications},
articleno = {9},
numpages = {9},
keywords = {attention, augmented reality, context-awareness, human activity recognition, pervasive eye tracking},
location = {T\"{u}bingen, Germany},
series = {ETRA '23}
}