GlassBoARd: A Gaze-Enabled AR Interface for Collaborative Work

In

Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA ’24)

Conference

Date

May 11, 2024

Authors

Kenan Bektaş, Adrian Pandjaitan, Jannis Strecker, and Simon Mayer

Abstract

Recent research on remote collaboration focuses on improving the sense of co-presence and mutual understanding among collaborators, whereas there is limited research on using non-verbal cues such as gaze or head direction alongside the main communication channel. Our system, GlassBoARd, permits collaborators to see each other's gaze behavior and even make eye contact while communicating verbally and in writing. GlassBoARd features a transparent, shared Augmented Reality interface that is situated between two users, allowing face-to-face collaboration. From the perspective of each user, the remote collaborator is represented as an avatar that is located behind the GlassBoARd and whose eye movements are contingent on the remote collaborator's current eye movements. In three iterations, we improved the design of GlassBoARd and tested it with two use cases. Our preliminary evaluations showed that GlassBoARd provides a suitable environment for future user experiments on the effect of sharing eye gaze on communication bandwidth.

Text Reference

Kenan Bektaş, Adrian Pandjaitan, Jannis Strecker, and Simon Mayer. 2024. GlassBoARd: A Gaze-Enabled AR Interface for Collaborative Work. In Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA ’24), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 8 pages. https://doi.org/10.1145/3613905.3650965

BibTeX Reference

@inproceedings{10.1145/3613905.3650965,
author = {Bekta\c{s}, Kenan and Pandjaitan, Adrian and Strecker, Jannis and Mayer, Simon},
title = {GlassBoARd: A Gaze-Enabled AR Interface for Collaborative Work},
year = {2024},
isbn = {9798400703317},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3613905.3650965},
doi = {10.1145/3613905.3650965},
booktitle = {Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems},
articleno = {181},
numpages = {8},
keywords = {CSCW, augmented reality, eye tracking, gaze, non-verbal cues, presence, remote collaboration},
location = {Honolulu, HI, USA},
series = {CHI EA '24}
}