Small Cues, Big Differences: Evaluating Interaction and Presentation for Annotation Retrieval in AR
Zahra Borhani, Ali Ebrahimpour-Boroojeny, Francisco R. Ortega
Published: 2025/9/14
Abstract
Augmented Reality (AR) enables intuitive interaction with virtual annotations overlaid on the real world, supporting a wide range of applications such as remote assistance, education, and industrial training. However, as the number of heterogeneous annotations grows, retrieving them efficiently in 3D environments remains an open challenge. This paper examines how interaction modalities and presentation designs affect user performance, workload, fatigue, and preference in AR annotation retrieval. In two user studies, we compare eye-gaze versus hand-ray hovering and evaluate four presentation methods: Opacity-based, Scale-based, Nothing-based, and Marker-based. Results show that users favored eye-gaze over hand-ray, despite its significantly higher rate of unintentional activations. Among the presentation methods, Scale-based presentation reduced workload and task completion time while aligning with user preferences. Our findings offer empirical insight into the effectiveness of different annotation presentation methods and lead to design recommendations for building more efficient and user-friendly AR annotation review systems.