Investigating Encoding and Perspective for Augmented Reality

Jade Kandel, Sriya Kasumarthi, Spiros Tsalikis, Chelsea Duppen, Daniel Szafir, Michael Lewek, Henry Fuchs, Danielle Szafir

Published: 2025/10/1

Abstract

Augmented reality (AR) offers promising opportunities to support movement-based activities, such as personal training or physical therapy, with real-time, spatially-situated visual cues. While many approaches leverage AR to guide motion, existing design guidelines focus on simple, upper-body movements within the user's field of view. We lack evidence-based design recommendations for guiding more diverse scenarios involving movements with varying levels of visibility and direction. We conducted an experiment to investigate how different visual encodings and perspectives affect motion guidance performance and usability, using three exercises that varied in visibility and planes of motion. Our findings reveal significant differences in preference and performance across designs. Notably, the best perspective varied depending on motion visibility and showing more information about the overall motion did not necessarily improve motion execution. We provide empirically-grounded guidelines for designing immersive, interactive visualizations for motion guidance to support more effective AR systems.
