The Effect of Spatial Auditory and Visual Cues in Mixed Reality Remote Collaboration
01 December 2020
Collaborative mixed reality (MR) technologies enable physically remote people to work together by sharing the same communication cues used in face-to-face conversation, such as gaze and hand gestures. While visual cues have been investigated in several collaborative MR systems, spatial auditory cues remain underexplored. In this paper, we present an MR remote collaboration system that shares both spatial auditory and visual cues. Through a user study in an unmodified 90 m² office, we found that, compared to non-spatialized audio, the spatialized remote expert's voice and auditory beacons enabled local workers to locate occluded objects as small as 2 cm³ with significantly stronger spatial perception. We further integrated the spatial auditory cues with visual head and hand representations of the remote expert. These hybrid cues complemented each other and significantly improved the local worker's task performance, co-presence experience, and spatial awareness.