This thesis contributes to Human-Computer Interaction (HCI) research and technology development, with a focus on the design of locomotion interfaces for virtual reality (VR) applications. Handheld locomotion interfaces do not provide embodied (vestibular and proprioceptive) self-motion cues, and their absence is associated with disorientation and motion sickness. Previous research has therefore designed and investigated a wide range of embodied locomotion interfaces that provide partial embodied self-motion cues. However, prior work often evaluated embodied locomotion interfaces in only one specific task, in terms of a small subset of locomotion-relevant aspects, and showed their advantages (e.g., believability) alongside disadvantages (e.g., effectiveness) compared to handheld interfaces. We investigate whether providing embodied self-motion cues can reduce the adverse effects of handheld interfaces while improving, or at least matching, other user experience and performance measures across a wide range of locomotion scenarios. Through four user studies, we designed and iteratively refined a leaning-based interface called HeadJoystick. HeadJoystick users sit on a regular office swivel chair, physically rotating it to control virtual rotation and moving their head toward the target direction to control simulated translation. We conducted eight user studies to thoroughly evaluate HeadJoystick and its standing version (NaviBoard) against handheld interfaces in a wide range of 2D (ground-based) and 3D (flying) locomotion scenarios. Our results showed that providing embodied self-motion cues through carefully optimized embodied interfaces (HeadJoystick and NaviBoard) significantly improved all performance measures and reduced the adverse effects of handheld interfaces, namely unconvincing simulated motion, motion sickness, and disorientation, while also improving or at least matching all other locomotion-relevant measures.
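As a rough illustration of the leaning-based control scheme described above, the translation mapping could be sketched as follows. This is a minimal sketch under assumed parameters, not the thesis's actual implementation: the gain, deadzone value, and function name are hypothetical, and the thesis's exact transfer function (e.g., linear vs. exponential) may differ.

```python
import math

def head_joystick_velocity(head_pos, center, gain=2.0, deadzone=0.02):
    """Map the user's head offset from a calibrated seated center
    to a simulated translation velocity (leaning-based control).

    head_pos, center: (x, y, z) positions in metres.
    gain and deadzone are hypothetical example values.
    Offsets smaller than `deadzone` are ignored so that small
    postural sway does not move the virtual viewpoint.
    Returns an (x, y, z) velocity vector.
    """
    offset = tuple(h - c for h, c in zip(head_pos, center))
    magnitude = math.sqrt(sum(o * o for o in offset))
    if magnitude < deadzone:
        return (0.0, 0.0, 0.0)
    # Velocity grows linearly with lean distance beyond the deadzone,
    # in the same direction as the head offset.
    scale = gain * (magnitude - deadzone) / magnitude
    return tuple(o * scale for o in offset)
```

In such a scheme, physical chair rotation would be read directly from the headset's yaw, so only translation needs an explicit mapping like the one above.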
In addition, our results showed that these benefits were more pronounced with 360° physical (instead of partial or virtual) rotation; repeated (instead of short-term) interface usage; multitasking (instead of locomotion-only) scenarios; a standing/stepping (instead of sitting) body posture; and increased locomotion difficulty (speed). From a theoretical perspective, our findings help researchers by extending our knowledge of the effects of providing vestibular and proprioceptive self-motion cues on VR locomotion. From an applied perspective, we also suggest design guidelines for user interface designers to improve the user experience, usability, and performance of VR locomotion interfaces.
Copyright is held by the author(s).
This thesis may be printed or downloaded for non-commercial research and scholarly purposes.
Thesis advisor: Riecke, Bernhard