Hand Out-Of-View Tracking for Proprioceptive Interaction Using Inertial Sensing

ACM CHI 2023
HOOV teaser image

HOOV is a wireless sensing method that complements existing virtual and augmented reality headsets to support hand tracking outside the field of view of the headset’s cameras. In this example application of our system, a participant is building a structure composed of blocks of different sizes. When the user places an object onto the structure, their hand is tracked by the headset’s cameras (left). As the user reaches for a block just to their right (right), their hand leaves the field of view of the cameras. From that moment, HOOV continuously estimates the 6D pose (3D position and orientation) of the wrist using the 6-axis inertial measurement unit attached to the wrist for as long as the hand remains outside the field of view.


Current Virtual Reality systems are designed for interaction under visual control. Using built-in cameras, headsets track the user’s hands or hand-held controllers while they are inside the field of view. Current systems thus ignore the user’s interaction with off-screen content—virtual objects that the user could quickly access through proprioception without requiring laborious head motions to bring them into focus. In this paper, we present HOOV, a wrist-worn sensing method that allows VR users to interact with objects outside their field of view. Based on the signals of a single wrist-worn inertial sensor, HOOV continuously estimates the user’s hand position in 3D space to complement the headset’s tracking as the hands leave the tracking range. Our novel data-driven method predicts hand positions and trajectories from just the continuous estimation of hand orientation, which by itself is stable based solely on inertial observations. Our inertial sensing simultaneously detects finger pinching to register off-screen selection events, confirms them using a haptic actuator inside our wrist device, and thus allows users to select, grab, and drop virtual content. We compared HOOV’s performance with that of a camera-based optical motion capture system in two evaluations. In the first, participants interacted based on tracking information from the motion capture system to assess the accuracy of their proprioceptive input, whereas in the second, they interacted based on HOOV’s real-time estimations. We found that HOOV’s target-agnostic estimations had a mean tracking error of 7.7 cm, which allowed participants to reliably access virtual objects around their body without first bringing them into focus. We demonstrate several applications that leverage the larger input space HOOV opens up for quick proprioceptive interaction, and conclude by discussing the potential of our technique.
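To illustrate why hand orientation is stable from inertial observations alone while position is not, the sketch below shows classical first-order quaternion integration of gyroscope readings. This is not HOOV’s learned model (which predicts positions from the orientation estimates in a data-driven way); it is a minimal baseline, with all function names and parameters chosen here for illustration, that demonstrates the orientation-tracking step a 6-axis IMU supports directly:

```python
import numpy as np

def quat_multiply(q, r):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate_gyro(q, omega, dt):
    """First-order update of orientation quaternion q from the
    gyroscope's angular velocity omega (rad/s) over timestep dt (s)."""
    dq = 0.5 * dt * quat_multiply(q, np.array([0.0, *omega]))
    q = q + dq
    return q / np.linalg.norm(q)  # renormalize to counter drift of |q|

# Example: rotating at pi/2 rad/s about the z-axis for 1 s
# (1000 steps at 1 kHz, a typical IMU sampling rate) should
# accumulate a rotation of ~90 degrees.
q = np.array([1.0, 0.0, 0.0, 0.0])  # identity orientation
for _ in range(1000):
    q = integrate_gyro(q, np.array([0.0, 0.0, np.pi / 2]), 0.001)
angle = 2.0 * np.arccos(np.clip(q[0], -1.0, 1.0))  # rotation angle in rad
```

Orientation obtained this way drifts only slowly (and can be further stabilized with the accelerometer's gravity reference), whereas recovering position would require double-integrating acceleration, which accumulates error quadratically—motivating HOOV's data-driven position prediction from orientation instead.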



Paul Streli, Rayan Armani, Yi Fei Cheng, and Christian Holz. HOOV: Hand Out-Of-View Tracking for Proprioceptive Interaction Using Inertial Sensing. In Proceedings of the 2023 ACM CHI Conference on Human Factors in Computing Systems (CHI 2023).