NormalTouch and TextureTouch

High-fidelity 3D Haptic Shape Rendering on Handheld Virtual Reality Controllers

ACM UIST 2016, Honorable Mention Award
Hrvoje Benko, Christian Holz, Mike Sinclair, and Eyal Ofek
NormalTouch and TextureTouch teaser image

Abstract

We present an investigation of mechanically actuated handheld controllers that render the shape of virtual objects through physical shape displacement, enabling users to feel 3D surfaces, textures, and forces that match the visual rendering. We demonstrate two such controllers, NormalTouch and TextureTouch. Both controllers are tracked with 6 DOF and produce spatially registered haptic feedback at the user's finger. NormalTouch haptically renders object surfaces and provides force feedback using a tiltable and extrudable platform. TextureTouch renders the shape of virtual objects, including detailed surface structure, through a 4×4 matrix of actuated pins. By moving the controllers around in space while keeping a finger on the actuated platform, users obtain the impression of a much larger 3D shape by cognitively integrating output sensations over time. Our evaluation compares the effectiveness of our controllers with the two de facto standards for Virtual Reality controllers: device vibration and visual feedback only. We find that haptic feedback significantly increases the accuracy of VR interaction, most effectively when rendering high-fidelity shape output as our controllers do. Participants also generally found NormalTouch and TextureTouch realistic in conveying the sense of touch for a variety of 3D objects.
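To make the rendering idea concrete, the sketch below illustrates one plausible way to drive such devices: tilt NormalTouch's platform to the local surface normal and extrude it by the penetration depth, and set each of TextureTouch's 4×4 pins from the surface sampled under the fingertip. This is not code from the paper; it assumes the application exposes the virtual scene as a signed distance field, and all function names, the stand-in sphere geometry, and the actuator parameters (pin pitch, travel, maximum extrusion) are hypothetical.

```python
import numpy as np

# Hypothetical scene query: signed distance from a point to the nearest virtual
# surface (negative inside). A unit sphere at the origin stands in for whatever
# geometry the VR application actually provides.
def signed_distance(p):
    return np.linalg.norm(p) - 1.0

def surface_normal(p, eps=1e-4):
    # Numerical gradient of the distance field approximates the surface normal.
    grad = np.array([
        signed_distance(p + [eps, 0, 0]) - signed_distance(p - [eps, 0, 0]),
        signed_distance(p + [0, eps, 0]) - signed_distance(p - [0, eps, 0]),
        signed_distance(p + [0, 0, eps]) - signed_distance(p - [0, 0, eps]),
    ])
    return grad / np.linalg.norm(grad)

def normaltouch_targets(finger_pos, max_extrusion=0.02):
    """Tilt the platform toward the local surface normal and extrude it by the
    penetration depth, clamped to the actuator's (assumed) travel."""
    d = signed_distance(finger_pos)
    normal = surface_normal(finger_pos)
    extrusion = float(np.clip(-d, 0.0, max_extrusion))  # push back only when inside
    return normal, extrusion

def texturetouch_targets(finger_pos, finger_frame, pin_pitch=0.0025, travel=0.005):
    """Sample the distance field at each of the 4x4 pin positions under the
    fingertip pad and convert penetration depth into per-pin heights."""
    right, forward, up = finger_frame  # orthonormal axes of the fingertip pad
    heights = np.zeros((4, 4))
    for i in range(4):
        for j in range(4):
            offset = (i - 1.5) * pin_pitch * right + (j - 1.5) * pin_pitch * forward
            d = signed_distance(finger_pos + offset)
            heights[i, j] = np.clip(-d, 0.0, travel)
    return heights

# Example: fingertip slightly inside the sphere surface.
pos = np.array([0.0, 0.0, 0.995])
frame = (np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0]))
normal, extrusion = normaltouch_targets(pos)
pins = texturetouch_targets(pos, frame)
```

In an actual controller the targets would be recomputed every frame from the 6-DOF tracked pose and sent to the servo or pin actuators; the mapping from penetration depth to displacement shown here is only one simple choice.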

Video

Reference

Hrvoje Benko, Christian Holz, Mike Sinclair, and Eyal Ofek. NormalTouch and TextureTouch: High-fidelity 3D Haptic Shape Rendering on Handheld Virtual Reality Controllers. In Proceedings of ACM UIST 2016.