EgoPressure

A Dataset for Hand Pressure and Pose Estimation in Egocentric Vision

CVPR 2025
Yiming Zhao*, Taein Kwon*, Paul Streli*, Marc Pollefeys, and Christian Holz
Department of Computer Science, ETH Zürich
*Equal contribution
EgoPressure teaser image

EgoPressure dataset. We introduce a novel egocentric pressure dataset with hand poses. We label hand poses using our proposed optimization method across all static camera views (Cameras 1–7). The annotated hand mesh aligns well with the egocentric camera's view, indicating the high fidelity of our annotations. We project the pressure intensity and annotated hand mesh (Fig. i) to all camera views (Fig. a–h), and further provide the pressure applied over the hand as a UV texture map (Fig. j and k).
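The caption above describes projecting the annotated 3D hand mesh into each of the eight camera views. As an illustrative sketch (not the dataset's actual API), the core operation is a standard pinhole projection given each camera's intrinsics `K` and world-to-camera extrinsics `R`, `t`; all names here are hypothetical:

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project Nx3 world-space points (e.g. hand-mesh vertices)
    into pixel coordinates of a pinhole camera."""
    cam = points_3d @ R.T + t          # world frame -> camera frame
    uvw = cam @ K.T                    # apply intrinsic matrix
    return uvw[:, :2] / uvw[:, 2:3]    # perspective divide -> (u, v)

# Sanity check with an identity pose: a point on the optical axis
# must land on the principal point (cx, cy).
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.zeros(3)
pts = np.array([[0.0, 0.0, 2.0]])
print(project_points(pts, K, R, t))   # -> [[320. 240.]]
```

Repeating this per camera with its calibrated `K`, `R`, `t` yields the overlays shown in Fig. a–h.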

Abstract

Touch contact and pressure are essential for understanding how humans interact with and manipulate objects, insights that can significantly benefit applications in mixed reality and robotics. However, estimating these interactions from an egocentric camera perspective is challenging, largely due to the lack of comprehensive datasets that provide both accurate hand poses on contacting surfaces and detailed annotations of pressure information. In this paper, we introduce EgoPressure, a novel egocentric dataset that captures detailed touch contact and pressure interactions. EgoPressure provides high-resolution pressure intensity annotations for each contact point and includes accurate hand pose meshes obtained through our proposed multi-view, sequence-based optimization method, which processes data from an 8-camera capture rig. Our dataset comprises 5 hours of interactions from 21 participants, captured simultaneously by one head-mounted and seven stationary Kinect cameras that acquire RGB images and depth maps at 30 Hz. To support future research and benchmarking, we present several baseline models for estimating applied pressure on external surfaces from RGB images, with and without hand pose information. We further explore the joint estimation of the hand mesh and applied pressure. Our experiments demonstrate that pressure and hand pose are complementary for understanding hand-object interactions.

Reference

Yiming Zhao*, Taein Kwon*, Paul Streli*, Marc Pollefeys, and Christian Holz. EgoPressure: A Dataset for Hand Pressure and Pose Estimation in Egocentric Vision. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2025.