SIPLAB: Sensing, Interaction & Perception Lab

At SIPLAB, ETH Zürich's Sensing, Interaction & Perception Lab, we are a cross-disciplinary research group working on computational interaction, physical computing, and mobile health. Our work spans human-computer interaction, virtual reality, wearable devices, and applied machine learning. We are part of the Department of Computer Science (D-INFK) and affiliated with the Department of Information Technology and Electrical Engineering (D-ITET).
  • Situated virtual reality: VR in the real world
  • Sensing: embodied input for interaction with devices
  • Predictive mobile health: signal processing for physiological data

Upcoming and recent research

  • ACM UIST 2021. TouchPose: Hand Pose Prediction, Depth Estimation, and Touch Classification from Capacitive Images.
  • ACM UIST 2021. SoundsRide: Affordance-Synchronized Music Mixing for In-Car Audio Augmented Reality.
  • ACM UIST 2021. AirConstellations: In-Air Device Formations for Cross-Device Interaction via Multiple Spatially-Aware Armatures.
  • ACM IMWUT 2021. Smartphone-Based Tapping Frequency as a Surrogate for Perceived Fatigue: An In-the-Wild Feasibility Study in Multiple Sclerosis Patients.
  • ACM IMWUT 2021. CoolMoves: User Motion Accentuation in Virtual Reality.
  • ACM CHI 2021. CapContact: Super-resolution Contact Areas from Capacitive Touchscreens. (Best Paper Award)

In our projects, we collaborate with faculty and students at the Department of Health Sciences and Technology (D-HEST), the Department of Mechanical and Process Engineering (D-MAVT), and the Faculty of Medicine at the University of Zurich. We are also part of ETH's Competence Centre for Rehabilitation Engineering and Science.