SIPLAB: Sensing, Interaction & Perception Lab

In the SIPLAB at ETH Zürich, we're a cross-disciplinary research group working on computational interaction, physical computing, and mobile health. Our work straddles human-computer interaction, virtual reality, wearable devices, and applied machine learning. We are part of the Department of Computer Science (D-INFK) and affiliated with the Department of Information Technology and Electrical Engineering (D-ITET).
Our research spans three themes:

  • Sensory augmentation for perceptive systems and human perception
  • Embodied input sensing for natural device interaction
  • Predictive mobile-health signal processing for physiological data

Recent and upcoming research

  • ACM UIST 2021. TouchPose: Hand Pose Prediction, Depth Estimation, and Touch Classification from Capacitive Images.
  • ACM UIST 2021. SoundsRide: Affordance-Synchronized Music Mixing for In-Car Audio Augmented Reality.
  • ACM UIST 2021. AirConstellations: In-Air Device Formations for Cross-Device Interaction via Multiple Spatially-Aware Armatures.
  • IEEE ISMAR 2021. TransforMR: Pose-Aware Object Substitution for Composing Alternate Mixed Realities.
  • IEEE ISMAR 2021. Gaze Comes in Handy: Predicting and Preventing Erroneous Hand Actions in AR-Supported Manual Tasks.
  • ACM IMWUT 2021. Smartphone-Based Tapping Frequency as a Surrogate for Perceived Fatigue: An In-the-Wild Feasibility Study in Multiple Sclerosis Patients.

Affiliations and collaborations

In our projects, we also collaborate with faculty and students at the Department of Health Sciences and Technology (D-HEST), the Department of Mechanical and Process Engineering (D-MAVT), and the Faculty of Medicine at the University of Zurich. We are part of ETH's Competence Centre for Rehabilitation Engineering and Science and affiliated with the ETH AI Center as well as the Max Planck ETH Center for Learning Systems.