🏆 Mo2Hap: Rendering VR Performance Motion Flow to Upper-body Vibrotactile Haptic Feedback
Kyungeun Jung and Sang Ho Yoon
Symposium on User Interface Software and Technology (UIST Demo), Apr 2023
We introduce a unique haptic rendering framework that transforms a performer’s actions into wearable vibrotactile feedback for an immersive virtual reality (VR) performance experience. To capture essential movements from the virtual performer, we propose a method called Motion Salient Triangle: a real-time 3D polygon from which haptic characteristics (intensity, location) are computed based on skeletal motion data. We employ a full upper-body haptic system that provides vibrotactile feedback on the torso, back, and shoulders. This haptic rendering pipeline enables audiences to experience immersive VR performances by conveying the performer’s motions through motion-to-haptic feedback.
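The idea of deriving vibrotactile intensity and location from a triangle over skeletal joints can be sketched as follows. This is a hypothetical illustration, not the paper's actual algorithm: the choice of joints (head and both wrists), the area-based intensity, and the centroid-based location are all assumptions made for the example.

```python
import math

def triangle_area(a, b, c):
    # Area of the 3D triangle (a, b, c) via the cross product of two edges.
    ab = [b[i] - a[i] for i in range(3)]
    ac = [c[i] - a[i] for i in range(3)]
    cx = ab[1] * ac[2] - ab[2] * ac[1]
    cy = ab[2] * ac[0] - ab[0] * ac[2]
    cz = ab[0] * ac[1] - ab[1] * ac[0]
    return 0.5 * math.sqrt(cx * cx + cy * cy + cz * cz)

def motion_salient_triangle(head, left_wrist, right_wrist, max_area=1.0):
    """Hypothetical mapping from three upper-body joint positions to
    haptic parameters: intensity from normalized triangle area,
    location from the triangle centroid (joint choices are assumed)."""
    area = triangle_area(head, left_wrist, right_wrist)
    intensity = min(area / max_area, 1.0)  # clamp to [0, 1]
    location = tuple(
        (head[i] + left_wrist[i] + right_wrist[i]) / 3.0 for i in range(3)
    )
    return intensity, location

# Example: wide arm spread yields a larger triangle, hence stronger feedback.
intensity, location = motion_salient_triangle(
    head=(0.0, 1.7, 0.0),
    left_wrist=(-0.8, 1.2, 0.2),
    right_wrist=(0.8, 1.2, 0.2),
)
```

In a real pipeline, the per-frame intensity and location would then be routed to the nearest actuators on the torso, back, and shoulders.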