25/04/2020

FaceHaptics: Robot Arm based Versatile Facial Haptics for Immersive Environments

Alexander Wilberz, Dominik Leschtschow, Christina Trepkowski, Jens Maiero, Ernst Kruijff, Bernhard Riecke

Keywords: haptics, robot arm, immersive environments, virtual reality, user study, perception, presence, emotion

Abstract: This paper introduces FaceHaptics, a novel haptic display based on a robot arm attached to a head-mounted virtual reality display. It provides localized, multi-directional and movable haptic cues in the form of wind, warmth, moving and single-point touch events, and water spray to dedicated parts of the face not covered by the head-mounted display. The easily extensible system can, in principle, mount any type of compact haptic actuator or object. User study 1 showed that users appreciate the directional resolution of cues and can judge wind direction well, especially when they move their head and wind direction is adjusted dynamically to compensate for head rotations. Study 2 showed that adding FaceHaptics cues to a VR walkthrough can significantly improve user experience, presence, and emotional responses.
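As an illustration of the head-rotation compensation mentioned in the abstract, the following is a minimal Python sketch (not the authors' implementation; the function and parameter names are assumptions) of how a world-fixed wind direction could be converted into a nozzle angle expressed in the head-mounted arm's frame:

def wind_nozzle_yaw(world_wind_deg, head_yaw_deg):
    """Yaw for the wind nozzle in the head-mounted arm's frame so the cue
    stays anchored to a fixed world direction as the user turns their head.
    Illustrative only; angles are in degrees and names are hypothetical."""
    # Subtract the head rotation, then wrap to [-180, 180) so the arm
    # takes the shortest path to the target angle.
    return (world_wind_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

# Example: wind fixed at 90 degrees to the user's left at the start;
# after the user turns 30 degrees toward it, the nozzle only needs 60 degrees.
print(wind_nozzle_yaw(90.0, 30.0))  # -> 60.0

Mapping such a target angle onto actual robot arm joint commands is a separate step and is not covered by this sketch.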

The video of this talk cannot be embedded. You can watch it here:
https://www.youtube.com/watch?v=as5ZnamLy74
The talk and the respective paper are published at the CHI 2020 virtual conference.

