Haptic Feedback in Room-Scale VR

‘Haptic Feedback in Room-Scale VR’ is the diploma thesis of Philipp Erler. It contains three programs for the HTC Vive that (1) analyze grabbing and throwing with controllers, (2) examine the influence of haptic and optical feedback on grabbing, and (3) clear point clouds in VR.

Simple Basketball
Grabbing Test
Point-Cloud Editing


VR Programs Source (F#, MS VS 2015)
Survey Analyzer Source (F#, MS VS 2015)
VRVis on Github


Virtual reality (VR) is becoming a mainstream medium. Current systems like the HTC Vive offer accurate tracking of the HMD and controllers, which allows for highly immersive interactions with the virtual environment. These interactions can be further enhanced by adding feedback. For example, a controller can vibrate when it is close to a grabbable ball.
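The proximity-based vibration mentioned above can be sketched as a simple amplitude ramp. This is an illustrative Python sketch, not the thesis's F# implementation; the function name, the 20 cm activation distance, and the linear ramp are assumptions.

```python
import math

def haptic_strength(controller_pos, ball_pos, ball_radius, max_dist=0.2):
    """Vibration amplitude in [0, 1]: zero beyond max_dist from the ball's
    surface, ramping linearly up to full strength at contact."""
    gap = math.dist(controller_pos, ball_pos) - ball_radius
    if gap >= max_dist:
        return 0.0
    return 1.0 - max(gap, 0.0) / max_dist
```

Each frame, the returned value would be fed to the controller's vibration motor, so the buzz grows stronger as the hand approaches the ball.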

As such interactions are not yet exhaustively researched, we conducted a user study. Specifically, we examine:

  1. grabbing and throwing with controllers in a simple basketball game.
  2. the influence of haptic and optical feedback on performance, presence, task load, and usability.
  3. the advantages of VR over desktop for point-cloud editing.

Several new techniques emerged from the VR point-cloud editor. The bi-manual pinch gesture, which extends the handlebar metaphor, is a novel viewing method used to translate, rotate, and scale the point cloud. Our new rendering technique uses the geometry shader to draw sparse point clouds quickly. Selection volumes attached to the controllers are our new technique for efficiently selecting points in point clouds; the resulting selection is visualized in real time.
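The core of the bi-manual pinch gesture is that the scene follows the two hands: moving both hands translates it, and changing their separation scales it (rotation about the inter-hand axis works analogously). A minimal Python sketch of the translate-and-scale part, with hypothetical names, given the left/right controller positions in the previous and current frame:

```python
import math

def pinch_transform(l0, r0, l1, r1):
    """Scale and translation so that a scene point p maps to s*p + t,
    keeping the grabbed midpoint pinned between the hands.
    l0/r0: left/right hand positions last frame; l1/r1: this frame."""
    def mid(a, b):
        return tuple((x + y) / 2 for x, y in zip(a, b))

    d0 = math.dist(l0, r0)          # hand separation last frame
    d1 = math.dist(l1, r1)          # hand separation this frame
    s = d1 / d0 if d0 > 1e-9 else 1.0

    m0, m1 = mid(l0, r0), mid(l1, r1)
    # Solve s*m0 + t = m1 so the midpoint stays under the hands.
    t = tuple(b - s * a for a, b in zip(m0, m1))
    return s, t
```

Pulling the hands apart (larger separation) zooms in around the point between them, which is what makes the gesture feel like stretching the point cloud itself.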

The results of the user study show that:

  1. grabbing with a controller button is intuitive, but throwing is not. Releasing a button is a poor metaphor for letting go of a grabbed virtual object in order to throw it.
  2. any feedback is better than none. Adding haptic feedback, optical feedback, or both to grabbing improves user performance and presence. However, only sub-scores such as accuracy and predictability improve significantly; usability and task load are mostly unaffected by feedback.
  3. point-cloud editing is significantly better in VR, using the bi-manual pinch gesture and selection volumes, than on the desktop with an orbiting camera and lasso selection.


Textures, converted to compressed formats

Hoop model and textures
Point-cloud dataset by VRVis

Sounds, exported as wav with Microsoft encoding
