HTC, makers of the Vive VR system, released Vive Trackers to developers earlier this year: small trackers that can be attached to any physical object, allowing it to be tracked in 3D space as accurately as the Vive controllers and headset. The system knows the exact location and orientation of these trackers, so any physical object of known size to which a tracker is attached can now be duplicated/augmented in VR by creating a corresponding 3D model. This lets developers create their own forms and interactions for custom game hardware: a number of custom guns are expected, as well as a pair of gloves with sensors to detect finger movement. Things get really interesting when gloves are combined with other tracked objects to create a fully tactile experience. Here is a nice example of how multiple trackers can manipulate a rigged 3D model:
While the Lab awaits the arrival of the backordered trackers, we decided to do some quick-and-dirty experiments using the Vive controllers as oversized position trackers, attaching them to our arms to determine hand position, and to other objects.
We developed this initial round of tests in WebVR, using the A-Frame framework. The three.js and A-Frame sites have all of the information on building WebVR experiences in the browser, and here is a pretty comprehensive guide on installing and configuring the experimental browsers that can work with the Vive's custom hardware.
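For reference, a bare-bones A-Frame scene for this kind of test looks something like the sketch below. This is an illustration, not our actual project code: the A-Frame version and the box stand-in for a custom model are placeholders, and the `vive-controls` component's `model: false` option is how A-Frame lets you hide the default controller model so your own geometry can ride along with the tracked pose.

```html
<html>
  <head>
    <!-- Version is illustrative; use whatever A-Frame release you target. -->
    <script src="https://aframe.io/releases/0.5.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- Left controller, rendered with the stock controller model. -->
      <a-entity vive-controls="hand: left"></a-entity>
      <!-- Right controller, stock model hidden, custom geometry attached.
           A real test would swap the box for a measured 3D model. -->
      <a-entity vive-controls="hand: right; model: false">
        <a-box depth="0.1" height="0.1" width="0.1" color="#888"></a-box>
      </a-entity>
    </a-scene>
  </body>
</html>
```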
First Object Test
Our first test was to track a user's hand and a stool. This let us see a representation of the hand and a VR duplicate of the real stool, which we measured and modeled out. Our calibration was a little sloppy, and there are certainly limitations with the wrist-tracking setup (including not being able to move the fingers), but the basic concept was there. Any imperfection is greatly magnified the farther a point on the object sits from the tracker.
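That last point is just geometry: a small orientation error at the tracker sweeps far points on the object through an arc, so the displacement grows roughly linearly with distance. A quick sketch of the math (the numbers are illustrative, not measurements from our setup):

```javascript
// Displacement of a point d meters from the tracker when the tracked
// orientation is off by errorDeg degrees: the chord length
// 2 * d * sin(theta / 2), which is ~ d * theta for small angles.
function leverArmError(distanceM, errorDeg) {
  const theta = (errorDeg * Math.PI) / 180;
  return 2 * distanceM * Math.sin(theta / 2);
}

// A 2-degree calibration error is invisible at the attachment point...
console.log(leverArmError(0.0, 2).toFixed(3)); // "0.000"
// ...but shifts a point 1 m away on the object by ~3.5 cm.
console.log(leverArmError(1.0, 2).toFixed(3)); // "0.035"
```

This is why the stool felt right near the controller and progressively less right toward its far edges.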
And we see the general limitation of interfacing two tracked objects in VR space: when a real object (a controller) collides with a purely virtual one, there is a lot of forgiveness. Two real objects, however, reveal the limits of the real-to-virtual mapping. As we move the stool around the room, we can see the relationship between what we feel, the visible hand, and the visible stool drift out of alignment. This is something that can be improved with tighter calibration, as well as by using multiple trackers on larger objects.
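"Tighter calibration" here mostly means measuring the rigid offset from the controller to the object's origin once, then applying the controller's live pose to that offset every frame. A sketch of that transform, with hand-rolled quaternion math so it is self-contained (in a real A-Frame app you would use three.js's Vector3/Quaternion instead; the numbers are made up):

```javascript
// Rotate vector v = {x, y, z} by unit quaternion q = {x, y, z, w},
// using v' = v + 2w(q_xyz × v) + 2 q_xyz × (q_xyz × v).
function rotateByQuaternion(q, v) {
  const tx = 2 * (q.y * v.z - q.z * v.y);
  const ty = 2 * (q.z * v.x - q.x * v.z);
  const tz = 2 * (q.x * v.y - q.y * v.x);
  return {
    x: v.x + q.w * tx + (q.y * tz - q.z * ty),
    y: v.y + q.w * ty + (q.z * tx - q.x * tz),
    z: v.z + q.w * tz + (q.x * ty - q.y * tx),
  };
}

// World position of the object's origin, given the controller's pose and
// the calibrated offset (object origin in the controller's local frame).
function objectWorldPosition(controllerPos, controllerQuat, localOffset) {
  const o = rotateByQuaternion(controllerQuat, localOffset);
  return {
    x: controllerPos.x + o.x,
    y: controllerPos.y + o.y,
    z: controllerPos.z + o.z,
  };
}

// Hypothetical: controller strapped 0.3 m above the stool's origin,
// controller rotated 90 degrees about the vertical (Y) axis.
const quat90Y = { x: 0, y: Math.SQRT1_2, z: 0, w: Math.SQRT1_2 };
const pos = objectWorldPosition(
  { x: 1, y: 1, z: 0 },
  quat90Y,
  { x: 0, y: -0.3, z: 0 }
);
console.log(pos); // ≈ { x: 1, y: 0.7, z: 0 } — a vertical offset is unchanged by yaw
```

Any error in that measured `localOffset` is exactly the misalignment we felt between the physical stool and its virtual twin.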
Out of Body Experience
Our second experiment is a simple out-of-body exercise, inspired by the classic "rubber hand experiment," in which the brain can be tricked into feeling pain when a rubber hand is hit with a hammer. We attached a Vive controller to the chair, allowing us to move the chair and the virtual body around the room. A more developed version will have an animated avatar that mimics the user's movements.
Virtual Bar
Our third experiment is a simple bar experience: a real desk and stool are modeled and calibrated to match. The controllers are attached to a beer and to a user's wrist. Enjoyment follows.