I recently integrated a mixed reality display into another interesting VR application. The puppeteer controls a virtual puppet with the Oculus Rift and can see and hear the audience through a web camera feed that is visible only inside VR, letting him communicate with them while the audience watches a second display that does not show the camera image. The video above demonstrates this puppetry with an audience using a real-time mixed reality display in place of the typical second display showing only the puppet.
Networked Collaborative VR Application Mixed Reality
So I’m back with an update on displaying mixed/augmented reality, this time specifically for collaborative networked VR applications. Compared with other forms of VR, collaborative VR has been gaining popularity recently as a genuinely successful enhancement over other kinds of collaborative software. Many companies have been pushing to create networked virtual offices, meeting rooms, and social spaces in VR, and networked VR games that emphasize collaborative gameplay have found success as well.
So I decided to bring the two worlds together and show more than one VR setup and user in the same mixed reality display. Doing a multi-user application in mixed reality adds extra complexity: many collaborative networked VR applications let players pass through each other in the virtual environment, because the users don’t actually share the same physical space. That doesn’t work when two or more people share the same real-world space captured by the single camera used to demonstrate the VR collaboration in mixed reality; the mixed reality display has to show both users interacting one-to-one in the same space, physically and virtually. This can be solved by having all the networked computers and VR setups agree on the same calibrated space, which lets multiple users interact in the same real-world and virtual space at the same time without colliding with each other. Right now this is easiest to achieve with the HTC Vive, because the same Lighthouse base stations can be shared by every Vive headset and the same room calibration can be shared between multiple machines.
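To make the idea of a shared calibrated space a bit more concrete, here is a minimal sketch in Python/NumPy. It is purely illustrative and not taken from the actual project: the `tracking_to_room` matrix stands in for whatever calibration each machine has agreed on (for example, a shared Vive room setup), and each client re-expresses its locally tracked poses in that common space before sending them to its peers.

```python
import numpy as np

# Hypothetical calibration: maps this machine's raw tracking space into the
# room space that every networked client has agreed on (e.g. a shared
# Vive room-setup / Lighthouse calibration). Identity here as a placeholder.
tracking_to_room = np.eye(4)

def pose_to_matrix(position, rotation_3x3):
    """Build a 4x4 rigid transform from a position and a 3x3 rotation."""
    m = np.eye(4)
    m[:3, :3] = rotation_3x3
    m[:3, 3] = position
    return m

def to_shared_space(local_pose_4x4):
    """Re-express a locally tracked pose in the shared room space."""
    return tracking_to_room @ local_pose_4x4

# Example: a headset pose reported by the local tracking system.
local_hmd_pose = pose_to_matrix(np.array([0.2, 1.6, -0.5]), np.eye(3))
shared_hmd_pose = to_shared_space(local_hmd_pose)

# shared_hmd_pose is what would go over the network; since every client
# applies its own tracking_to_room, all avatars land in one consistent space.
print(shared_hmd_pose[:3, 3])
```

Because every machine maps into the same room space, avatars line up with the real people standing in front of the mixed reality camera, and users can avoid walking through one another.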
Since I originally started doing mixed reality display of VR, I have been pushing the limits of current depth sensing fidelity to see how accurately I can mix real-world camera images and 3D graphics together, and I have been looking at how well virtual transparencies, lighting, and shadows can be mixed into the equation.

Below is a small playlist of videos showing some of the things I have been working with; it also demonstrates the shared calibration between multiple networked VR HMDs and trackers. Besides capturing myself in mixed reality, I have included a 3D 360 view of the hardware I’ve been using to make this all possible. That video can also be viewed in a VR HMD such as Google Cardboard/Daydream, or with the VR video players available for PC VR. Apologies for the somewhat rambling narration in the videos; some things I wanted to communicate didn’t come out as clearly as I had hoped. I didn’t prepare anything formal to say and really just wanted to document some of these ideas in video form, but if you read this post, it should be clearer what is happening ; )
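To make the depth-based mixing mentioned above a bit more concrete, here is a rough Python/NumPy sketch, my own simplification rather than the pipeline used in the videos. It composites a rendered frame over a camera frame by comparing per-pixel depth, with the render’s alpha channel standing in for virtual transparencies; all the array names and shapes are assumptions for the example.

```python
import numpy as np

def composite_mixed_reality(camera_rgb, camera_depth, render_rgba, render_depth):
    """Blend a rendered frame into a real camera frame using per-pixel depth.

    camera_rgb   : (H, W, 3) float image from the physical camera
    camera_depth : (H, W)    depth per camera pixel (e.g. from a depth sensor)
    render_rgba  : (H, W, 4) rendered color with alpha for virtual transparencies
    render_depth : (H, W)    depth of the rendered geometry in the same units
    """
    # Virtual pixels only win where the virtual surface is closer than the real one.
    virtual_in_front = render_depth < camera_depth
    alpha = render_rgba[..., 3:4] * virtual_in_front[..., None]

    # Alpha-blend so semi-transparent virtual objects still show the real scene behind them.
    return alpha * render_rgba[..., :3] + (1.0 - alpha) * camera_rgb

# Tiny synthetic example: a 2x2 frame where only the left column's virtual pixel is closer.
cam_rgb = np.full((2, 2, 3), 0.5)
cam_depth = np.ones((2, 2))
ren_rgba = np.concatenate([np.ones((2, 2, 3)), np.full((2, 2, 1), 0.8)], axis=-1)
ren_depth = np.array([[0.5, 2.0], [0.5, 2.0]])
print(composite_mixed_reality(cam_rgb, cam_depth, ren_rgba, ren_depth))
```

The real videos involve camera calibration, latency matching, and lighting/shadow work on top of this, but the per-pixel depth comparison is the core of how real and virtual imagery get interleaved.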