Animated Puppetry Controlled with VR.

I recently integrated a mixed reality display into another interesting VR application.  The puppeteer controls the virtual puppet with the Oculus Rift, and can see and hear the audience through a web camera feed that is visible only in VR, communicating with them through a second display that does not show the webcam image.  The video above demonstrates this puppetry with an audience in a real-time mixed reality display, rather than the typical second-display view of the puppet that the audience would normally see.
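
For anyone curious how one machine can feed two different displays like this, here is a minimal sketch of the general idea: each object carries a layer bitmask, and each output view culls by its own mask, so the webcam quad can be drawn in the HMD but not on the audience display.  All the names here are hypothetical, not taken from the actual application.

```cpp
// Minimal sketch: per-view layer culling so the webcam feed renders only in
// the HMD view, never on the audience-facing display. Names are illustrative.
#include <cstdint>
#include <cstdio>
#include <string>
#include <vector>

enum Layer : uint32_t { WORLD = 1u << 0, PUPPET = 1u << 1, WEBCAM_FEED = 1u << 2 };

struct Renderable { std::string name; uint32_t layers; };

void renderView(const char* viewName, uint32_t cullMask,
                const std::vector<Renderable>& scene) {
    std::printf("%s:\n", viewName);
    for (const auto& r : scene)
        if (r.layers & cullMask)  // draw only layers this view accepts
            std::printf("  draws %s\n", r.name.c_str());
}

int main() {
    std::vector<Renderable> scene = {
        {"stage", WORLD}, {"puppet", PUPPET}, {"audience webcam quad", WEBCAM_FEED}};
    renderView("HMD (puppeteer)", WORLD | PUPPET | WEBCAM_FEED, scene);
    renderView("Audience display", WORLD | PUPPET, scene);  // webcam image excluded
    return 0;
}
```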

Networked Collaborative VR Application Mixed Reality

So I’m back with an update on displaying mixed/augmented reality, specifically for collaborative networked VR applications.  Compared with other forms of VR, collaborative applications have been gaining popularity recently, proving to be quite a successful enhancement over traditional collaborative software.  Many companies are pushing to create networked virtual offices, meeting rooms, and social spaces in VR, and networked VR games that emphasize collaborative gameplay have found success as well.

So I decided to bring the two worlds together by displaying more than one VR setup, or user, in the mixed reality display.  A multi-user application in mixed reality adds extra complexity: many collaborative networked VR applications allow players to pass through each other in the virtual environment, because the users don’t actually share the same physical space.  That doesn’t work when two or more people share the same real-world space captured by the single camera used to demonstrate the collaboration, and the mixed reality display should also show both users properly interacting one-to-one in the same space.  This can be solved if all the networked computers and VR setups agree on the same calibrated space, which lets multiple users interact in the same real-world and virtual space at the same time without colliding with each other.  Right now this is easiest to achieve with the HTC Vive, because the same Lighthouse base stations can be shared by every Vive HMD’s sensors, and the same calibration can be shared between multiple machines.
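
To make the shared-space idea concrete, here is a minimal C++ sketch of how a common calibration transform might be applied.  It assumes each machine receives the same tracking-space-to-room matrix (with a shared Lighthouse setup it can literally be copied between machines); the matrix values and names are purely illustrative, not the actual OpenVR API.

```cpp
// Minimal sketch: applying one shared calibration transform so multiple
// networked VR machines agree on a single physical/virtual space.
#include <array>
#include <cstdio>

using Mat4 = std::array<std::array<float, 4>, 4>;

// 4x4 multiply, row-major: out = a * b.
Mat4 mul(const Mat4& a, const Mat4& b) {
    Mat4 out{};
    for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c)
            for (int k = 0; k < 4; ++k)
                out[r][c] += a[r][k] * b[k][c];
    return out;
}

int main() {
    // Shared calibration: maps this machine's raw tracking space into the
    // common room space. Every networked client applies the same matrix.
    Mat4 trackingToRoom = {{{1,0,0,0.5f}, {0,1,0,0}, {0,0,1,-2.0f}, {0,0,0,1}}};

    // Raw HMD pose reported by the local tracking system (placeholder values).
    Mat4 rawHmdPose = {{{1,0,0,0}, {0,1,0,1.7f}, {0,0,1,0}, {0,0,0,1}}};

    // A user standing at a given real-world spot now appears at the same
    // virtual spot on all machines, so users can interact without colliding.
    Mat4 roomPose = mul(trackingToRoom, rawHmdPose);
    std::printf("HMD in shared room space: (%.2f, %.2f, %.2f)\n",
                roomPose[0][3], roomPose[1][3], roomPose[2][3]);
    return 0;
}
```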

Since I originally started doing mixed reality displays of VR, I have been pushing the limits of current depth sensing fidelity to see how accurately I can mix real-world camera images and 3D graphics together.  I have also been looking at how well virtual transparencies, lighting, and shadows can be mixed into the equation.  Below is a small playlist of videos showing some of the things I’ve been working on, which also demonstrates the shared calibration between multiple networked VR HMDs/trackers.  Besides capturing myself in mixed reality during these videos, I have included a 3D 360° view of the hardware I’ve been working with to make this all possible.  That video can also be viewed in a VR HMD such as Google Cardboard/Daydream, or with VR video players available for PC VR.  Apologies for the somewhat rambling narration in the videos; some things I wanted to communicate didn’t come out as clearly as I wanted them to.  I didn’t prepare formal remarks and really just wanted to document some of these ideas in video form, but if you read this, it should be clearer what is happening ; )

Mixing Realities

I’ve been working on a real-time mixed reality setup with the goal of having an easy way to modify student VR applications so they can be displayed in real time to multiple outputs simultaneously, whether for recording or for live demonstration.  Below, Dan Mapes, a director I work with at Full Sail University, demonstrates some software I integrated to display in mixed reality.  The application shown came out of student internships and is used to visualize relationships in database information in VR.

What is used here is a few recently developed pieces of hardware that I incorporated together to make this happen.  One is a stereo 3D depth sensing camera called the ZED, made by Stereolabs.  This camera uses the parallax between features matched across its two lenses to build a per-pixel depth map at 1920×1080@30Hz.  The other is the recently released Vive Tracker, used to properly track the ZED camera’s position and orientation in the real world.
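
For the curious, here is a minimal sketch of the stereo triangulation principle the ZED relies on: depth falls out of focal length, lens baseline, and per-pixel disparity.  The actual SDK does the feature matching and hands you a depth map directly; the numbers below are illustrative, not the camera’s real calibration.

```cpp
// Minimal sketch of stereo depth-from-disparity, the principle behind the ZED.
#include <cstdio>

// Depth (meters) from disparity (pixels), given focal length in pixels
// and the baseline between the two lenses in meters.
float depthFromDisparity(float focalPx, float baselineM, float disparityPx) {
    if (disparityPx <= 0.0f) return -1.0f;  // no match -> unknown depth
    return focalPx * baselineM / disparityPx;
}

int main() {
    // Hypothetical values: ~1400 px focal length at 1080p, 12 cm baseline.
    float f = 1400.0f, b = 0.12f;
    float disparities[] = {120.0f, 60.0f, 30.0f};
    for (float d : disparities)
        std::printf("disparity %5.1f px -> depth %.2f m\n",
                    d, depthFromDisparity(f, b, d));
    return 0;
}
```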

What happens is the real-time 3D graphics use a perspective projection whose field of view matches that of the ZED camera’s lens.  The virtual camera is also positioned and oriented in the 3D space based on the Vive Tracker’s pose.
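
To illustrate the projection matching, here is a minimal sketch that builds a standard perspective matrix from a vertical FOV and aspect ratio, as you would to mirror the physical camera’s lens.  The 60° FOV is an assumed placeholder, not the ZED’s actual spec.

```cpp
// Minimal sketch: matching the virtual camera's projection to a physical lens
// by building a standard perspective matrix from vertical FOV and aspect.
#include <array>
#include <cmath>
#include <cstdio>

using Mat4 = std::array<std::array<float, 4>, 4>;

// Right-handed, OpenGL-style perspective projection (row-major storage).
Mat4 perspective(float vFovRadians, float aspect, float zNear, float zFar) {
    float f = 1.0f / std::tan(vFovRadians * 0.5f);
    Mat4 m{};
    m[0][0] = f / aspect;
    m[1][1] = f;
    m[2][2] = (zFar + zNear) / (zNear - zFar);
    m[2][3] = (2.0f * zFar * zNear) / (zNear - zFar);
    m[3][2] = -1.0f;
    return m;
}

int main() {
    // Illustrative numbers: ~60 degree vertical FOV at 16:9.
    Mat4 proj = perspective(60.0f * 3.14159265f / 180.0f, 16.0f / 9.0f,
                            0.1f, 100.0f);
    std::printf("proj[1][1] (focal scale) = %.3f\n", proj[1][1]);
    return 0;
}
```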

The ZED camera pulls in a typical RGB color image, but also provides a depth per pixel.  This depth is matched to the units of the 3D virtual world.  The scales of the images rendered inside the 3D world and captured from the camera are mathematically tracked and matched, so that when Dan scales the world around him, he and the virtual objects still correctly show which is in front and which is behind.
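
A minimal sketch of that per-pixel comparison might look like the following; in practice this happens in a shader, and the worldScale factor stands in for Dan scaling the world around himself.  All values are illustrative.

```cpp
// Minimal sketch: per-pixel depth compositing between camera feed and render.
#include <cstdint>
#include <cstdio>

struct Pixel { uint8_t r, g, b; };

// Pick camera or virtual color per pixel by comparing depths in common units.
Pixel composite(Pixel cameraColor, float cameraDepthMeters,
                Pixel virtualColor, float virtualDepthUnits,
                float worldScale /* virtual units per real meter */) {
    // Convert camera depth into virtual-world units before comparing, so
    // occlusion stays correct even when the world is scaled up or down.
    float cameraDepthUnits = cameraDepthMeters * worldScale;
    return (cameraDepthUnits < virtualDepthUnits) ? cameraColor : virtualColor;
}

int main() {
    Pixel cam{200, 180, 160}, virt{30, 90, 220};
    // Person 1.5 m from the camera, virtual object 4 units deep, world at 2x:
    Pixel out = composite(cam, 1.5f, virt, 4.0f, 2.0f);
    std::printf("winner: %s\n", out.r == cam.r ? "camera (person in front)"
                                               : "virtual object");
    return 0;
}
```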

Besides using depth to mix realities, Dan is also surrounded by green curtains, which are used to filter in the overarching background 3D graphics around him.  This isn’t always necessary if the virtual 3D objects sit in front of the depths detected by the ZED camera; sometimes you may want to show more of the real world from the camera and only show virtual objects sitting on top of or in front of the real objects.
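
A minimal keying test along those lines might look like this; the thresholds are made-up placeholders, and a production keyer would work in a more perceptual color space than raw RGB.

```cpp
// Minimal sketch: green-screen keying used alongside depth. If a camera pixel
// is "green enough", show the virtual background instead of the camera image.
#include <cstdint>
#include <cstdio>

struct RGB { uint8_t r, g, b; };

bool isGreenScreen(RGB p) {
    // Green clearly dominates both red and blue -> treat as keyed background.
    return p.g > 100 && p.g > p.r + 40 && p.g > p.b + 40;
}

RGB keyComposite(RGB cameraPixel, RGB virtualPixel) {
    return isGreenScreen(cameraPixel) ? virtualPixel : cameraPixel;
}

int main() {
    RGB curtain{40, 200, 60}, skin{200, 160, 140};
    std::printf("curtain -> %s\n", isGreenScreen(curtain) ? "virtual" : "camera");
    std::printf("skin    -> %s\n", isGreenScreen(skin) ? "virtual" : "camera");
    return 0;
}
```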

Because the virtual camera is tracked in real time, it can actually be moved while the application is running, as if a camera operator were changing angles.  Below is a student VR “fear of heights” experience, running in real time, that I also integrated with the mixed reality setup.

The recording of Dan’s VR perspective and the mixed reality view in the first video were both captured in HD, at a combined 3840×1080 resolution.  This was just barely achievable on a modern 4-core/8-thread i7, because the video encoding running alongside the application processing sometimes pushed CPU utilization to 100% during rapid movement or on-screen changes.  The issue can be rectified by using a processor with more threads, or by using a second GPU for encoding so it doesn’t interfere with the VR and mixed reality currently being rendered.  With that said, I plan to stress test the mixed reality setup by live streaming the video capture with a slightly better computer.

VR is about to be Great!

I demoed all three new VR HMD setups during GDC 2016. PlayStation VR was physically user friendly, but has slightly lower image quality compared to the new Rift and Vive, no doubt a tradeoff for cost and PS4 compatibility. That aside, Sony’s VR games are really good by design; I liked the party games the most. I demoed the new Oculus Rift with EVE: Valkyrie, and it was extremely smooth, with less latency than PlayStation VR. I had no motion sickness because the game varies acceleration gradually and subtly; I was spinning all over the place with no problem. I then got to try the Vive in the back room of a bar, with a very interesting guy also putting a haptic vest on me. It ran smooth as hell and had some really fancy audio. Turns out I got lucky, because this guy has 20 years of experience with VR in the military. All three experiences were VR by design, and all of the software and hardware is the real deal now. No more shoehorned experiences tossed into VR, no more motion-sickness-inducing first experiences; everything is smoothly one-to-one. It’s all gonna be great. I also got to see a planetarium movie inside mobile VR made by a local university; the 3D movie illustrated the recent satellite landing on the comet.

It’s been almost three years since I started researching and programming VR-related graphics, back with the Rift DK1, and now it’s about to get huge. I’m set to get some multi-GPU rendering going for VR soon.

3D Dice Simulator on Android Mobile Phones, Tablets, and Wearable Watches.

On Google Play

All 6 RPG Dice.
Highly Detailed Physics with Sound – Integrated with Accelerometer and Touch
Complete Color Customization
Android Wear Bonus Application – Roll Dice on your Wearable Watch!
Note: Press and hold the wearable watch screen to switch between dice selection and roll mode. Swipe to dismiss as usual. Works best on wearable watches with Android 5.0+.