An Interactive Technology Blog – Computer Games, Simulation, Interactive Media

Designing Technology for the Real World Human Condition.

Designing interactive technology for the real-world human condition is incredibly hard. I often work with advanced AR/VR interfaces that assume many human faculties are working properly, but humans in the real world are not always fully equipped for that expectation. The above Reddit post is my testimonial from helping my 97-year-old grandmother cope with quarantine in a nursing home during the COVID-19 pandemic. I essentially designed a video conferencing solution so she could properly visit, given her very special needs at her age. After creating it, I realized I had learned a lot about human-computer interaction that I didn't expect: catering to someone like my grandmother, with such extreme limitations, was an extremely good learning experience.

What does it feel like to grab and scale the entire world?

Alice could tell you about Wonderland.

VR locomotion as a whole is tricky, because straying outside 1-to-1 tracked motion usually means inducing unexpected motion on the VR user, often causing what is sometimes called simulation sickness by throwing off the user's vestibular system.

Most artificial or unnatural motion schemes leave the user without a proper expectation of the motion they are about to undergo, unless they are given time to learn those expectations. This can make many VR experiences either take longer to understand or limit their effective length, which is one reason many early VR experiences targeted short sessions. Though I'd say younger users can wrap their brains around such motion interfaces faster, what could we do to alleviate unexpected motion?

One such thing is enabling motion through grabbing and pulling, which can surprisingly help the user anticipate what they are about to do. While some may get disoriented at first, the brain can often quickly connect to the expectation of grabbing and pulling through space; in my opinion this is much easier than pressing a button or joystick to artificially move forward or rotate around an environment.

Why? Well, the user can't possibly know at first how fast a button will move them, or how fast a joystick might rotate them; they'll have to wrap their head around it first. Grabbing the world around them instead gives them an anchor to base their relative movement on.

While it may be too much freedom for many applications, it's certainly powerful: being able to quickly maneuver and visualize from any desired perspective and even scale. It goes to show that 3D spatial understanding is not just complemented in VR, but extended beyond what we are normally capable of experiencing.
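To make the anchoring idea concrete, here is a minimal sketch in plain C++ of one-handed grab-pull and two-handed grab-scale. The names and structure are hypothetical and not taken from any project mentioned here; the key point is simply that the hand is the anchor, so the world moves exactly as far as the hand does.

```cpp
// Minimal 3D vector for the sketch.
struct Vec3 {
    float x = 0, y = 0, z = 0;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
};

// Offset and scale applied to the virtual world relative to the tracked space.
struct WorldTransform {
    Vec3  offset;
    float scale = 1.0f;
};

// One-handed grab-pull, called every frame while the grip is held:
// the world is dragged by the hand's frame-to-frame motion, so the grabbed
// point stays under the hand and the user always knows how far they moved.
void GrabPullUpdate(WorldTransform& world,
                    const Vec3& handPosLastFrame,
                    const Vec3& handPosThisFrame) {
    world.offset = world.offset + (handPosThisFrame - handPosLastFrame);
}

// Two-handed grab-scale, called every frame while both grips are held:
// the ratio of the distance between the hands now versus last frame scales
// the world, like a pinch-zoom gesture performed in 3D.
void GrabScaleUpdate(WorldTransform& world,
                     float handDistLastFrame,
                     float handDistThisFrame) {
    if (handDistLastFrame > 1e-4f)
        world.scale *= handDistThisFrame / handDistLastFrame;
}
```

A fuller version would also pivot the scale around the midpoint between the hands, but the anchoring principle stays the same.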

Tools for computer-aided design stand to benefit from such free-form locomotion.

Teaching 3D Interactive Mathematics inside VR/AR

In the past I’ve taught various 3D mathematics related to different forms of interactive media and graphics programming, usually communicating it with more traditional methods such as paper and whiteboard-drawn representations. I often found it difficult to help some people visualize certain 3D concepts.

Introducing an interactive, collaborative 3D Linear Algebra teaching tool displayed in mixed reality, designed at Full Sail University in the VR/AR Lab to better visualize and solve 3D mathematics inside VR and AR displays. The interactive application essentially teaches mathematics related to VR computer science inside a virtual environment where networked users can collaborate.
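For a flavor of the content, here is the kind of worked example such a tool can animate in 3D rather than on a whiteboard: a rotation matrix applied to a vector. This is a throwaway C++ sketch for illustration, not code from the application itself.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Multiply a 3x3 matrix (row-major) by a 3D vector.
Vec3 Multiply(const float m[3][3], const Vec3& v) {
    return { m[0][0] * v.x + m[0][1] * v.y + m[0][2] * v.z,
             m[1][0] * v.x + m[1][1] * v.y + m[1][2] * v.z,
             m[2][0] * v.x + m[2][1] * v.y + m[2][2] * v.z };
}

int main() {
    // Rotate the +X axis 90 degrees about +Z; it should land on +Y.
    const float a = 3.14159265f / 2.0f;
    const float c = std::cos(a), s = std::sin(a);
    const float rotZ[3][3] = { { c, -s, 0 },
                               { s,  c, 0 },
                               { 0,  0, 1 } };
    const Vec3 r = Multiply(rotZ, Vec3{1, 0, 0});
    std::printf("(%.2f, %.2f, %.2f)\n", r.x, r.y, r.z); // prints roughly (0, 1, 0)
}
```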

Tracking Calibrations

I teach in the AR/VR Lab at Full Sail University in the Simulation and Visualization BS, where we commonly cover bringing tracking systems together so they can properly coexist within the same interactive application. This usually involves some type of calibration routine. Below are two recent projects that incorporate such calibrations to bring other tracking methods into the same tracked space as the VR HMD tracking equipment. Our goal is to equip students with the skills and techniques necessary to prototype simulation applications, and this kind of calibration is very important for simulation-based work such as training applications.

Both projects do this with a point-based sampling calibration: corresponding points are sampled from both tracking systems to build the frames needed to bring one tracking method into the other's frame, or vice versa.
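To give a rough idea of the underlying math, here is a sketch of one standard way to solve this kind of point-pair calibration, the Kabsch/SVD method, written in C++ with Eigen. It illustrates the general technique rather than the students' actual calibration code: sample the same physical points in both tracking systems, solve for the best-fit rigid transform between the two sample sets, then use that transform to express one system's poses in the other's frame.

```cpp
#include <Eigen/Dense>
#include <vector>

// Best-fit rigid transform mapping points sampled in tracking frame A onto the
// same physical points sampled in tracking frame B. Apply it as: p_B = R * p_A + t.
struct RigidTransform {
    Eigen::Matrix3d R;
    Eigen::Vector3d t;
};

RigidTransform SolveCalibration(const std::vector<Eigen::Vector3d>& pointsA,
                                const std::vector<Eigen::Vector3d>& pointsB) {
    // Centroids of both sample sets.
    Eigen::Vector3d cA = Eigen::Vector3d::Zero(), cB = Eigen::Vector3d::Zero();
    for (size_t i = 0; i < pointsA.size(); ++i) { cA += pointsA[i]; cB += pointsB[i]; }
    cA /= double(pointsA.size());
    cB /= double(pointsB.size());

    // Cross-covariance of the centered point sets.
    Eigen::Matrix3d H = Eigen::Matrix3d::Zero();
    for (size_t i = 0; i < pointsA.size(); ++i)
        H += (pointsA[i] - cA) * (pointsB[i] - cB).transpose();

    // The SVD of the cross-covariance gives the rotation that best aligns the sets.
    Eigen::JacobiSVD<Eigen::Matrix3d> svd(H, Eigen::ComputeFullU | Eigen::ComputeFullV);
    Eigen::Matrix3d R = svd.matrixV() * svd.matrixU().transpose();
    if (R.determinant() < 0) {                 // guard against a reflection solution
        Eigen::Matrix3d V = svd.matrixV();
        V.col(2) *= -1.0;
        R = V * svd.matrixU().transpose();
    }

    return { R, cB - R * cA };
}
```

With more sample pairs the fit averages out tracking noise, which is why these routines typically sample many points rather than the bare minimum.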

I integrated both student projects with software to display them in third-person mixed reality so the students can demonstrate their systems.

In the Dental Simulation below, designed by a master's game design student, a bachelor's student has brought Polhemus magnetic tracking together with the HTC Vive tracking.

In this Virtual Pilot Trainer, a bachelor's student uses a Leap Motion as an interactive control panel that someone can touch with their hands in a Boeing cockpit. The Leap Motion hand tracking is then brought together with the HTC Vive tracking via this calibration.


Standalone Vive Tracking Unity Plugin

The Lighthouse tracking system used in the HTC Vive is very good; however, it usually goes hand in hand with using a VR HMD. Since I have applications that sometimes don't require a VR headset but still want the tracking, I built software support for using the standalone Vive Tracker without requiring an HTC Vive HMD.

This includes a calibration routine to build the Cartesian origin for a new tracking space, enabling tracking with just one tracker and one Lighthouse base emitter.

I also brought the software into the Unity engine as a C++ DLL, because other methods for using the tracker cause Unity to initialize a VR back end for rendering functionality. I then packaged this as a plugin utility on the Unity Asset Store to allow development of non-HMD applications that want the tracking functionality. You can find it at this Link. It currently only supports Windows x64, but it can be refactored to support other platforms if desired.
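For anyone curious what this looks like at the API level, here is a simplified C++ sketch of the core idea using the OpenVR API: initialize as a non-scene application so no HMD or VR rendering back end is spun up, then read generic tracker poses directly. This is an illustration rather than the plugin's source, and getting SteamVR itself to run without a headset needs its own configuration that is omitted here.

```cpp
#include <openvr.h>
#include <cstdio>

int main() {
    // Initialize OpenVR as a non-scene ("other") application, so no VR
    // rendering back end is required just to read poses.
    vr::EVRInitError err = vr::VRInitError_None;
    vr::IVRSystem* system = vr::VR_Init(&err, vr::VRApplication_Other);
    if (err != vr::VRInitError_None) {
        std::printf("VR_Init failed: %d\n", (int)err);
        return 1;
    }

    // Grab a snapshot of all tracked device poses and print the generic trackers.
    vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
    system->GetDeviceToAbsoluteTrackingPose(vr::TrackingUniverseRawAndUncalibrated,
                                            0.0f, poses, vr::k_unMaxTrackedDeviceCount);

    for (vr::TrackedDeviceIndex_t i = 0; i < vr::k_unMaxTrackedDeviceCount; ++i) {
        if (system->GetTrackedDeviceClass(i) != vr::TrackedDeviceClass_GenericTracker)
            continue;
        if (!poses[i].bPoseIsValid)
            continue;
        const vr::HmdMatrix34_t& m = poses[i].mDeviceToAbsoluteTracking;
        // Translation is the last column of the 3x4 device-to-world matrix.
        std::printf("Tracker %u at (%.3f, %.3f, %.3f)\n", i, m.m[0][3], m.m[1][3], m.m[2][3]);
    }

    vr::VR_Shutdown();
    return 0;
}
```

A calibration routine like the one described above would then remap these raw poses into whatever Cartesian origin the application defines.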

Animated Puppetry Controlled with VR.

I recently integrated a mixed reality display into another interesting VR application. The puppeteer controls the virtual puppet with the Oculus Rift and is able to see and hear his audience through a web camera feed visible only inside VR, communicating with them across a second display that does not show the camera image. The video above demonstrates a rendition of this puppetry with an audience in a real-time mixed reality display, instead of just the typical second-display view of the puppet the audience would otherwise see.

Networked Collaborative VR Application Mixed Reality

So I’m back with an update on displaying mixed/augmented reality, specifically for collaborative networked VR applications. Compared with other forms of VR, collaborative VR applications have been gaining popularity recently as a successful enhancement over other kinds of collaborative software. Many companies have been pushing to create networked virtual offices, meeting rooms, and social spaces in VR, and success has also been found in networked VR games that emphasize collaborative gameplay spaces.

So I decided to bring the two worlds together and display more than one VR setup or user in the mixed reality view. Doing a multi-user application in mixed reality adds extra complexity, because many collaborative networked VR applications allow players to pass through each other in the virtual environment, since the users don't normally share the same physical space. That isn't acceptable when two or more people share the same real-world space captured by the same camera used to demonstrate the collaboration in mixed reality; the display should show both users interacting 1-to-1 in the same space. This can be solved if all the networked computers and VR setups agree on the same calibrated space, which lets multiple users interact in the same real-world and virtual space at the same time without colliding with each other. Right now this is easiest to achieve with the HTC Vive, because the same Lighthouse emitters can be shared by each Vive HMD's sensors and the same calibration routine can be shared between multiple machines.

Since I originally started doing mixed reality display of VR, I have been pushing the limits of current depth-sensing fidelity to see how accurately I can mix the real-world camera images and 3D graphics together, and looking at how well virtual transparencies, lighting, and shadows can be mixed into the equation. Below is a small playlist of videos showing some of the things I've been working with, which also demonstrates the shared calibration between multiple networked VR HMDs and trackers. Besides capturing myself in mixed reality in these videos, I have also included a 3D 360 view of the hardware I've been using to make this all possible; that video can be viewed in a VR HMD such as Google Cardboard/Daydream, or with VR video players available for PC VR. Apologies for the somewhat rambling narration: some things I wanted to communicate didn't come out as clearly as I wanted. I didn't prepare anything formal to say and really just wanted to document some of these ideas in video form, but if you read this, it should be clearer what is happening ; )

Mixing Realities

I’ve been working on a real-time mixed reality setup with the goal of having an easy way to modify student VR applications so they can be displayed in real time to multiple outputs simultaneously, for recording or live demonstration. Below, Dan Mapes, a director I work with at Full Sail University, demonstrates some software I integrated to display in mixed reality. The application shown was used for student internships visualizing database information relationships in VR.

A few recently developed pieces of technology were incorporated together to make this happen. One is a stereo 3D depth-sensing camera called the ZED, made by Stereolabs, which uses parallax between computer-vision-matched images to build a per-pixel depth at 1920×1080 @ 30 Hz. The other is the recently released Vive Tracker, used to track the ZED camera's position and orientation in the real world.
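The parallax-to-depth relationship itself is one line of math. The sketch below is purely illustrative, since the ZED SDK does this internally and simply hands back a depth map:

```cpp
// For a rectified stereo pair with focal length f (in pixels) and baseline B
// (in meters), a feature that shifts by `disparityPx` pixels between the left
// and right images lies at depth f * B / disparity, in meters.
float DepthFromDisparity(float focalLengthPx, float baselineMeters, float disparityPx) {
    return disparityPx > 0.0f ? (focalLengthPx * baselineMeters) / disparityPx : 0.0f;
}
```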


The real-time 3D graphics use a perspective projection whose field of view matches that of the ZED camera images, and the virtual camera is positioned and oriented in the 3D space based on the Vive Tracker's pose.
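Matching the virtual projection to the physical camera boils down to building the perspective matrix from the camera's reported field of view and aspect ratio rather than from arbitrary values. A generic OpenGL-style version looks roughly like this (illustrative only; engines such as Unity wrap this up for you):

```cpp
#include <cmath>

// Build a standard OpenGL-style perspective projection from a vertical field of
// view (radians), aspect ratio, and near/far planes. Feeding in the ZED camera's
// reported FOV and aspect keeps the virtual frustum aligned with the real one.
void PerspectiveFromFov(float fovYRadians, float aspect, float zNear, float zFar,
                        float out[4][4]) {
    const float f = 1.0f / std::tan(fovYRadians * 0.5f);
    for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c)
            out[r][c] = 0.0f;
    out[0][0] = f / aspect;
    out[1][1] = f;
    out[2][2] = (zFar + zNear) / (zNear - zFar);
    out[2][3] = (2.0f * zFar * zNear) / (zNear - zFar);
    out[3][2] = -1.0f;
}
```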

The ZED camera delivers a typical RGB color image but also provides a depth per pixel, and this depth is matched to the units of the 3D virtual world. The scales of the images rendered inside the 3D world and captured from the camera are mathematically tracked and matched, so that when Dan scales the world around him, he and the virtual objects correctly show which is in front and which is behind.

Besides using depth to mix realities, Dan is also surrounded by green curtains that are used to filter in the overarching 3D background graphics around him. This isn't always necessary when the depth of the virtual 3D objects is in front of the depths detected by the ZED camera; sometimes you may want to show more of the real world from the camera and only show virtual objects sitting on top of or in front of the real objects.
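The compositing rule itself is simple enough to sketch per pixel. In practice it runs on the GPU in a shader, but a plain C++ version of the decision reads as follows; the names here are hypothetical, and the real work is the bookkeeping described above that keeps both depth sources in the same units:

```cpp
#include <cstdint>
#include <vector>

// Per-pixel mixed reality composite: keep whichever of the real camera sample
// and the rendered virtual sample is closer to the camera. Both depth buffers
// are assumed to already be expressed in the same world units (meters here).
struct RGBA { uint8_t r, g, b, a; };

void CompositeMixedReality(const std::vector<RGBA>&  cameraColor,
                           const std::vector<float>& cameraDepth,     // from the depth camera
                           const std::vector<RGBA>&  virtualColor,
                           const std::vector<float>& virtualDepth,    // linearized scene depth
                           const std::vector<bool>&  greenScreenMask, // true = treat as background
                           std::vector<RGBA>&        output) {
    for (size_t i = 0; i < output.size(); ++i) {
        // On the green curtain the virtual background always shows through;
        // otherwise the nearer of the two samples wins.
        const bool showVirtual = greenScreenMask[i] || virtualDepth[i] < cameraDepth[i];
        output[i] = showVirtual ? virtualColor[i] : cameraColor[i];
    }
}
```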

Because the virtual camera is tracked in real time, it can actually be moved while the application is running, as if a camera operator wants to change angles. Below is a student VR "fear of heights" experience, shown in real time, that I also integrated with the mixed reality setup.

The recording of Dan’s VR perspective and the mixed reality view in the first video were both captured in HD, at a combined 3840×1080 resolution. This was just barely achieved on a modern 4-core/8-thread i7 processor, because the video encoding running alongside the application sometimes pushed CPU utilization to 100% during rapid movement or on-screen changes. The issue can be fixed by using a processor with more threads, or by using a second GPU for encoding so it doesn't interfere with the VR and mixed reality being rendered concurrently. With that said, I plan on stress-testing the mixed reality setup by live-streaming the video capture with a slightly better computer.

VR is about to be Great!

I demoed all three new VR HMD setups during GDC 2016. PlayStation VR was physically user friendly, but has slightly lower image quality compared to the new Rift and Vive, no doubt tweaked for cost and PS4 compatibility. That aside, Sony's VR games are really good by design; I liked the party games the most. I demoed the new Oculus Rift with EVE: Valkyrie and it was extremely smooth, with less latency than PlayStation VR. I had no motion sickness because the game varies acceleration very gradually and subtly; I was spinning all over the place with no problem. I then got to try the Vive in the back room of a bar, with a very interesting guy also putting a haptic vest on me. It ran smooth as hell and had some really fancy audio. Turns out I got lucky, because this guy has 20 years of experience with VR in the military. All three experiences were VR by design, and all of the software and hardware is the real deal now. No more shoehorned experiences tossed into VR, no more motion-sickness-inducing first experiences; everything is smoothly 1-to-1. It's all gonna be great. I also got to see a planetarium movie inside mobile VR made by a local university; the 3D movie illustrated the recent satellite landing on the comet.

It’s been almost 3 years since I started researching and programming VR-related graphics back with the Rift DK1, and now it's about to get huge. I'm set to get some multi-GPU rendering going for VR soon.

3D Dice Simulator on Android Mobile Phones, Tablets, and Wearable Watches.

On Google Play


All 6 RPG Dice.
Highly Detailed Physics with Sound – Integrated with Accelerometer and Touch
Complete Color Customization
Android Watch Wearable Bonus Application – Roll Dice in your Watch Wearable!
Note: Press and hold wearable watch screen to switch between dice selection and roll mode. Swipe-To-Dismiss as usual. Best on wearable watches with Android 5.0+.