Designing Technology for the Real World Human Condition.

Designing interactive technology for the real-world human condition is incredibly hard. I often work with advanced AR/VR interfaces that assume many human faculties are working properly, but humans in the real world are not always equipped to meet that expectation. The Reddit post above is my testimonial from helping my 97-year-old grandmother cope with quarantine in a nursing home during the COVID-19 pandemic. I essentially designed a video conferencing solution so she could properly visit, given her very special needs at her age. After creating it, I realized I had learned more about human-computer interaction than I expected: catering to someone with my grandmother's extreme limitations was an extremely good learning experience.

What does it feel like to grab and scale the entire world?

Alice could tell you about Wonderland.

VR locomotion as a whole is tricky, because straying outside of 1:1 tracked motion usually means inducing unexpected motion on the VR user, often causing what is sometimes called simulator sickness by throwing off the user's vestibular system.

Most artificial or unnatural motion schemes leave the user without a proper expectation of the motion they will undergo unless they are given time to learn those expectations. This can make many VR experiences take longer to understand, or limit their overall effective length, which is one reason many early VR experiences targeted short sessions. Though I'd say younger users wrap their brains around such motion interfaces faster, what could we do to alleviate unexpected motion?

One such thing is enabling motion through grabbing and pulling, which can surprisingly help the user expect what they are about to do. While some may get disoriented at first, the brain can often quickly connect to the expectation of grabbing and pulling through space, in my opinion much more easily than pressing a button or joystick to artificially move forward or rotate around an environment.

Why? Well, the user can't possibly know at first how fast a button will move them, or how fast a joystick might rotate them; they'll have to wrap their head around it first. Grabbing the world around them instead gives them an anchor to base their relative movement on.

While it may be too much freedom for many applications, it's certainly powerful to be able to quickly maneuver and visualize from a desired perspective and even scale. It goes to show that 3D spatial understanding is not just complemented in VR, but extended beyond what we are normally capable of experiencing.
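To make the idea concrete, here is a minimal sketch of grab-and-pull locomotion with two-hand world scaling (hypothetical names, rotation ignored, and not the code of any particular project of mine): with one hand held, the rig is moved so the grabbed world point stays pinned under the hand; with two hands, the rig translation and uniform scale are solved so both grabbed points stay pinned, which is what lets the user stretch or shrink the entire world.

    // Minimal sketch of "grab the world" locomotion (rotation ignored), assuming
    // hand positions arrive in tracking (rig-local) space each frame.
    // All names here are hypothetical, not any project's actual code.
    #include <cmath>

    struct Vec3 { float x, y, z; };
    static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec3 operator*(float s, Vec3 a) { return {s * a.x, s * a.y, s * a.z}; }
    static float Length(Vec3 v) { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }

    struct Rig {
        Vec3  position{0, 0, 0};  // world-space origin of the tracked play space
        float scale = 1.0f;       // size of the play space in world units
        Vec3  ToWorld(Vec3 tracked) const { return position + scale * tracked; }
    };

    // One hand held: keep the grabbed world point pinned under the hand, so
    // physically pulling the hand toward you drags you forward through the world.
    void OneHandGrab(Rig& rig, Vec3 prevHand, Vec3 hand) {
        Vec3 grabbedPoint = rig.ToWorld(prevHand);
        rig.position = grabbedPoint - rig.scale * hand;
    }

    // Two hands held: solve for the rig translation and uniform scale that keep
    // both grabbed world points pinned under their hands.
    void TwoHandGrab(Rig& rig, Vec3 prevLeft, Vec3 prevRight, Vec3 left, Vec3 right) {
        Vec3 grabbedLeft  = rig.ToWorld(prevLeft);
        Vec3 grabbedRight = rig.ToWorld(prevRight);
        float handSep = Length(left - right);
        if (handSep < 1e-4f) return;                    // hands too close to solve
        rig.scale = Length(grabbedLeft - grabbedRight) / handSep;
        rig.position = grabbedLeft - rig.scale * left;  // re-pin the left grab point
    }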

Tools for computer-aided design stand to benefit from such free-form locomotion.

Teaching 3D Interactive Mathematics inside VR/AR

In the past I’ve taught various 3D mathematics related to different forms of interactive media and graphics programming, usually communicating it with more traditional methods such as paper and whiteboard drawings. I often found it difficult to help some people visualize the 3D concepts.

Introducing an interactive, collaborative 3D linear algebra teaching tool displayed in mixed reality, designed in the VR/AR Lab at Full Sail University to better visualize and solve 3D mathematics inside VR and AR displays. The application teaches the mathematics behind VR computer science inside a virtual environment where networked users can collaborate.
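To give a sense of the kind of math the tool visualizes (this snippet is purely illustrative and not taken from the application), here is a small sketch that applies a 3x3 transform to the standard basis vectors, the sort of operation a student would see rendered as arrows in the shared space.

    // Illustrative only: apply a 3x3 transform to the standard basis vectors.
    #include <cstdio>

    struct Vec3 { float x, y, z; };
    struct Mat3 { Vec3 r0, r1, r2; };  // rows

    Vec3 Transform(const Mat3& m, const Vec3& v) {
        return { m.r0.x * v.x + m.r0.y * v.y + m.r0.z * v.z,
                 m.r1.x * v.x + m.r1.y * v.y + m.r1.z * v.z,
                 m.r2.x * v.x + m.r2.y * v.y + m.r2.z * v.z };
    }

    int main() {
        // Scale x by 2 and rotate the y/z plane by 90 degrees.
        Mat3 m = { {2, 0, 0}, {0, 0, -1}, {0, 1, 0} };
        Vec3 basis[3] = { {1, 0, 0}, {0, 1, 0}, {0, 0, 1} };
        for (const Vec3& b : basis) {
            Vec3 t = Transform(m, b);
            std::printf("(%g, %g, %g) -> (%g, %g, %g)\n", b.x, b.y, b.z, t.x, t.y, t.z);
        }
        return 0;
    }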

Tracking Calibrations

I teach in the AR/VR Lab at Full Sail University in the Simulation and Visualization BS program, where we commonly cover bringing tracking systems together so that they can properly coexist within the same interactive application. This usually involves some type of calibration routine. Below are two recent projects that incorporate such calibrations to bring other tracking methods into the same tracked space as the VR HMD tracking equipment. Our goal is to equip our students with the skills and techniques necessary to prototype simulation applications, and this kind of calibration is very important to building simulation-based applications such as training applications.

Both projects perform this alignment with a point-based sampling calibration: corresponding points are sampled from both tracking systems and used to construct the frames needed to bring one tracking method into the other's frame, or vice versa.
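As a rough sketch of what a point-based calibration like this can look like (illustrative only, with hypothetical names; a real routine would sample more points and average out measurement noise): sample three non-collinear physical points with each tracking system, build an orthonormal frame from each set of samples, and use the two frames to carry points measured in one system into the other's coordinates.

    // Illustrative three-point frame calibration between two tracking systems.
    #include <cmath>

    struct Vec3 { float x, y, z; };
    struct Frame { Vec3 xAxis, yAxis, zAxis, origin; };  // orthonormal basis + origin

    static Vec3 Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static Vec3 Cross(Vec3 a, Vec3 b) {
        return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
    }
    static Vec3 Normalize(Vec3 v) {
        float len = std::sqrt(Dot(v, v));
        return {v.x / len, v.y / len, v.z / len};
    }

    // Build an orthonormal frame from three non-collinear sampled points:
    // p0 is the origin, p1 lies along +x, and p2 fixes the x/y plane.
    Frame FrameFromPoints(Vec3 p0, Vec3 p1, Vec3 p2) {
        Vec3 x = Normalize(Sub(p1, p0));
        Vec3 z = Normalize(Cross(x, Sub(p2, p0)));
        Vec3 y = Cross(z, x);
        return {x, y, z, p0};
    }

    // Express a point measured in system A's space in system B's space,
    // assuming both frames were built from the SAME physical sample points.
    Vec3 AToB(const Frame& a, const Frame& b, Vec3 pointInA) {
        Vec3 d = Sub(pointInA, a.origin);
        // Coordinates of the point relative to the shared physical frame...
        float u = Dot(d, a.xAxis), v = Dot(d, a.yAxis), w = Dot(d, a.zAxis);
        // ...re-expressed using frame B's basis and origin.
        return { b.origin.x + u * b.xAxis.x + v * b.yAxis.x + w * b.zAxis.x,
                 b.origin.y + u * b.xAxis.y + v * b.yAxis.y + w * b.zAxis.y,
                 b.origin.z + u * b.xAxis.z + v * b.yAxis.z + w * b.zAxis.z };
    }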

I integrated both student projects with software to display them in third-person mixed reality so that the students can demonstrate their systems.

In the dental simulation below, designed by a master's game design student, a bachelor's student has brought Polhemus magnetic tracking together with the HTC Vive tracking.

In this virtual pilot trainer, a bachelor's student uses a Leap Motion as an interactive control panel that someone can touch with their hands in a Boeing cockpit. The Leap Motion hand tracking is then brought together with the HTC Vive tracking using the same calibration.

 

Standalone Vive Tracking Unity Plugin

The Lighthouse tracking system used in the HTC Vive is very good; however, it usually goes hand in hand with using a VR HMD. Since I have applications that sometimes don't require a VR headset but still want the tracking, I built software support for using the standalone Vive Tracker without requiring an HTC Vive HMD.

This includes a calibration routine to build the Cartesian origin for a new tracking space, enabling tracking with just one tracker and one Lighthouse base station.
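One simple way to establish such an origin (a sketch of the general idea, not necessarily the exact routine the plugin uses) is to capture the tracker's pose once at the desired origin and then express every subsequent pose relative to it by pre-multiplying with the inverse of the captured pose.

    // Sketch of re-basing tracker poses relative to a captured origin pose.
    // Poses are rigid transforms: a 3x3 rotation plus a translation.
    struct Pose {
        float r[3][3];  // rotation, row-major
        float t[3];     // translation
    };

    // Invert a rigid transform: R' = R^T, t' = -R^T * t.
    Pose Invert(const Pose& p) {
        Pose inv{};
        for (int i = 0; i < 3; ++i)
            for (int j = 0; j < 3; ++j)
                inv.r[i][j] = p.r[j][i];
        for (int i = 0; i < 3; ++i)
            inv.t[i] = -(inv.r[i][0] * p.t[0] + inv.r[i][1] * p.t[1] + inv.r[i][2] * p.t[2]);
        return inv;
    }

    // Compose two rigid transforms: result = a * b.
    Pose Compose(const Pose& a, const Pose& b) {
        Pose out{};
        for (int i = 0; i < 3; ++i) {
            for (int j = 0; j < 3; ++j)
                out.r[i][j] = a.r[i][0] * b.r[0][j] + a.r[i][1] * b.r[1][j] + a.r[i][2] * b.r[2][j];
            out.t[i] = a.r[i][0] * b.t[0] + a.r[i][1] * b.t[1] + a.r[i][2] * b.t[2] + a.t[i];
        }
        return out;
    }

    // Capture the tracker pose once at the desired origin, then express every
    // later pose relative to it: relative = inverse(originPose) * currentPose.
    Pose ToCalibratedSpace(const Pose& originPose, const Pose& currentPose) {
        return Compose(Invert(originPose), currentPose);
    }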

I also brought the software into the Unity engine as a C++ DLL, because other methods of using the tracker cause Unity to initialize a VR back end for rendering functionality. I then packaged it as a plugin utility on the Unity Asset Store to allow development of non-HMD applications that want the tracking functionality. You can find it at this link. It currently only supports Windows x64, but it can be refactored to support other platforms if desired.
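For reference, polling a Vive Tracker without an HMD generally comes down to initializing OpenVR in a non-scene mode and reading device poses directly, roughly as sketched below (my plugin's internals may differ, and SteamVR must be running and configured so that an HMD is not required).

    // Sketch of polling a Vive Tracker through the OpenVR C++ API without an
    // HMD-centric scene application (illustrative only).
    #include <openvr.h>
    #include <cstdint>
    #include <cstdio>

    int main() {
        vr::EVRInitError err = vr::VRInitError_None;
        // Background init avoids creating a rendering scene application.
        vr::IVRSystem* system = vr::VR_Init(&err, vr::VRApplication_Background);
        if (err != vr::VRInitError_None) {
            std::printf("VR_Init failed: %s\n",
                        vr::VR_GetVRInitErrorAsEnglishDescription(err));
            return 1;
        }

        vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
        system->GetDeviceToAbsoluteTrackingPose(vr::TrackingUniverseStanding, 0.0f,
                                                poses, vr::k_unMaxTrackedDeviceCount);

        for (uint32_t i = 0; i < vr::k_unMaxTrackedDeviceCount; ++i) {
            if (system->GetTrackedDeviceClass(i) != vr::TrackedDeviceClass_GenericTracker)
                continue;
            if (!poses[i].bPoseIsValid)
                continue;
            // The last column of the 3x4 device-to-world matrix is the position.
            const vr::HmdMatrix34_t& m = poses[i].mDeviceToAbsoluteTracking;
            std::printf("tracker %u at (%.3f, %.3f, %.3f)\n",
                        i, m.m[0][3], m.m[1][3], m.m[2][3]);
        }

        vr::VR_Shutdown();
        return 0;
    }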

Animated Puppetry Controlled with VR.

I recently integrated a mixed reality display into another interesting VR application. The puppeteer controls the virtual puppet with the Oculus Rift and can see and hear the audience through a webcam feed visible only inside VR, communicating with them effectively across a second display that does not show the webcam image. The video above demonstrates a rendition of this puppetry with an audience in a real-time mixed reality display, instead of just the typical second-display view of the puppet that the audience would be watching.