Project Description

During my internship at Veldis Experience Pte Ltd, one of the projects I worked on was a motion-tracked VR training simulation for the Republic of Singapore Air Force. This project was done in collaboration with Aviation Learn Pte Ltd.

The training simulation was made for aircraft launch crew to learn and practice pre-flight routines and safety checks.

Details
VR and Real-time Motion Tracking Training Simulator
3DVIA Studio, C++, ART
Mar 2012 – Jul 2012, 4-month timeline
Being a Programming Intern

As the company I worked at was small, I was able to play an important role in projects despite being an intern.

As one of the on-site developers, some of the tasks I was assigned were:

  • Setting up and calibrating the VR and motion tracking systems.
  • Creating and conducting unit, integration and system tests and test cases.
  • Programming intuitive controls for a non-traditional input/control setup.
  • Ensuring the features I built did not push the game below its target frame rate and that gameplay stayed smooth.
Developing and Designing for Non-Traditional Input

One of my specific tasks was to create a way of detecting whether the user was grabbing an object. As the motion capture setup tracked the fingers, the input fed into the game engine resembled an in-game human hand rig.

 

The hand data mapped to a 3D rig looked something like that.

The initial solution used basic trigonometry to calculate the angle between adjacent finger segments at each joint; if those angles fell below a threshold, the hand was considered closed. However, the finger tracking device only provided data for the fingertips, not for each joint, so the joint positions and angles had to come from IK simulations, which made this method too expensive computationally.
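
Roughly, the angle check worked along these lines (the Vec3 type, joint layout and threshold below are illustrative stand-ins, not the actual 3DVIA Studio or ART code):

// Illustrative only: Vec3 and the joint layout are stand-ins, not the
// actual 3DVIA Studio / ART data structures.
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 operator-(const Vec3& a, const Vec3& b) {
    return {a.x - b.x, a.y - b.y, a.z - b.z};
}
static float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}
static float Length(const Vec3& v) { return std::sqrt(Dot(v, v)); }

// Angle (in radians) at the middle joint formed by three consecutive joints.
static float JointAngle(const Vec3& base, const Vec3& mid, const Vec3& tip) {
    Vec3 a = base - mid;
    Vec3 b = tip - mid;
    float c = Dot(a, b) / (Length(a) * Length(b));
    return std::acos(std::max(-1.0f, std::min(1.0f, c))); // clamp for float error
}

// A finger counts as curled when both of its interior joint angles fall
// below the threshold; a hand whose fingers are all curled is "closed".
static bool IsFingerCurled(const Vec3 joints[4], float thresholdRadians) {
    for (int i = 1; i <= 2; ++i) {
        if (JointAngle(joints[i - 1], joints[i], joints[i + 1]) >= thresholdRadians) {
            return false;
        }
    }
    return true;
}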


In the end, a simpler solution using sphere collision checks was found. By adding a large sphere at the palm and smaller spheres at the fingertips, overlap between most of the fingertip spheres and the palm sphere indicated a closed hand. The results were not as accurate as the angle-based approach, but the check performed 20% faster.
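
In rough C++ terms, the check looked something like the sketch below (the Sphere struct, radii and "most fingers" rule are illustrative assumptions, not the project's actual code):

// Illustrative only: the Sphere struct, radii and "most fingers" rule are
// assumptions, not the project's actual code.
struct Vec3 { float x, y, z; };

struct Sphere {
    Vec3 center;
    float radius;
};

// Two spheres overlap when the distance between their centres is no more
// than the sum of their radii (compared squared, to avoid a sqrt).
static bool Overlaps(const Sphere& a, const Sphere& b) {
    float dx = a.center.x - b.center.x;
    float dy = a.center.y - b.center.y;
    float dz = a.center.z - b.center.z;
    float r  = a.radius + b.radius;
    return dx * dx + dy * dy + dz * dz <= r * r;
}

// The hand is treated as closed when most of the fingertip spheres overlap
// the large palm sphere.
static bool IsHandClosed(const Sphere& palm, const Sphere fingertips[5],
                         int requiredOverlaps = 4) {
    int overlapping = 0;
    for (int i = 0; i < 5; ++i) {
        if (Overlaps(palm, fingertips[i])) {
            ++overlapping;
        }
    }
    return overlapping >= requiredOverlaps;
}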

ART 5 depth of field hand tracker used.
Optical Motion Tracking

The main issue with optical motion tracking was that markers, both active and passive, would sometimes be occluded from the IR cameras, so the data fed into the game engine was not always clean or accurate.
To handle this, I wrote a simple dead reckoning system that checked the incoming data: if a marker's new position exceeded a distance threshold from its last known position, an extrapolated point was used instead.
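
A rough C++ sketch of such a filter is shown below (the MarkerFilter class, its threshold and the constant-velocity extrapolation are illustrative assumptions, not the actual implementation):

// Illustrative only: MarkerFilter, its threshold and the constant-velocity
// extrapolation are assumptions, not the actual implementation.
struct Vec3 { float x, y, z; };

class MarkerFilter {
public:
    explicit MarkerFilter(float maxJumpDistance)
        : m_maxJumpSq(maxJumpDistance * maxJumpDistance) {}

    // Returns the position to feed into the engine this frame.
    // dt is the time since the previous sample and is assumed to be > 0.
    Vec3 Update(const Vec3& measured, float dt) {
        if (!m_hasHistory) {
            m_position = measured;
            m_hasHistory = true;
            return m_position;
        }

        // Where the marker would be if it kept its last known velocity.
        Vec3 predicted = { m_position.x + m_velocity.x * dt,
                           m_position.y + m_velocity.y * dt,
                           m_position.z + m_velocity.z * dt };

        float dx = measured.x - m_position.x;
        float dy = measured.y - m_position.y;
        float dz = measured.z - m_position.z;
        float jumpSq = dx * dx + dy * dy + dz * dz;

        if (jumpSq > m_maxJumpSq) {
            // Likely an occlusion glitch: use the extrapolated point instead.
            m_position = predicted;
        } else {
            // Plausible sample: accept it and update the velocity estimate.
            m_velocity = { dx / dt, dy / dt, dz / dt };
            m_position = measured;
        }
        return m_position;
    }

private:
    Vec3  m_position{0.0f, 0.0f, 0.0f};
    Vec3  m_velocity{0.0f, 0.0f, 0.0f};
    float m_maxJumpSq;
    bool  m_hasHistory = false;
};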
