AR Sandbox

OVERVIEW

Working with one of only five sandbox exhibits in British Columbia, I had the opportunity to explore new software concepts during a work term under my university’s co-op program. The company used the sandbox as an educational tool, bringing it to schools so students could learn about their surrounding environment.

The sandbox is a 30″ x 40″ table with raised borders to keep the sand within the bin. The sand itself is Santastik sand, which is moldable and safe for sculpting. Above the table sit a short-throw projector and a Kinect device. The two devices work together with open-source software to detect any displacement of the sand and project a colour corresponding to its depth.

ROLE

Sandbox Exhibit Facilitator

UX

4-month co-op term

BACKGROUND

The early version of the sandbox could detect changes in depth within a reasonable response time, but it was not near-instantaneous enough to meet our standards. Inaccurate readings and shifts in the colour palette were common.

GOAL

My co-op term involved simplifying the sandbox's operation and taking charge of setup and takedown whenever it was brought to venues. A secondary goal was to explore post-development features that would further enhance the sandbox's capabilities.

Design Process

Calibration

A full and accurate calibration is needed to connect the playing region of the sandbox with the sensors. The calibration is broken into the following four steps:

  1. Per-pixel depth: surveys depth across the bin and captures surface points
  2. Intrinsic: captures a one-to-one representation of the sand surface from the Kinect sensor
  3. Extrinsic: explicitly measures the sand-surface depth and the corner points of the table
  4. Projection: creates tie points to synchronize the coordinate points of the sand with the projector
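The open-source software performs all of this calibration itself. Purely as an illustration of the projection step, the tie points can be thought of as fitting a planar mapping (a homography) from Kinect coordinates to projector pixels. A minimal sketch, assuming NumPy; the function names and sample points below are my own, not part of the exhibit software:

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate a 3x3 homography mapping src -> dst tie points (DLT method).

    src, dst: (N, 2) arrays of matching points, N >= 4.
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows)
    # The homography is the null vector of A: the smallest right singular vector.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)

def project(H, pts):
    """Apply homography H to (N, 2) points, returning (N, 2) mapped points."""
    pts = np.asarray(pts, float)
    homogeneous = np.column_stack([pts, np.ones(len(pts))])
    mapped = homogeneous @ H.T
    return mapped[:, :2] / mapped[:, 2:3]
```

With at least four tie points measured during calibration, every sand coordinate can then be warped into the projector's pixel space.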

Features

By utilizing the Kinect depth sensor, the AR Sandbox is able to project contour lines that visualize different elevation levels. Through further enhancements, I simulated fluid-motion visual effects such as water, snow, and lava; each fluid has its own parameters governing its natural behaviour. Additionally, I created presets of different colour schemes to convey common biomes and the height of mountainous terrain.
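The elevation colouring above boils down to mapping each depth reading onto a colour ramp and darkening pixels that fall on a contour interval. A minimal sketch, assuming NumPy; the palette stops and interval values here are invented for illustration, not the exhibit's actual presets:

```python
import numpy as np

# Hypothetical elevation colour ramp (not the exhibit's real palette):
# low -> blue (water), mid -> green (land), high -> white (snow caps).
PALETTE = np.array([
    [0.1, 0.3, 0.8],   # water
    [0.2, 0.7, 0.3],   # lowland
    [0.6, 0.5, 0.3],   # mountain
    [1.0, 1.0, 1.0],   # snow
])

def colourize(height, interval=0.1):
    """Map a 2-D height field (values in [0, 1]) to RGB with contour lines."""
    h = np.clip(np.asarray(height, float), 0.0, 1.0)
    # Piecewise-linear interpolation between the palette stops.
    pos = h * (len(PALETTE) - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, len(PALETTE) - 1)
    t = (pos - lo)[..., None]
    rgb = (1 - t) * PALETTE[lo] + t * PALETTE[hi]
    # Darken pixels that sit near a contour line every `interval` of height.
    on_contour = (h % interval) < 0.01
    rgb[on_contour] *= 0.3
    return rgb
```

Swapping in a different `PALETTE` is all a biome preset amounts to; the depth data and contour logic stay the same.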

Another feature I integrated was a digital elevation map (DEM): a three-dimensional graphical layer used to represent mapped terrain surfaces. When the software is fed a TIFF file, the colour scheme changes to red and blue. A red projection indicates the sand is piled too high and needs to be moved toward the blue areas, where the sand level is too shallow. When a surface is correct, the colour turns white. Once the desired accuracy is reached, the facilitator can turn off the feature, and the sculpted result reflects the uploaded image.
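Conceptually, this feedback is a per-cell comparison between the live sand surface and the target DEM. A minimal sketch of the idea, assuming NumPy; the function name and tolerance value are my own, not the exhibit's code:

```python
import numpy as np

def dem_feedback(current, target, tolerance=0.02):
    """Classify each cell of the sand surface against a target DEM.

    Returns an RGB image: red where the sand is too high, blue where it is
    too low, and white where it is within tolerance of the target.
    """
    diff = np.asarray(current, float) - np.asarray(target, float)
    rgb = np.ones(diff.shape + (3,))          # start all white (correct)
    rgb[diff > tolerance] = [1.0, 0.0, 0.0]   # too high -> red
    rgb[diff < -tolerance] = [0.0, 0.0, 1.0]  # too low  -> blue
    return rgb
```

As participants move sand from red regions into blue ones, the projected image converges to white, signalling that the surface matches the uploaded terrain.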

Retrospective Thoughts

While the foundation of the development has been simplified, another opportunity lies in the educational side: lesson plans still need to be scripted for future educators to use during presentations. I had the privilege of collaborating with the creator of the AR Sandbox and the majority of the open-source developers, all of whom helped with troubleshooting and support.

Some of the main challenges were fully understanding the complexity of the GIS background and simplifying it so kids could understand. Much of the feedback we received expressed positive excitement at being able to change a landscape and see immediate results. I'm grateful for the opportunity to contribute to the advancement of augmented reality.