Scion MegaWall

The Scion MegaWall was an application built for the 2014 auto shows and ran in LA, Detroit and New York. The application was built to integrate into a video wall via a show control system (Pandora). It communicated with the show control system via a set of custom JSON messages sent over TCP. The application could be reconfigured on the fly to run in a number of different modes, all controlled via XML configuration. This allowed the video wall to transition seamlessly from brand video to the application and back.
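The messaging itself can be very lightweight; a minimal Python sketch of newline-framed JSON over TCP in the style described above (the command names are illustrative, not the actual Pandora protocol):

```python
import json
import socket

def encode_message(payload):
    # Frame each JSON message with a trailing newline so the receiver
    # can split the TCP byte stream back into individual messages.
    return json.dumps(payload).encode("utf-8") + b"\n"

def decode_messages(buffer):
    # Split a received byte buffer into its JSON messages.
    return [json.loads(line) for line in buffer.split(b"\n") if line]

def send_message(host, port, payload):
    # One-shot send to the show control system over TCP.
    with socket.create_connection((host, port)) as sock:
        sock.sendall(encode_message(payload))
```

Newline framing is the key detail: TCP is a byte stream, so without some delimiter the receiver cannot tell where one JSON document ends and the next begins.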

In addition to the playback sequencing, the wall took a number of live feeds including live weather, RSS, Instagram images and data from other on-site systems. One of these systems was a vending machine that was triggered when someone activated it via NFC. The application could also run uninterrupted through any internet dropouts.

Kinect Video Wall

This project was originally part of the Cisco House pavilion and was built upon for an installation in Cisco's Global Centre of Excellence in Songdo, South Korea. The video wall is still active and is the first digital feature that welcomes guests upon arrival at the centre.

The application is controlled via gesture controls using the Kinect depth camera. The main feature is a library of Cisco brand videos divided into three categories. A user can select a video category with their left hand and then choose a video with their right hand. In addition to the video player, a user can also initiate one of two live video conferences simply by standing on a hotspot marked by a floor decal.
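As a rough sketch of how a tracked hand drives a selection, the hand's horizontal position can be bucketed into the row of categories or videos; the coordinate range here is an assumption, not a value from the installation:

```python
def pick_index(hand_x, num_items, min_x=-0.5, max_x=0.5):
    # Normalise the hand's x position into [0, 1]...
    t = (hand_x - min_x) / (max_x - min_x)
    t = min(max(t, 0.0), 1.0)
    # ...then bucket it into one of the selectable items.
    return min(int(t * num_items), num_items - 1)
```

In practice a selection would also be debounced over several frames so that sensor jitter does not flick between items.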

A second custom Kinect interpreter application was built to support the primary app. Kinect data is optimised and sent to the main application over UDP on the local network. This application has since been reused in many other Kinect-controlled applications.
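The interpreter's job is essentially to compress skeleton data into small packets for the main app. A Python sketch of the idea (the joint names, quantisation and wire format are assumptions):

```python
import json
import socket

def pack_joints(joints):
    # Quantise coordinates to 3 decimal places to keep packets small.
    return json.dumps(
        {name: [round(c, 3) for c in pos] for name, pos in joints.items()}
    ).encode("utf-8")

def send_joints(sock, addr, joints):
    # Fire-and-forget: UDP suits high-rate tracking data, where a
    # dropped frame is immediately replaced by the next one.
    sock.sendto(pack_joints(joints), addr)
```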

Toyota In-Car App

This is a native iOS application developed for the US auto shows. It features a number of product sales presentations and brand media, and runs on iPad minis mounted inside the cars. A set of videos of sales staff was produced for each car, and the application transitions seamlessly between them, including an additional idle video running on a loop. Both English and Spanish language videos were produced for each vehicle.

The iOS application uses a CMS to retrieve content for each vehicle and provides rich user tracking data for each interaction. A problem with networking at large public events is that WiFi cannot be relied upon when the show is running. Consequently the application has a synchronising mechanism where the iPad can be configured at the start of the show and download the required content to function offline. Tracking data is then retained locally and only sent when there is a reliable connection.
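The store-and-forward pattern behind that is straightforward; a minimal Python sketch of an append-only tracking queue (the original was a native iOS app, and the event shape and file format here are assumptions):

```python
import json
import os

class TrackingQueue:
    """Append tracking events to a local file; flush when online."""

    def __init__(self, path):
        self.path = path

    def record(self, event):
        # Append-only writes keep events safe between show sessions.
        with open(self.path, "a") as f:
            f.write(json.dumps(event) + "\n")

    def flush(self, send):
        # send() should raise on network failure; the file is then left
        # intact and everything is retried later (at-least-once delivery).
        if not os.path.exists(self.path):
            return 0
        with open(self.path) as f:
            events = [json.loads(line) for line in f if line.strip()]
        for event in events:
            send(event)
        os.remove(self.path)
        return len(events)
```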

The application was published using an Enterprise licence, which enables deployment over WiFi outside of the App Store. The effect is that the iPads can be set up from scratch in a matter of minutes and then operate autonomously for the full duration of the event.


QR Code Scanner

This is a QR code scanner built in Objective-C. The app was an exercise in using Obj-C and drew on a number of libraries including ZXing, MapKit, CoreLocation and a QR encoder. I was also disappointed with the quality of the QR code scanners available and wanted to contribute an alternative app with a more pleasing design.

Click the image to view the app in the App Store.

RGBD Experiments

RGBD Toolkit is an application suite developed to produce 3D video by combining the Kinect depth camera with a DSLR or webcam. Effectively, it maps colour video frames onto a 3D point-cloud mesh. The software allows you to modify various effects such as the size and brightness of the point-cloud nodes, mesh lines, video frame and depth of field.
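The mapping rests on standard pinhole-camera maths: each depth pixel is back-projected into a 3D point, which is then projected into the colour camera's image plane to look up its colour. A sketch with made-up intrinsics (the toolkit itself is openFrameworks/C++):

```python
def depth_to_point(u, v, depth, fx, fy, cx, cy):
    # Back-project a depth pixel (u, v) into a 3D camera-space point
    # using the depth camera's focal lengths and principal point.
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

def project_to_pixel(point, fx, fy, cx, cy):
    # Project a 3D point into a camera's image plane to sample colour.
    x, y, z = point
    return (fx * x / z + cx, fy * y / z + cy)
```

A real pipeline would also apply the rigid transform between the two lenses (the extrinsics recovered during calibration) before projecting into the colour camera.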

It is a fairly painful process to build a rig that couples the Kinect with a camera and then calibrate the two lenses. Once the lenses are calibrated, however, capturing sequences is easy. The toolkit was built in openFrameworks and has calibration, capture and edit components.

The software was later used by the motion graphics team in a Samsung Smart TV launch.

Lost Diggers

In February 2012 thousands of glass plate negatives were found in an attic in France. These images had been captured over 90 years earlier during the First World War and showed soldiers at rest in the village of Vignacourt during leave from the front. The Australian plates were bought by Kerry Stokes and donated to the Australian War Memorial.

This application was built in ActionScript using the Starling framework. Images were stored locally; however, the application used web services to update and submit comments. A requirement of the project was that it had to run offline, so all successful URL requests were cached locally. Any user-generated comments were also stored locally and synced to the backend whenever an internet connection became active.
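The caching requirement amounts to writing every successful response to disk keyed by URL, then serving the cached copy when a request fails. A Python sketch of the pattern (the original was ActionScript; the names here are illustrative):

```python
import hashlib
import os

class CachedClient:
    def __init__(self, cache_dir, fetch):
        self.cache_dir = cache_dir
        self.fetch = fetch  # callable(url) -> str; raises OSError offline
        os.makedirs(cache_dir, exist_ok=True)

    def _path(self, url):
        # Key cache files by a hash of the URL.
        return os.path.join(self.cache_dir, hashlib.sha1(url.encode()).hexdigest())

    def get(self, url):
        try:
            body = self.fetch(url)
        except OSError:
            # Offline: fall back to the last successful response.
            with open(self._path(url)) as f:
                return f.read()
        # Online: refresh the cache before returning.
        with open(self._path(url), "w") as f:
            f.write(body)
        return body
```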

For me personally this project had significance as my great uncle is buried in the Vignacourt War Cemetery and may well have been one of those lost diggers. I later visited the town of Vignacourt where the photographs were taken to pay my respects.


Tax Calculator App

I've built and continue to maintain a popular Australian tax calculator website. To complement the website I created an iOS version. The app features a single view with collapsible sections that provide more detail on demand. This allows for a very clean and simple UI without having to switch back and forth between different views. In addition, the application checks an online data source for any changes to the configuration, which keeps the app up to date. The application is also universal, running on both iPhone and iPad.
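The update check can be as simple as comparing a version stamp in the remote data against the local copy; a sketch, where the "version" field is my assumption about the data source's shape:

```python
def needs_update(local_cfg, remote_cfg):
    # Refresh only when the remote configuration is strictly newer.
    return remote_cfg.get("version", 0) > local_cfg.get("version", 0)
```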

Click the image below to view the app in the Apple App Store.

London AR Viewer

The application was part of a suite of media experiences for a Cisco pavilion in London during the London Olympic Games. The pavilion was situated on top of a building adjacent to the Olympic park and had a sweeping view of the site. The application was built in Unity, and I worked with a 3D modeller who developed the London model.

On the floor of the building was an abstract decal of the river Thames and the application augmented this floor decal with the digital model. This was accomplished by housing the application in a pivoting screen enclosure that was fitted with a video camera pointing at the floor decal. An angular sensor was used to determine the rotation of the screen and the live video feed was placed behind the model.
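Keeping the model registered with the decal comes down to converting the sensor reading into a yaw angle for the virtual camera. A sketch; the counts-per-revolution and offset are calibration assumptions, not values from the installation:

```python
def screen_yaw(raw_count, counts_per_rev=1024, offset_deg=0.0):
    # Convert a raw rotary-sensor count into the screen's yaw in degrees,
    # wrapping at a full revolution and applying a calibration offset.
    return ((raw_count % counts_per_rev) / counts_per_rev * 360.0 + offset_deg) % 360.0
```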

In addition, a second version of the application was placed outside in a static enclosure. This version could be interacted with via touch and swipe gestures.

Honda AR

This application involved augmenting a view of a vehicle on a turntable with a series of animated vignettes to emphasise the technology and features built into the car.

An Arduino monitored an angular sensor attached to the turntable gearbox and broadcast data over the local network as OSC messages. The Unity application then received this data and synchronised the rotation of the model with the physical car. A touch screen placed in front of the screen allowed a user to call up any of the nine animations. The touchscreen also positioned the user in the ideal spot to view the augmented scene.
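OSC packets are a simple binary format: a null-padded address string, a null-padded type-tag string, then big-endian arguments. A minimal Python sketch of encoding and decoding a single-float message like the turntable angle (the address is illustrative):

```python
import struct

def _pad(data):
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return data + b"\x00" * (4 - len(data) % 4)

def encode_osc_float(address, value):
    return _pad(address.encode()) + _pad(b",f") + struct.pack(">f", value)

def decode_osc_float(packet):
    address = packet[: packet.index(b"\x00")].decode()
    # The single float argument occupies the last four bytes.
    value = struct.unpack(">f", packet[-4:])[0]
    return address, value
```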

The modelling and animations were composed in Maya and exported as a Unity package and then given some shiny paint by some shader experts. This was my first large Unity application and it was a great opportunity to learn the software and C# programming.


openFrameworks Experiments

These are some experiments I made with openFrameworks. I used the book "Programming Interactivity" by Joshua Noble to get started. The first example is a replication of a piece by the Sydney developer Lukasz Karluk, and the others are reworkings of some AS3 pieces. I often like to replicate works I have seen, as a personal challenge and to help focus my efforts.