Face Swap

In the lead-up to a new permanent gallery opening at the Australian Museum, I created an interactive Snapchat-style filter application that allows visitors to swap their face with one of three famous Australians. The app was built in openFrameworks and is derived from an experiment by Kyle McDonald and Arturo Castro. The installation consists of two touchscreens powered by Mac minis housed inside a mobile enclosure. The app can email an image to the user, allowing it to be shared on social media. With a background in industrial design, I was able to produce CAD drawings of the enclosure and consult with the cabinet makers who manufactured the custom unit. The app has run continuously for over 14 months without any crashes or maintenance, and is used daily by visitors to the Museum.


I worked within a team to build this site. My responsibility was the store location map, fan finds and blog pages. The site uses AngularJS, Google Maps and Masonry, with a WordPress/Sitefinity backend.

Audi Q5

I created this 3D scene of the Audi Q5 as a side project during some downtime at work. The scene was created in Three.js and the model was purchased from TurboSquid. For a tech demo, it was hooked up to a Leap Motion controller so you could use your hand to interact with the camera.
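The hand-to-camera mapping can be sketched as below: a minimal, self-contained version that converts a Leap Motion palm position into an orbiting camera position. The palm ranges in millimetres and the fixed orbit radius are assumptions for illustration, not the values used in the demo.

```javascript
// Sketch: map a Leap Motion palm position (mm, as reported by the
// controller) to spherical orbit coordinates for a Three.js camera.
// The clamping ranges and radius below are illustrative assumptions.
function palmToOrbit(palmX, palmY) {
  // Normalise palm x (-200..200 mm) to an azimuth of -PI..PI.
  const azimuth = (Math.max(-200, Math.min(200, palmX)) / 200) * Math.PI;
  // Normalise palm height (100..400 mm above the device) to 0..PI/2 elevation.
  const t = (Math.max(100, Math.min(400, palmY)) - 100) / 300;
  const elevation = t * (Math.PI / 2);
  const radius = 5; // fixed orbit distance from the car model
  return {
    x: radius * Math.cos(elevation) * Math.sin(azimuth),
    y: radius * Math.sin(elevation),
    z: radius * Math.cos(elevation) * Math.cos(azimuth),
  };
}
```

In the render loop you would then do something like `camera.position.set(p.x, p.y, p.z); camera.lookAt(scene.position);` each frame.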

Smartest A4 Paper

This is a prototype for a digital engagement that allowed a piece of plain paper to interact with content on a mobile device. Data was transferred from electronics embedded in the paper to the app via audio-encoded messages known as Chirps, which were decoded using the Chirp SDK. The circuit was designed to be only 3 mm thick and sit in the spine on the edge of the paper. Engineers were engaged and manufacturers in China identified.
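To give a feel for the audio-encoding idea, here is a toy sketch that maps each symbol of a payload to a tone frequency. The alphabet, base frequency and tone spacing are purely illustrative assumptions; the actual Chirp protocol is the SDK's own and differs from this.

```javascript
// Toy sketch of audio-encoded messaging: each payload character maps
// to one tone. BASE_HZ, STEP_HZ and ALPHABET are assumptions for
// illustration, not the real Chirp protocol.
const BASE_HZ = 1760;  // assumed base tone
const STEP_HZ = 64;    // assumed spacing between symbol tones
const ALPHABET = '0123456789abcdefghijklmnopqrstuv'; // 32 symbols

function encodeToTones(payload) {
  return [...payload].map((ch) => {
    const i = ALPHABET.indexOf(ch);
    if (i === -1) throw new Error(`symbol not in alphabet: ${ch}`);
    return BASE_HZ + i * STEP_HZ;
  });
}

// encodeToTones('a1') → [2400, 1824]
```

A receiver would run the inverse lookup on detected tone peaks to recover the payload.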


This dancing cockatoo was a response to a challenge to replicate Pablo the Flamingo. There are a few things going on here to achieve the effect. The rendering is done with Three.js, and I used Blender to rig the bird model from a flat image. The motion was achieved by mapping vertices to a "soft-body" structure using PhysicsJS. It is possible to change the audio that the bird dances to. To get the volume levels from the audio, I preprocessed the audio track with SoundJS; this creates a bitmap image where each pixel colour corresponds to a volume level at a point in time in the track. This bird briefly appeared on the contact us page of the old 303 Lowe website.
This sketch demonstrates how the bird is modelled on a soft-body physics structure.
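The volume-preprocessing step can be sketched as follows: reduce the track's samples to one greyscale value (0–255) per fixed-size window, so the animation can look up the current volume by index at runtime. The window size and the sample range are assumptions; the real pipeline wrote these values out as pixel colours in a bitmap.

```javascript
// Sketch of the volume-extraction step: one RMS value per window of
// samples, quantised to a 0-255 "pixel" value. windowSize is an
// illustrative assumption; samples are assumed normalised to -1..1.
function samplesToVolumePixels(samples, windowSize = 1024) {
  const pixels = [];
  for (let start = 0; start < samples.length; start += windowSize) {
    const win = samples.slice(start, start + windowSize);
    const rms = Math.sqrt(win.reduce((s, v) => s + v * v, 0) / win.length);
    // RMS of -1..1 samples is 0..1, so scale straight to a byte.
    pixels.push(Math.min(255, Math.round(rms * 255)));
  }
  return pixels;
}
```

At playback time, frame `n` of the animation simply reads `pixels[n]` to drive how hard the soft body bounces.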

Golden Helmet

This was put together as an April Fools' joke for Harley. It came together in one evening; it just so happened that I was working on a 3D model of a bike helmet that same day, so I was able to get the page up quickly.

The model was sourced from TurboSquid and is about 30,000 polygons, which is a good size for WebGL. I used Three.js to bring the model in and apply the different textures, materials and shaders to its meshes. It also works on mobile, where the rotation can be controlled by tilting the phone.
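The tilt control boils down to mapping the browser's `deviceorientation` angles onto the model's rotation. A minimal sketch, assuming the standard beta (front/back) and gamma (left/right) angles in degrees and a clamped ±45° input range (the clamp is my assumption, not the shipped value):

```javascript
// Sketch: map deviceorientation angles (degrees) to model rotation
// (radians). The +/-45 degree clamp is an illustrative assumption.
function tiltToRotation(beta, gamma) {
  const clamp = (v, lim) => Math.max(-lim, Math.min(lim, v));
  return {
    x: (clamp(beta, 45) / 45) * (Math.PI / 4),  // front/back tilt
    y: (clamp(gamma, 45) / 45) * (Math.PI / 4), // left/right tilt
  };
}
```

In the event handler you would apply it as `helmet.rotation.x = r.x; helmet.rotation.y = r.y;`.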


I was part of a team that created a campaign website for the Intergenerational Report (IGR) in 2015. The project features 360° video sequences that transition the site between different states around Australia, like a road trip. Capturing these images involved scraping sequences of high-res panorama image tiles from Google. I used some JavaScript to extract panorama IDs from Google Street View into a JSON object, then built an AIR app to download, compile and save out the image tiles from each pano location.
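The compile step is essentially laying a grid of tiles onto one large canvas. A sketch of that layout calculation, assuming 512 px square tiles arranged in a 2^zoom × 2^(zoom−1) grid (a common panorama tiling scheme, but an assumption about this particular source):

```javascript
// Sketch of compiling a panorama from tiles: enumerate each tile's
// grid position and pixel offset for one zoom level. The 512 px tile
// size and 2^zoom x 2^(zoom-1) grid shape are assumptions.
function panoTileLayout(zoom, tileSize = 512) {
  const cols = 2 ** zoom;
  const rows = 2 ** (zoom - 1);
  const tiles = [];
  for (let y = 0; y < rows; y++) {
    for (let x = 0; x < cols; x++) {
      tiles.push({ x, y, px: x * tileSize, py: y * tileSize });
    }
  }
  return { width: cols * tileSize, height: rows * tileSize, tiles };
}
```

The downloader then fetches each `(x, y)` tile for a panorama ID and draws it at `(px, py)` on the output image.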
I had a bit of fun with these images and created a stereographic projection version here:
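The stereographic ("little planet") effect is an inverse mapping: for each pixel of the square output image, work out which longitude/latitude of the equirectangular panorama to sample. A sketch of that lookup, where the field-of-view scaling factor is an assumption I chose for illustration:

```javascript
// Sketch: inverse stereographic lookup. For an output pixel (px, py)
// in a size x size image, return the source pixel on a panoW x panoH
// equirectangular panorama. The scale factor of 2 is an assumption.
function stereographicLookup(px, py, size, panoW, panoH) {
  // Centre the output pixel and scale to roughly -2..2.
  const x = ((px / size) * 2 - 1) * 2;
  const y = ((py / size) * 2 - 1) * 2;
  const r = Math.sqrt(x * x + y * y);
  const theta = Math.atan2(y, x);              // longitude around the "planet"
  const phi = 2 * Math.atan(1 / (r || 1e-9));  // latitude from the radius
  return {
    sx: Math.floor(((theta + Math.PI) / (2 * Math.PI)) * (panoW - 1)),
    sy: Math.floor((phi / Math.PI) * (panoH - 1)),
  };
}
```

Running this per output pixel (the centre of the image maps to the ground directly beneath the camera) produces the curled-up planet look.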

Hotline Pudding

Many agencies put out an eDM or microsite to celebrate the season. We decided this year to do something a little digital and a little unusual: we made a singing Christmas pudding.

The pudding runs on a Bare Conductive development board. The code was substantially modified to play back a series of MIDI tracks, which were re-encoded into a custom format so the main melody sequence could be played back by tapping the pudding. Using a capacitive touch sensor, a single wire connected to the pudding provided a good conductive interface.
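The tap-to-melody idea can be sketched like this: each touch advances a cursor through a pre-encoded note sequence and returns the pitch to play. The note sequence is made up for illustration, and the actual custom format on the board differed; the MIDI-note-to-frequency formula, though, is the standard one (A4 = MIDI note 69 = 440 Hz).

```javascript
// Sketch: each tap plays the next note of a pre-encoded melody.
// The melody array is illustrative; midiToHz is the standard formula.
const melody = [60, 62, 64, 65, 67]; // assumed note sequence
let cursor = 0;

function midiToHz(note) {
  return 440 * Math.pow(2, (note - 69) / 12); // A4 (note 69) = 440 Hz
}

function onTouch() {
  const note = melody[cursor];
  cursor = (cursor + 1) % melody.length;
  return midiToHz(note);
}
```

On the board, the returned frequency would drive the synth/speaker output; here it just demonstrates the sequencing logic.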

The plinths, made up in my kitchen, housed an internal LED floodlight controlled by the MIDI sequencer to give active visual feedback, while a small stereo amplifier provided deep sound and a pleasing haptic response.

Acura Connected Living

This multitouch AIR application features the full suite of Acura product information, presented in a rich and engaging interactive environment. Content is synchronised over the cloud, making updates effortless. I was involved in a recent addition to the application: a particle-effect attractor state controlled by Kinect cameras. The camera was mounted below the screen and monitored the movement of people nearby. Users' positions were translated into particle attractors, creating the personal effect of being able to casually influence the graphics on the digital screens.

I used the in-spirit open-source fluid solver library to create the visual effect, with a few modifications to the behaviour and other visual effects. The fluid solver renders to bitmap data and performed remarkably well. This was another re-use of an existing AIR Kinect application that plugged in with minimal setup.
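The position-to-attractor step can be sketched as below: place an attractor at the user's horizontal position across the screen, then give each particle an inverse-square pull toward it. The anchor height, strength constant and minimum distance are illustrative assumptions, not values from the installation.

```javascript
// Sketch: turn a tracked user's horizontal position (normalised 0..1
// across the camera's view) into a particle attractor. The mid-height
// anchor, strength and distance floor are illustrative assumptions.
function attractorFromUser(normX, screenW, screenH) {
  return { x: normX * screenW, y: screenH * 0.5 };
}

function applyAttraction(particle, attractor, strength = 2000, dt = 1 / 60) {
  const dx = attractor.x - particle.x;
  const dy = attractor.y - particle.y;
  const distSq = Math.max(dx * dx + dy * dy, 100); // floor avoids a singularity
  const dist = Math.sqrt(distSq);
  // Inverse-square pull, applied to velocity each frame.
  particle.vx += (dx / dist) * (strength / distSq) * dt;
  particle.vy += (dy / dist) * (strength / distSq) * dt;
}
```

Each frame, every tracked person yields one attractor and every particle gets `applyAttraction` called once per attractor, so people walking past the screen drag the particles along with them.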