Category Archives: Experiments

Audi Q5

I created this 3D scene of the Audi Q5 as a side project during some downtime at work. The scene is built in Three.js and the model was purchased from TurboSquid. For a tech demo, it was paired with a Leap Motion controller so you could use your hand to interact with the camera.
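The hand-to-camera interaction can be sketched roughly like this. This is my reconstruction, not the original demo code: the function name `palmToCamera` and the Leap interaction-box ranges are assumptions, and the result would be fed to a Three.js camera each frame.

```javascript
// Hypothetical sketch: map a Leap Motion palm position to an orbiting
// camera position around the car. The ranges below approximate the
// Leap interaction box (assumption, not measured from the demo).
function palmToCamera(palmX, palmY, radius) {
  // Normalise palm position: roughly -200..200 mm horizontally,
  // 100..400 mm vertically above the controller.
  const nx = Math.min(Math.max(palmX / 200, -1), 1);
  const ny = Math.min(Math.max((palmY - 250) / 150, -1), 1);

  // Map to spherical angles: azimuth sweeps around the car,
  // elevation tilts the view while staying above the ground plane.
  const azimuth = nx * Math.PI; // -180 deg .. 180 deg
  const elevation = (ny * 0.4 + 0.5) * (Math.PI / 2);

  return {
    x: radius * Math.cos(elevation) * Math.sin(azimuth),
    y: radius * Math.sin(elevation),
    z: radius * Math.cos(elevation) * Math.cos(azimuth),
  };
}

// In Three.js you would copy this into camera.position and then call
// camera.lookAt(sceneCentre) each frame.
const pos = palmToCamera(0, 250, 10);
```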

Smartest A4 Paper

This is a prototype for a digital engagement that allowed a piece of plain paper to interact with content on a mobile device. Data was transferred from electronics embedded in the paper to the app via audio-encoded messages known as Chirps, which were decoded using the Chirp SDK. The circuit was designed to be only 3 mm thick and to be embedded in the spine along the edge of the paper. Engineers were engaged and manufacturers in China identified. Is it dead?
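To give a feel for audio-encoded data, here is an illustrative sketch only. Chirp's actual protocol and SDK internals are proprietary and not what is shown here; the frequencies, names, and nibble scheme below are my own assumptions to show the general idea of turning bytes into a sequence of audible tones.

```javascript
// Illustration, NOT the Chirp protocol: encode each byte as two
// tones, one per 4-bit nibble. BASE_FREQ and STEP are arbitrary
// assumptions for the sketch.
const BASE_FREQ = 1760; // Hz
const STEP = 64;        // Hz per symbol value

function encodeToTones(bytes) {
  const tones = [];
  for (const b of bytes) {
    tones.push(BASE_FREQ + ((b >> 4) & 0x0f) * STEP); // high nibble
    tones.push(BASE_FREQ + (b & 0x0f) * STEP);        // low nibble
  }
  return tones;
}

function decodeTones(tones) {
  const bytes = [];
  for (let i = 0; i < tones.length; i += 2) {
    const hi = Math.round((tones[i] - BASE_FREQ) / STEP);
    const lo = Math.round((tones[i + 1] - BASE_FREQ) / STEP);
    bytes.push((hi << 4) | lo);
  }
  return bytes;
}
```

A real system like Chirp adds a preamble and error correction, and the SDK handles playing and demodulating the tones through the speaker and microphone.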

Cockatoo

This dancing cockatoo was a response to a challenge to replicate Pablo the Flamingo. There are a few things going on here to achieve the effect. The rendering is done with Three.js, and I used Blender to rig the bird model from a flat image. The motion was achieved by mapping vertices to a "soft-body" structure using Physics.js. It is possible to change the audio that the bird dances to. To get the volume levels from the audio, I preprocessed the audio track with Sound.js; this creates a bitmap image where each pixel colour corresponds to a volume level at a time point in the audio track. This bird briefly appeared on the Contact Us page of the old 303 Lowe website.
This sketch demonstrates how the bird is modelled on a soft-body physics structure.
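The volume-bitmap trick can be sketched like this. This is my reconstruction of the idea rather than the original preprocessing code: one greyscale pixel stores one volume sample, so at runtime the dance amplitude is a cheap array lookup instead of live audio analysis.

```javascript
// Sketch (assumed names): bake volume levels into 8-bit pixel values.
function volumesToPixels(volumes) {
  // volumes: array of levels in 0..1, one per time step
  return volumes.map(v => Math.round(Math.min(Math.max(v, 0), 1) * 255));
}

// At runtime, the playhead position picks the pixel for the current
// moment and converts it back to a 0..1 level driving the soft body.
function volumeAt(pixels, t, duration) {
  const i = Math.min(
    pixels.length - 1,
    Math.floor((t / duration) * pixels.length)
  );
  return pixels[i] / 255;
}
```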

Hotline Pudding

Many agencies put out an eDM or microsite to celebrate the season. This year we decided to do something a little digital and a little unusual: we made a singing Christmas pudding.

The pudding runs on a Bare Conductive development board. The code was substantially modified to play back a series of MIDI tracks, which were re-encoded into a custom format so the main melody sequence could be played back by tapping the pudding. Using a capacitive touch sensor, a single wire connected to the pudding provided a good conductive interface.
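The re-encoding idea looks roughly like this. The real board code is Arduino C++ on Bare Conductive's Touch Board and its custom format is not published here, so this is a hypothetical sketch: flatten the melody track's note-on events into an ordered list, and let each tap advance to the next note regardless of original timing.

```javascript
// Hypothetical sketch of the melody re-encoding (not the actual
// board firmware). midiEvents: [{type, note, velocity, time}, ...]
function encodeMelody(midiEvents) {
  return midiEvents
    .filter(e => e.type === "noteOn" && e.velocity > 0)
    .map(e => e.note); // tap playback discards the original timing
}

// Each capacitive-touch tap plays the next melody note, looping
// back to the start when the tune runs out.
function makeTapPlayer(melody) {
  let i = 0;
  return function tap() {
    const note = melody[i % melody.length];
    i += 1;
    return note; // would be sent to the synth on the board
  };
}
```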

The plinths, made up in my kitchen, housed an internal LED floodlight controlled by the MIDI sequencer for active visual feedback, and a small stereo amplifier provided a deep sound and a pleasing haptic response.

RGBD Experiments

RGBD Toolkit is an application suite developed to produce 3D video by combining the Kinect depth camera with a DSLR or webcam. Effectively, it maps colour video frames onto a 3D point-cloud mesh. The software lets you modify various effects, such as the size and brightness of the point-cloud nodes, the mesh lines, the video frame, and the depth of field.

It is a fairly painful process to build a rig that couples the Kinect with a camera and then calibrate the two lenses. Once the lenses are calibrated, however, capturing sequences is easy. The toolkit was built in openFrameworks and has calibration, capture, and edit components.
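What the calibration enables can be sketched with the standard pinhole camera model (this is textbook geometry, not RGBD Toolkit's actual code): a Kinect depth pixel is back-projected to a 3D point using the depth camera's intrinsics, and that point can then be projected into the colour camera's image to fetch its colour.

```javascript
// Standard pinhole model. intrinsics: focal lengths fx, fy and
// principal point cx, cy (values would come from the calibration step).
function depthPixelToPoint(u, v, depthMm, intrinsics) {
  const { fx, fy, cx, cy } = intrinsics;
  const z = depthMm / 1000; // millimetres to metres
  return {
    x: ((u - cx) * z) / fx,
    y: ((v - cy) * z) / fy,
    z,
  };
}

// Project a 3D point back into an image plane to look up its colour.
// A real rig also applies the rotation/translation between the two
// lenses (the extrinsics) before this step.
function projectToColour(point, intrinsics) {
  const { fx, fy, cx, cy } = intrinsics;
  return {
    u: (point.x * fx) / point.z + cx,
    v: (point.y * fy) / point.z + cy,
  };
}
```

With identity extrinsics, back-projecting a pixel and re-projecting it recovers the original pixel coordinates, which is a handy sanity check on the maths.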

The software was later used by the motion graphics team in a Samsung Smart TV launch.

openFrameworks

These are some experiments I made with openFrameworks. I used Josh Noble's book "Programming Interactivity" to get me started. The first example is a replication of a piece by the Sydney developer Lukasz Karluk, and the others are reworkings of some AS3 pieces. I often like to replicate works I have seen as a personal challenge and to help focus my efforts.