Category Archives: Experiments

3:03 Friday

At MullenLowe we put out an internal weekly eDM for all staff. I designed and managed the eDM and created a unique header animation for each issue. I used this as an excuse for some creative coding, and for most of the issues I used Canvas and WebGL effects as the basis for the looping GIF animations.

These are some example executions; click on each image to see the effect:



Accelerometer experiment

This is a demonstration of a WebGL effect controlled by an Arduino. The objective was to pipe real-time data from a MEMS accelerometer through to a streaming data socket that JavaScript on a webpage could use. Using a WebGL GLSL shader, real-time accelerometer data is passed into the vertex shader as a data attribute to modulate a Simplex 3D noise function. The effect is that the page visually reacts to any nearby vibrations, such as those created by a person walking by. The tech stack combines a number of technologies to achieve the effect:

  • Accelerometer (ADXL345)
  • I2C Data protocol
  • Arduino
  • USB Serial Data
  • Node (Express, SerialPort)
  • WebGL (GLSL)
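A minimal sketch of the data path, assuming the Arduino prints comma-separated "x,y,z" readings (in g) on each serial line; `parseAccelLine` and `vibrationMagnitude` are illustrative helpers, not part of any SDK:

```javascript
// Parse one serial line into numeric axes; returns null on garbage.
function parseAccelLine(line) {
  const parts = line.trim().split(',').map(Number);
  if (parts.length !== 3 || parts.some(Number.isNaN)) return null;
  const [x, y, z] = parts;
  return { x, y, z };
}

// Reduce a reading to a single vibration magnitude (deviation from 1 g),
// suitable for passing into the vertex shader as a uniform/attribute.
function vibrationMagnitude({ x, y, z }) {
  const g = Math.sqrt(x * x + y * y + z * z);
  return Math.abs(g - 1.0);
}

// On the Node side this would sit in a SerialPort line handler and be
// pushed to the page over a socket, roughly:
//   parser.on('data', line => {
//     const a = parseAccelLine(line);
//     if (a) socket.send(JSON.stringify({ v: vibrationMagnitude(a) }));
//   });
```

At rest the magnitude hovers near zero, so the noise modulation only kicks in when the sensor is disturbed.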





MapBox WebGL

I have done many executions with maps, including creating a number of custom map tile renderers; however, using an API is always easier. Google Maps is dominant on the web, but I try to use MapBox whenever I can. The API allows shapes to be extruded up from the map surface. This is typically used for buildings and landmarks, but it can also be driven by data. In this experiment I'm using GDP data for Thailand. It includes figures at different region levels, which are linked to map zoom levels.
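A layer spec of the kind Mapbox GL JS accepts via `map.addLayer()` gives the flavour of the extrusion setup. The source name `thailand-gdp` and the per-region `gdp` property are assumptions about the data, not the original files:

```javascript
// Extrude each region by its GDP value and colour it on a ramp.
const gdpLayer = {
  id: 'gdp-extrusion',
  type: 'fill-extrusion',
  source: 'thailand-gdp', // assumed GeoJSON source with a 'gdp' property
  paint: {
    // Scale the GDP figure into metres of extrusion height.
    'fill-extrusion-height': ['*', ['get', 'gdp'], 50],
    'fill-extrusion-color': [
      'interpolate', ['linear'], ['get', 'gdp'],
      0, '#2c7fb8',
      10000, '#f03b20'
    ],
    'fill-extrusion-opacity': 0.8
  }
};
// In the page: map.addLayer(gdpLayer);
```

Swapping the source for a coarser or finer region level at different zooms gives the zoom-linked behaviour described above.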

Clearly there is a strong city/country divide in Thailand.



Audi Q5

I created this 3D scene for the Audi Q5 as a side project during some downtime at work. The scene is built in Three.js and the model was purchased from TurboSquid. As a tech demo, it was paired with a Leap Motion controller so you could use your hand to interact with the camera.
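One way to drive an orbit camera from hand tracking is to map palm position to spherical angles. This is a hypothetical mapping, not the original demo's code; the ±200 mm palm range is a rough guess at the Leap's useful field:

```javascript
// Map a palm position (mm) to orbit-camera angles (radians).
function palmToOrbit(palmX, palmY) {
  const clamp = (v, lo, hi) => Math.min(hi, Math.max(lo, v));
  // Horizontal hand motion orbits around the car.
  const azimuth = clamp(palmX / 200, -1, 1) * Math.PI;
  // Vertical hand motion tilts the view above the floor.
  const polar = clamp(palmY / 200, 0, 1) * (Math.PI / 3);
  return { azimuth, polar };
}

// Convert the angles to a camera position at a fixed radius,
// ready to copy into camera.position with lookAt(0, 0, 0).
function orbitToPosition({ azimuth, polar }, radius) {
  return {
    x: radius * Math.cos(polar) * Math.sin(azimuth),
    y: radius * Math.sin(polar),
    z: radius * Math.cos(polar) * Math.cos(azimuth)
  };
}
```

Clamping keeps the camera from flipping when the hand leaves the sensor's sweet spot.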



Smartest A4 Paper

This is a prototype for a digital engagement that allowed a piece of plain paper to interact with content on a mobile device. Data was transferred from electronics embedded in the paper to the app via audio-encoded messages known as Chirps, which were decoded using the Chirp SDK. The circuit was designed to be only 3 mm thick and sit in the spine along the edge of the paper. Engineers were engaged and manufacturers in China identified.





This dancing cockatoo was a response to a challenge to replicate Pablo the Flamingo. There are a few things going on here to achieve the effect. The rendering is done with Three.js, and I used Blender to rig the bird model from a flat image. The motion was achieved by mapping vertices to a "soft-body" structure using Physics.JS. It is possible to change the audio that the bird dances to. To get the volume levels from the audio, I preprocessed the track with Sound.JS; this creates a bitmap image where each pixel colour corresponds to a volume level at a time point in the audio track. This bird briefly appeared on the contact-us page of the old 303 Lowe website.
This sketch demonstrates how the bird is modelled on a soft-body physics structure.
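The soft-body idea can be sketched with plain Verlet particles joined by distance constraints, so a push at one point ripples through the body. This is a library-free toy version of the technique, not the actual Physics.JS setup:

```javascript
// A particle stores its current and previous position; velocity is implicit.
function makeParticle(x, y) {
  return { x, y, px: x, py: y };
}

// Verlet integration step: carry momentum forward with a little damping.
function verletStep(p, damping = 0.98) {
  const vx = (p.x - p.px) * damping;
  const vy = (p.y - p.py) * damping;
  p.px = p.x; p.py = p.y;
  p.x += vx; p.y += vy;
}

// Pull two particles back toward their rest distance (a "spring").
// Mesh vertices bound to these particles inherit the jiggle.
function satisfyConstraint(a, b, rest) {
  const dx = b.x - a.x, dy = b.y - a.y;
  const d = Math.hypot(dx, dy) || 1e-9;
  const diff = (d - rest) / d / 2;
  a.x += dx * diff; a.y += dy * diff;
  b.x -= dx * diff; b.y -= dy * diff;
}
```

Driving one anchor particle with the audio volume level, and letting the constraints propagate the motion, is enough to get a convincing dance.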



Hotline Pudding

Many agencies put out an eDM or microsite to celebrate the season. This year we decided to do something a little digital and a little unusual: we made a singing Christmas pudding.

The pudding is running a Bare Conductive development board. The code was substantially modified to play back a series of MIDI tracks, which were re-encoded into a custom format so that the main melody sequence could be played back by tapping the pudding. Using a capacitive touch sensor, a single wire connected to the pudding provided a good conductive interface.
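The re-encoding step can be illustrated as flattening the melody track's note-on events into a simple step list the board walks through, advancing one note per tap. The event shape and output here are assumptions for illustration, not the real firmware format:

```javascript
// Given note-on events as { tick, note } pairs (as a MIDI parser might
// emit them), produce the ordered note sequence for tap playback.
function tapSequence(noteOnEvents) {
  return noteOnEvents
    .slice()                          // don't mutate the input
    .sort((a, b) => a.tick - b.tick)  // restore musical order
    .map(e => e.note);                // keep only the pitch per step
}
```

Because timing is discarded, the rhythm comes entirely from the taps, which is what makes playing the pudding fun.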

The plinths, made up in my kitchen, housed an internal LED floodlight controlled by the MIDI sequencer for active visual feedback, while a small stereo amplifier provided a deep sound and a pleasing haptic response.






RGBD Experiments

RGBD Toolkit is an application suite developed to produce 3D video by combining the Kinect depth camera with a DSLR or webcam. Effectively, it maps colour video frames onto a 3D point-cloud mesh. The software lets you modify various effects such as the size and brightness of the point-cloud nodes, mesh lines, video frame and depth of field.

It is a fairly painful process to build a rig that couples the Kinect with a camera and then calibrate the two lenses. However, once the lenses are calibrated, capturing sequences is easy. The toolkit was built in openFrameworks and has calibration, capture and edit components.
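The core of the colour-onto-depth mapping can be shown under a simple pinhole model: unproject a depth pixel to a 3D point, then project it into the colour image to find which pixel to sample. The intrinsics (fx, fy, cx, cy) are what the calibration step recovers; the values below are placeholders, and a real pipeline also applies the extrinsic transform between the two cameras, omitted here:

```javascript
// Depth pixel (u, v) with depth in metres -> 3D point in camera space.
function unprojectDepth(u, v, depth, { fx, fy, cx, cy }) {
  return {
    x: (u - cx) * depth / fx,
    y: (v - cy) * depth / fy,
    z: depth
  };
}

// 3D point -> pixel coordinates in the colour camera's image.
function projectToColor({ x, y, z }, { fx, fy, cx, cy }) {
  return {
    u: (x / z) * fx + cx,
    v: (y / z) * fy + cy
  };
}
```

Run per depth pixel, this yields the textured point cloud the toolkit then lets you style.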

The software was later used by the motion graphics team in a Samsung Smart TV launch.




These are some experiments I made with openFrameworks. I used Josh Noble's book "Programming Interactivity" to get me started. The first example is a replication of a piece by the Sydney developer Lukasz Karluk, and the others are reworkings of some AS3 pieces. I often like to replicate works I have seen as a personal challenge and to help focus my efforts.