All posts by robshearing

IGR

I was part of a team that created a campaign website for the Intergenerational Report (IGR) in 2015. The project features 360° video sequences that transition the site between different states around Australia, like a road trip. Capturing these images involved scraping sequences of high-res panorama image tiles from Google. I used some JavaScript to extract panorama IDs from Google Street View into a JSON object, then built an Air app to download, compile and save out the image tiles from each pano location.
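The original scraper was split between browser JavaScript and the Air app, but the core download loop is easy to sketch. The tile endpoint, zoom level and grid size below are assumptions about how Street View panorama tiles were served at the time, not the exact URLs the project used, and `panoIds.json` stands in for the JSON object of panorama IDs scraped in the browser.

```typescript
import { promises as fs } from "fs";

// Assumed tile endpoint; the project's real URLs may have differed.
const TILE_URL = "https://cbk0.google.com/cbk?output=tile";

// At this assumed zoom level a panorama is roughly an 8 x 4 grid of 512px tiles.
const ZOOM = 3;
const COLS = 8;
const ROWS = 4;

async function downloadPano(panoId: string): Promise<void> {
  for (let y = 0; y < ROWS; y++) {
    for (let x = 0; x < COLS; x++) {
      const url = `${TILE_URL}&panoid=${panoId}&zoom=${ZOOM}&x=${x}&y=${y}`;
      const res = await fetch(url);
      if (!res.ok) throw new Error(`Tile ${x},${y} failed for ${panoId}`);
      const buf = Buffer.from(await res.arrayBuffer());
      await fs.writeFile(`tiles/${panoId}_${y}_${x}.jpg`, buf);
    }
  }
}

async function main(): Promise<void> {
  // panoIds.json: the list of panorama IDs extracted from Street View.
  const panoIds: string[] = JSON.parse(await fs.readFile("panoIds.json", "utf8"));
  await fs.mkdir("tiles", { recursive: true });
  for (const id of panoIds) {
    await downloadPano(id); // tiles are later stitched into one panorama image
  }
}

main().catch(console.error);
```

Stitching the downloaded grid back into a single panorama image, the job the Air app did, is then just a matter of compositing the tiles onto one canvas in row/column order.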
I had a bit of fun with these images and created a stereographic projection version here:


Hotline Pudding

Many agencies put out an eDM or microsite of their own to celebrate the season. This year we decided to do something a little digital and a little unusual: we made a singing Christmas pudding.

The pudding runs on a Bare Conductive development board. The board's code was substantially modified to play back a series of MIDI tracks, which were re-encoded into a custom format so that the main melody sequence could be played back by tapping the pudding. A single wire connected to the board's capacitive touch sensor made the pudding itself a good conductive interface.
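The real firmware runs on the board itself, but the playback idea is simple: the melody is flattened out of the MIDI file into an ordered list of notes, and each tap advances a pointer and plays the next one. The sketch below shows that logic in TypeScript purely for illustration; the note values and the `playNote`/`onTouch` hooks are hypothetical stand-ins for the board's MIDI and capacitive-touch APIs.

```typescript
// A melody pre-extracted from a MIDI track: pitch (MIDI note number) and duration.
interface MelodyNote { pitch: number; durationMs: number; }

// Hypothetical opening notes; the real data came from the re-encoded MIDI tracks.
const melody: MelodyNote[] = [
  { pitch: 67, durationMs: 300 },
  { pitch: 64, durationMs: 300 },
  { pitch: 67, durationMs: 300 },
  { pitch: 72, durationMs: 600 },
];

let cursor = 0;

// Stand-in for the board's MIDI synth call.
function playNote(note: MelodyNote): void {
  console.log(`note on ${note.pitch} for ${note.durationMs}ms`);
}

// Called whenever the capacitive sensor registers a tap on the pudding.
function onTouch(): void {
  playNote(melody[cursor]);
  cursor = (cursor + 1) % melody.length; // wrap around and keep singing
}

// Simulate a few taps.
onTouch();
onTouch();
onTouch();
```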

The plinths, made up in my kitchen, housed an internal LED floodlight controlled by the MIDI sequencer for active visual feedback, while a small stereo amplifier provided a deep sound and a pleasing haptic response.

Hotline

JingleBells



Augmented Reality

This marker-based AR iOS app was created with Unity and Vuforia. It was developed as a prototype for a media innovation exhibition at the Sydney Opera House.

In collaboration with a 3D modeler, a static poster was transformed into a 3D scene: a virtual dragon flies into the frame, blowing smoke and flame, before flying off. It is one of a number of AR apps that I developed whilst at Spinifex.


demo


Acura Connected Living

This multitouch Air application features the full suite of Acura product information, presented in a rich and engaging interactive environment. Content is synchronised over the cloud, making updates effortless. I was involved in a recent addition to the application: a particle-effect attractor state controlled by Kinect cameras. The camera was mounted below the screen and monitored the movement of people nearby. Users' positions were translated into particle attractors, giving passers-by the sense that they could casually influence the graphics on the digital screens.

I used the open source in-spirit fluid solver library to create the visual effect, with a few modifications to its behaviour and some additional visual effects. The fluid solver renders to bitmap data and performed remarkably well. This was another reuse of an existing Air Kinect application that plugged in with minimal setup.
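The fluid simulation itself came from the in-spirit library, so the interesting custom piece was translating Kinect-tracked positions into forces on the fluid. The sketch below shows that mapping in TypeScript for illustration; the `TrackedUser` shape and the `addForce` signature are hypothetical stand-ins for the Kinect feed and the solver's force input, not the library's real API.

```typescript
// Position of a tracked person, normalised to 0..1 by the Kinect interpreter.
interface TrackedUser { id: number; x: number; y: number; }

// Minimal stand-in for the fluid solver's force input.
interface FluidField {
  addForce(x: number, y: number, dx: number, dy: number): void;
}

// Remember each user's previous position so their movement becomes a velocity.
const lastSeen = new Map<number, TrackedUser>();

function applyAttractors(users: TrackedUser[], fluid: FluidField): void {
  for (const user of users) {
    const prev = lastSeen.get(user.id) ?? user;
    // A person's motion since the last frame pushes the fluid around,
    // so walking past the screen drags the particles along with you.
    const dx = (user.x - prev.x) * 5;
    const dy = (user.y - prev.y) * 5;
    fluid.addForce(user.x, user.y, dx, dy);
    lastSeen.set(user.id, user);
  }
}
```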


Scion MegaWall

The Scion MegaWall was an application built for the 2014 auto shows and ran in LA, Detroit and New York. The application was built to integrate into a video wall via a show control system (Pandora), communicating with it through a set of custom JSON messages sent over TCP. The application could be configured on the fly to run in a number of different modes, all controlled via XML configuration. This allowed the video wall to seamlessly transition from brand video to the application and back.
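The message schema was specific to the show, but the transport was simply JSON over a TCP socket. A minimal TypeScript (Node) sketch of that link is below; the host, port and the `mode`/`cue` fields are hypothetical examples, not the real Pandora protocol, and newline-delimited framing is an assumption for the sake of the sketch.

```typescript
import * as net from "net";

// Hypothetical example messages; the real schema was project-specific.
interface ShowMessage { type: string; mode?: string; cue?: string; }

const socket = net.connect({ host: "10.0.0.20", port: 9000 }, () => {
  send({ type: "ready" });
});

function send(msg: ShowMessage): void {
  // One JSON document per line keeps framing trivial on both ends.
  socket.write(JSON.stringify(msg) + "\n");
}

let buffer = "";
socket.on("data", (chunk) => {
  buffer += chunk.toString("utf8");
  let newline: number;
  while ((newline = buffer.indexOf("\n")) >= 0) {
    const line = buffer.slice(0, newline);
    buffer = buffer.slice(newline + 1);
    handle(JSON.parse(line) as ShowMessage);
  }
});

function handle(msg: ShowMessage): void {
  if (msg.type === "setMode") {
    console.log(`show control asked for mode: ${msg.mode}`);
    // ...switch the wall into the requested configuration here
  }
}
```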

In addition to the playback sequencing, the wall took a number of live feeds including live weather, RSS, Instagram images and data from other on-site systems. One of these systems was a vending machine that was triggered when someone activated the machine via NFC. The application could also run uninterrupted regardless of internet dropouts.


Kinect Video Wall

This project was originally part of the Cisco House pavilion and was built upon for an installation in Cisco's Global Centre of Excellence in Songdo, South Korea. This video wall is still active and is the first digital feature that welcomes guests upon arrival to the centre.

The application is controlled via gestures using the Kinect depth camera. The main feature is a library of Cisco brand videos divided into three categories: a user selects a video category with their left hand and then chooses a video with their right hand. In addition to the video player, a user can also initiate one of two live video conferences simply by standing on a hotspot marked by a floor decal.

A second custom Kinect interpreter application was built to support the primary app. Kinect data is optimised and sent to the main application over UDP on the local network. This application has since been reused in many other Kinect-controlled applications.
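Because the interpreter is a separate process, skeleton data has to be small and cheap to move between the two apps. The TypeScript (Node dgram) sketch below illustrates the idea; the real interpreter wasn't written in TypeScript, and the reduced joint layout and port number are assumptions for illustration.

```typescript
import * as dgram from "dgram";

// A reduced skeleton: only the joints the wall actually needs.
interface HandFrame {
  userId: number;
  leftHand: { x: number; y: number };
  rightHand: { x: number; y: number };
}

const socket = dgram.createSocket("udp4");
const PORT = 7000;          // assumed port for the main application
const HOST = "127.0.0.1";   // both apps run on the local network

// Send one compact JSON datagram per Kinect frame; UDP keeps latency low,
// and a dropped frame is simply replaced by the next one.
function sendFrame(frame: HandFrame): void {
  const payload = Buffer.from(JSON.stringify(frame));
  socket.send(payload, PORT, HOST);
}

// Example frame, as if produced from the Kinect depth stream.
sendFrame({
  userId: 1,
  leftHand: { x: 0.31, y: 0.62 },
  rightHand: { x: 0.74, y: 0.55 },
});
```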


Toyota In-Car App

This is a native iOS application developed for the US auto shows. It features a number of product sales presentations and brand content media, and runs on iPad minis mounted inside the cars. A set of videos of sales staff was produced for each car, and the application seamlessly transitions between the videos, including an additional idle video running on a loop. For each vehicle, both English and Spanish language videos were produced.

The iOS application uses a CMS to retrieve content for each vehicle and provides rich user tracking data for each interaction. A problem with networking at large public events is that WiFi cannot be relied upon while the show is running. Consequently, the application has a synchronising mechanism: the iPad is configured at the start of the show and downloads the required content so it can function offline. Tracking data is then retained locally and only sent when there is a reliable connection.
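The app itself is native iOS, but the offline tracking pattern is worth sketching: interactions are appended to a local queue and only flushed when the network is actually usable. The TypeScript below is an illustrative sketch; the endpoint URL and the event fields are hypothetical.

```typescript
interface TrackingEvent {
  vehicle: string;
  action: string;
  timestamp: number;
}

// Hypothetical CMS endpoint for tracking uploads.
const TRACKING_URL = "https://example.com/api/tracking";

const queue: TrackingEvent[] = [];

// Every interaction is recorded locally first, never sent directly.
function track(vehicle: string, action: string): void {
  queue.push({ vehicle, action, timestamp: Date.now() });
}

// Periodically attempt to flush; if the show WiFi is down, keep the data.
async function flush(): Promise<void> {
  if (queue.length === 0) return;
  try {
    const res = await fetch(TRACKING_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(queue),
    });
    if (res.ok) queue.length = 0; // only clear once the upload succeeded
  } catch {
    // Offline or flaky connection: leave the queue intact and retry later.
  }
}

track("Corolla", "video_play");
setInterval(flush, 60_000);
```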

The application was published using an Enterprise licence, which enables deployment via WiFi outside of the App Store. The effect is that the iPads can be set up from scratch in a matter of minutes and then operate autonomously for the full duration of the event.


OohAhQR

This is a QR code scanner built in Objective-C. The app was an exercise in using Obj-C and used a number of libraries including ZXing, MapKit, Core Location and a QR encoder. I was also very disappointed with the quality of the QR code scanners available and wanted to contribute an alternative with a more pleasing design.

Click the image to view the app in the App Store.


RGBD Experiments

RGBD Toolkit is an application suite developed to produce 3D video by combining the Kinect depth camera with a DSLR or webcam. Effectively, it maps colour video frames onto a 3D point-cloud mesh. The software allows you to modify various effects such as the size and brightness of the point-cloud nodes, mesh lines, video frame and depth of field.

It is a fairly painful process to build a rig that couples the Kinect with a camera and then calibrate the two lenses; however, once the lenses are calibrated, capturing sequences is easy. The toolkit was built in openFrameworks and has calibration, capture and edit components.
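Under the hood, mapping colour onto depth is a standard pinhole-camera unprojection: each depth pixel becomes a 3D point, which is then projected into the calibrated colour camera to pick up its colour. The TypeScript below sketches the simplified maths; the intrinsics values are placeholders, and the real toolkit also applies the lens-to-lens extrinsic transform recovered during calibration.

```typescript
interface Intrinsics { fx: number; fy: number; cx: number; cy: number; }
interface Point3D { x: number; y: number; z: number; }

// Placeholder intrinsics; real values come from the calibration step.
const depthCam: Intrinsics = { fx: 570, fy: 570, cx: 320, cy: 240 };
const colorCam: Intrinsics = { fx: 1050, fy: 1050, cx: 640, cy: 360 };

// Back-project a depth pixel (u, v, depth in metres) to a 3D point.
function unproject(u: number, v: number, depth: number, cam: Intrinsics): Point3D {
  return {
    x: ((u - cam.cx) * depth) / cam.fx,
    y: ((v - cam.cy) * depth) / cam.fy,
    z: depth,
  };
}

// Project a 3D point into the colour image to find which pixel to sample.
// (The full pipeline first transforms the point by the depth-to-colour extrinsics.)
function projectToColor(p: Point3D, cam: Intrinsics): { u: number; v: number } {
  return {
    u: (p.x * cam.fx) / p.z + cam.cx,
    v: (p.y * cam.fy) / p.z + cam.cy,
  };
}

const point = unproject(400, 260, 1.8, depthCam);
console.log(projectToColor(point, colorCam));
```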

The software was later used by the motion graphics team in a Samsung Smart TV launch.


Lost Diggers

In February 2012 thousands of glass plate negatives were found in an attic in France. These images had been captured over 90 years earlier during the First World War and showed soldiers at rest in the village of Vignacourt during leave from the front. The Australian plates were bought by Kerry Stokes and donated to the Australian War Memorial.

This application was built in ActionScript and used the Starling framework. Images were stored locally; however, the application used web services to update and submit comments. A requirement of the project was that it had to run offline, so all successful URL requests were cached locally. Any user-generated comments were also stored locally and synced to the backend whenever an internet connection became active.
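The original app did this in ActionScript with Starling, but the offline pattern translates directly: read-through caching of successful requests, plus a local outbox of comments that drains whenever the connection returns. The TypeScript below is a sketch of those two pieces; the URLs and the comment shape are hypothetical.

```typescript
import { promises as fs } from "fs";

interface Comment { imageId: string; text: string; createdAt: number; }

// Hypothetical web service endpoint for comments.
const COMMENTS_URL = "https://example.com/api/comments";

// Read-through cache: return the last successful response when offline.
async function cachedGet(url: string, cacheFile: string): Promise<string> {
  try {
    const res = await fetch(url);
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const body = await res.text();
    await fs.writeFile(cacheFile, body, "utf8"); // remember the good response
    return body;
  } catch {
    return fs.readFile(cacheFile, "utf8"); // offline: fall back to the cache
  }
}

// Outbox of comments written while offline, synced when a connection appears.
const outbox: Comment[] = [];

function submitComment(comment: Comment): void {
  outbox.push(comment);
}

async function syncOutbox(): Promise<void> {
  while (outbox.length > 0) {
    try {
      const res = await fetch(COMMENTS_URL, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(outbox[0]),
      });
      if (!res.ok) return; // server rejected: try again later
      outbox.shift();      // sent successfully, remove from the outbox
    } catch {
      return;              // still offline: keep the outbox and retry later
    }
  }
}
```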

For me personally this project had significance, as my great-uncle is buried in the Vignacourt War Cemetery and may well have been one of those lost diggers. I later visited the town of Vignacourt, where the photographs were taken, to pay my respects.