How to Turn an Old Building into an Interactive Driving Range by Gabe Shaughnessy & Dan Cohen

For this guest tutorial we're joined by Gabe Shaughnessy and Dan Cohen of Lumenal Code for an in-depth look at how to create a well-executed one-off video event that involves substantial preproduction, from storyboarding through animation and fabrication to the final live performance.

Red Bull Murals is a project that pairs an athlete with an artist in a unique collaboration. Red Bull asked New Creatures to create a psychedelic, immersive experience for pro golfer Rickie Fowler in Washington, DC’s historic Uline Arena. New Creatures asked Lumenal Code to provide a story, artwork, and animations, and then to create the interactive projection-mapped targets and operate them during the event.

Part 1: Telling a story

We came up with a storyline using the Hero’s Journey archetype, because it is the foundation of so many myths. We looked for things that inspired us, that were psychedelic without being cliché, and that would make for a cohesive story. In the end, inspired by Jules Verne, Georges Méliès, and countless nautical illustrations, we decided the ocean’s mysterious qualities made the perfect setting, and we set about storyboarding the experience.

 

Initial storyboard from start to finish

We knew we wanted to use a combination of projection mapping and lighting to create an immersive storyline in the space, and we wanted Rickie to shoot at several targets that would progress him through the story.

We traveled to DC to visit the location and captured a bunch of measurements and photos of the space. Using these photos, and advice from golfers about distance, height and hole size, we devised a layout for the elements in the room and started sketching out what they would look like. We spent extra time on the silhouette of each target so we would have a room filled with interesting shapes.

 

Silhouette versions of each target

Part 2: Fabricating the projection surfaces

Once we settled on a design for the targets, and a final shape for each, we created scaled vector illustrations and sent them to a shop in DC to have them cut out of plywood and painted. Rather than paint them white, our technical director, Grant Davis, advised us to use 50% gray so we would have more depth in our shadows and better contrast overall.

Part 3: Illustrating and animating the story

After the vectors were sent off, we finished the digital illustrations so we had a base painting of each target to use in the animations. The illustrations were created in Photoshop, then imported into After Effects for animation. We used a combination of cel animation, the puppet tool, particle effects, Lux, Tsunami, and a handful of other plugins to create the animations. These were all exported in MPEG-4 format for sharing with the rest of the team.

Turtle head concept sketches

Turtle head digital drawing

Syncing audio to the animation:

We sent the finished animations to Anthony Olander, an audio producer here in Portland, Oregon. He dropped the animation clips into Ableton Live and came up with sounds to match each clip, then sent the audio files back to us.

Once we had the sound effects, we added them to our After Effects files and re-exported everything in the HAP Alpha codec so we could assemble it in VDMX.
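If you ever need to batch-convert existing renders rather than re-exporting from After Effects, here is a minimal sketch of the same conversion scripted with Python and ffmpeg. It assumes an ffmpeg build that includes the HAP encoder, and the folder names are placeholders; this is just one way to automate the step, not what we did on the project.

```python
# Batch re-encode rendered clips to HAP Alpha for playback in VDMX.
# A sketch assuming ffmpeg is installed with its HAP encoder enabled;
# the folder names below are hypothetical placeholders.
import pathlib
import subprocess

SRC = pathlib.Path("renders")       # hypothetical input folder of .mov files
DST = pathlib.Path("renders_hap")   # hypothetical output folder
DST.mkdir(exist_ok=True)

for clip in sorted(SRC.glob("*.mov")):
    out = DST / clip.name
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "hap",              # ffmpeg's HAP encoder
        "-format", "hap_alpha",     # keep the alpha channel
        str(out),
    ], check=True)
```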

Animating the Moon:

We worked with Aaron Rogosin to voice and animate the moon. After scripting out the moon’s lines, we filmed Aaron speaking them, then motion-tracked the movements of his face as he talked. We mapped the video to a high-res photo of the moon we got from NASA. Then we split the lines into single clips we could trigger with a TouchOSC soundboard and exported them as HAP Alpha.

 

Part 4: Setting up the software for mixing and mapping

Controlling a non-linear storyline in VDMX

We had a basic progression for the story, but we didn’t know how the actual event would go down – would Rickie hit each target on the first try, or would it take him all night? Because of this, we had to make the storyline flexible. VDMX was the perfect solution: it allowed us to set up scenes (presets) and trigger them with OSC signals. We used TouchOSC to build our control surfaces, with controls labeled clearly and made large enough that we couldn’t miss them. We set up two iPads: one controlled the scenes and the progression of the story; the other controlled just the moon. Each iPad used a different page of the same TouchOSC layout, which made updating and switching control around a lot easier.

Sliders and buttons from TouchOSC mapped to clips and playback controls in VDMX
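If you want to test these triggers without an iPad in hand, you can fake the TouchOSC messages from a laptop. Here is a minimal sketch using the python-osc library; the port and address pattern are assumptions, and should be changed to match VDMX's OSC preferences and your own TouchOSC layout.

```python
# Simulate a TouchOSC button press to test a VDMX scene trigger.
# A sketch using the python-osc library (pip install python-osc).
from pythonosc.udp_client import SimpleUDPClient

VDMX_IP = "127.0.0.1"   # machine running VDMX
VDMX_PORT = 1234        # hypothetical; check VDMX's OSC input port

client = SimpleUDPClient(VDMX_IP, VDMX_PORT)

# TouchOSC buttons send 1.0 on press and 0.0 on release,
# so we send both to mimic a full tap. The address is hypothetical.
client.send_message("/story/scene/3", 1.0)
client.send_message("/story/scene/3", 0.0)
```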

We did this project in May, before the new TouchOSC layout import tool was available in VDMX. If I were doing this today, I would use that tool instead of the method I show here, but I would still send the control surface button and slider values out via MIDI – I’ll explain why next.

Syncing VDMX with the GrandMA2 lighting controller: To sync with the GrandMA2 lighting controller we used MIDI notes. The VDMX control surface elements sent MIDI notes to the controller for the different triggers. The notes would trigger lighting cues that had been preprogrammed to match the projections.
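As an illustration of that MIDI bridge, here is a minimal sketch using the mido library. In the show the notes came straight out of VDMX's control surface, so this only stands in for that step; the port name and note number are assumptions.

```python
# Send a MIDI note to fire a preprogrammed lighting cue.
# A sketch using the mido library (pip install mido python-rtmidi);
# the port name and note number below are hypothetical and should
# match whatever MIDI interface feeds the lighting desk.
import time
import mido

PORT_NAME = "IAC Driver Bus 1"  # hypothetical macOS virtual MIDI bus

with mido.open_output(PORT_NAME) as port:
    # Note 60 is assumed to be mapped to a cue on the console.
    port.send(mido.Message("note_on", note=60, velocity=127))
    time.sleep(0.1)
    port.send(mido.Message("note_off", note=60))
```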

Controlling multiple computers in sync: To keep a second computer in sync, I set up an additional OSC output for each button on the VDMX control surface. This relayed each message to the next computer, where it triggered the matching control surface element in VDMX. The VDMX project files running on the two machines were nearly identical, but the OSC preferences for each had different input and output settings, which ensured a one-way relay from one computer to the next. The VDMX Comm Display plugin is invaluable for setting this up because it shows you all the OSC messages flying around.
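Here is a minimal sketch of that one-way relay pattern, written with the python-osc library: listen for the control-surface messages on one machine and forward each one verbatim to the next. In the show the relaying was done with VDMX's own OSC outputs, so this only mirrors the idea; the IP address and ports are assumptions.

```python
# One-way OSC relay: receive control-surface messages on this machine
# and forward each one unchanged to the next computer in the chain.
# A sketch using the python-osc library; IP and ports are hypothetical.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

LISTEN_PORT = 9000          # hypothetical local OSC input port
NEXT_IP = "192.168.1.102"   # hypothetical second computer
NEXT_PORT = 9001            # its VDMX OSC input port

relay = SimpleUDPClient(NEXT_IP, NEXT_PORT)

def forward(address, *args):
    """Pass every incoming message through unchanged."""
    relay.send_message(address, list(args))
    print(address, args)    # a poor man's Comm Display for debugging

dispatcher = Dispatcher()
dispatcher.set_default_handler(forward)

server = BlockingOSCUDPServer(("0.0.0.0", LISTEN_PORT), dispatcher)
server.serve_forever()
```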

Using Syphon and MadMapper: Each layer’s output was sent out via Syphon with the VDMX Syphon plugin. For each layer, I opened a separate instance of MadMapper and used the Syphon input. Greg and Grant showed me a helpful trick with MadMapper – you can install (and run) multiple instances of it; just install each copy in its own directory.

Mapping to the surfaces: Because Lumenal Code is based out of Portland, and the project was in DC, I had to do all the compositing in VDMX and MadMapper using photographs of the space. I used a photo of the room as a backdrop in MadMapper, using the technique shown here. Once we got in the space with the projectors on, one person stood next to the projection surface and we used a radio to communicate the adjustments to the mapping.

Part 5: The Big Day

We rehearsed with several other golfers in the week leading up to the actual event. None of them were able to get the ball into the targets, and we were starting to worry that we had made the challenges too hard. Rickie showed up just before sunset with no idea what was in the warehouse. He did a short interview in the parking lot, then came into a dark room with a foggy, dry-ice haze covering the floor. We had a crew of about twelve people on radio headsets, coordinating different elements of the experience, all hidden away in the darkness.

Fortunately, we had made the challenges just difficult enough that Rickie was able to hit the targets. It took him a few tries to hit each one, but after trying a couple of clubs and dialing in his shots, he was eventually able to nail each one and complete the story.

More photos and video at http://www.augmentedart.com/