Today we're joined by the fantastic Wiley Wiggins, who, in addition to working as an actor and animator in film (of Dazed and Confused, Waking Life, and Computer Chess fame) AND as an interactive designer (currently working on an adventure game called Thunderbeam), is also an amazing visualist. For this post Wiley has written up his history of getting into performing live visuals and how things have changed in just the last few years as technology has rapidly evolved.
(also check out Wiley's guest tutorial on using VDMX alongside Lumen)
When I first started doing projections it was the typical pre-rendered mishmash of After Effects and stock footage played off of DVD. Gradually the projects I worked on got more and more ambitious, incorporating MIDI cues from bands playing live, lighting control, live video feeds, and allowing audience members access to some of the controls. I prefer making DIY interactive installations and collaborating with other musicians and artists to being used as club wallpaper, and I'm always learning new things. Some of my favorite projects that I've worked on with VDMX were two large shows with The Octopus Project. The first was Hexadecagon in 2010, where we cobbled together a multi-projector show with duct tape, OSC over Ethernet, and some Matrox DualHead2Gos, back before MacBooks had Thunderbolt. We experienced all sorts of weird hiccups back then, like when we discovered that our videos were choking because the speaker cabinets were making our hard drive speeds drop and we had to switch to SSDs. Just five years ago we had to use tangled Quartz Composer compositions to trigger animations with alpha from MIDI signals, something VDMX can easily handle on its own now with modern hardware.
One of the things I love about VDMX is the ability it gives you to improvise. If something isn't feeling right, you can change gears on the fly. Videoinstrumentalism is so satisfying; even when I make messes, they tend to be interesting messes. The ability to experiment in real time and save triggerable presets is now deeply ingrained in the way I work creatively.
Also, a big shout-out to Dan Winckler, who helped out with the supplemental Max and Quartz Composer projects on Hexadecagon, and to The Octopus Project for their innumerable contributions; they are amazing motion graphics artists and learned VDMX right along with me on each project.
The last big project I did was called Shapes (and other shapes). It was a collaborative experiment with The Octopus Project and artist pal Katie Rose Pipkin. Randomly generated passages of a slowly translating text are printed out for a human actor to read, while projections mapped around the room and on a giant rotating cube illustrate something like an abstract creation myth. The band plays unseen except for occasional backlit shadows behind a screen.
The most ridiculous thing we tried to do was map video onto a rotating cube without any kind of Kinect/computer vision/fancy solution. At the base of the cube was a potentiometer and an Arduino hooked to a homemade extra-long USB cable made out of speaker wire. The Arduino sent MIDI to VDMX, which in turn animated a 3D cube in a Quartz Composer patch, angled and deformed in such a way that it looked like it was mapping onto the sides of the physical cube. It definitely wasn't perfect, but it was good enough! We also had a fun gimmick: whenever the text generator printed out "Alokwoim said:" the actor would press a PowerMate, which would put a vocal transformation on his voice, and an audio analysis plugin in VDMX would then animate a layer projecting on the cube, making it look like a face was talking. Unfortunately, since it only came up by random chance, I think it happened in only two of all the performances we did.
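For anyone who wants to try a similar trick, here's a minimal sketch of what the Arduino side of a rig like that can look like, assuming a potentiometer wired to pin A0, a hypothetical CC number, and a serial-to-MIDI bridge running on the laptop; the actual sketch used in the show may well have been different.

```cpp
// Minimal sketch (assumptions: pot on A0, CC #1 on channel 1, and a
// serial-to-MIDI bridge on the computer that hands the CC to VDMX).
// VDMX can then map that incoming CC to the rotation of the cube patch.

const int POT_PIN = A0;       // potentiometer wiper at the base of the cube
const byte CC_NUMBER = 1;     // hypothetical CC number mapped in VDMX
const byte MIDI_CHANNEL = 0;  // channel 1 (0-indexed in the status byte)

int lastValue = -1;

void setup() {
  Serial.begin(115200);       // baud rate expected by the serial-to-MIDI bridge
}

void loop() {
  // Scale the 10-bit analog reading (0-1023) down to MIDI's 7-bit range (0-127).
  int value = analogRead(POT_PIN) >> 3;

  // Only send when the reading changes, to avoid flooding the MIDI stream.
  if (value != lastValue) {
    Serial.write((byte)(0xB0 | MIDI_CHANNEL));  // Control Change status byte
    Serial.write(CC_NUMBER);
    Serial.write((byte)value);
    lastValue = value;
  }

  delay(10);                  // ~100 updates per second is plenty for rotation
}
```

On the software side, that control change just shows up in VDMX like any other MIDI slider, which is what lets a few dollars of speaker wire and a potentiometer stand in for a full computer vision setup.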
I wanted some cool rolling landscapes in part of the projections, so I asked my friend Fernando Ramallo if he would include a Syphon server in his and David Kanaga's beautiful art game Panoramical, which then ended up becoming the Panoramical Pro version. Everybody should check it out!