Earlier this week we caught this psycho-3D-feedback music video by WOLFSHIRT on Vimeo, described as “A visualization of a signal path through a computer,” and had to drop them an email asking for more information on how they put it all together and the inspiration behind it.
Watch the video and then see below for their reply!
Hey folks, we are WOLFSHIRT, an audio / video group out of Brooklyn, NY. The two of us work with various tools to create chaotic psychedelic pieces and real-time music videos. For our latest project, DATA, we wanted to build a strong framework for integrating 3D imagery (Unity), Syphon, VDMX, and Ableton Live with each other. To do this, we used Max as our primary medium for conversation between the different programs.
The end result of the video is essentially a mix of VDMX’s processing tools with a Syphon grab of Unity’s player. I’ll first explain what’s happening on Unity’s end. The ground plane is built with an infinite terrain generator, which allows our character to walk, run, and sprint forward endlessly. It has a small noise algorithm which creates some depressions in the surface level here and there, but for the most part, it remains fairly flat. On this terrain, we’re also generating 3D objects at random around the character. This is done between Max and Unity, with a quick patch set up to tell Unity to generate an object from an array each time we send it a signal. We also send signals through Max for audio reactivity. For example, on a big change in melody, we’ll have Ableton Live (our audio host) send a signal over Ethernet to VDMX to invert a color channel, or tell Unity to switch camera positions. This networking between the different programs, and also between separate laptops, is done via the OSC protocol, which allows Unity and VDMX to react quickly to the audio.
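For a sense of what those signals actually look like on the wire, here’s a minimal pure-Python sketch that packs an OSC message by hand and sends it over UDP, the way a Max patch or Ableton would fire a trigger at Unity. The address `/unity/spawn`, the port, and the integer argument are all hypothetical stand-ins, not the actual patch’s naming; the byte layout (null-terminated, 4-byte-padded address and type-tag strings, big-endian arguments) follows the OSC 1.0 spec.

```python
import socket
import struct

def osc_string(s: str) -> bytes:
    """Encode a string as OSC requires: ASCII, null-terminated,
    padded with NULs to a multiple of 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def osc_message(address: str, *args) -> bytes:
    """Build a single OSC message: address, type-tag string, arguments."""
    typetags = ","          # type-tag string always starts with a comma
    payload = b""
    for a in args:
        if isinstance(a, float):
            typetags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        elif isinstance(a, int):
            typetags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, str):
            typetags += "s"
            payload += osc_string(a)
    return osc_string(address) + osc_string(typetags) + payload

# Hypothetical trigger: tell Unity to spawn object index 3 from its array.
msg = osc_message("/unity/spawn", 3)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 9000))  # port is an assumption
sock.close()
```

Because OSC rides on UDP, a trigger like this arrives with effectively no latency on a local network, which is what makes the fast audio reactivity between separate laptops possible.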
Another big aesthetic element of the video is feedback looping, which we accomplish by connecting Syphon’s output back into its input. This gives us a trailing effect, which we modulate in real time for different aesthetics. In the left screenshot above, you can see what VDMX receives via Syphon from Unity — Syphon’s live capture of the Unity scene. Behind the terrain and character, we have a plane which we texture with the Syphon feedback. You can see the effect in different areas of the video, creating chaotic echoes of the character, the terrain, and the objects surrounding him. On the right, you can see a more finalized output, with the feedback integrated into the Unity scene, and VDMX texturing the skin on our character.
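The trailing effect from routing output back to input can be sketched in a few lines: each new frame is blended with a decayed copy of the previous output, so every frame carries fading echoes of everything before it. This toy version operates on a tiny list of pixel brightnesses rather than real Syphon textures, and the decay constant is just an illustrative assumption — in the actual setup that amount is what gets modulated in real time.

```python
def feedback_blend(frame, previous, decay=0.5):
    """Blend the incoming frame with the previous *output* frame.

    Because the previous value was itself a blend, each pixel
    accumulates a geometrically decaying trail of past frames.
    Brightness is clamped to 1.0 like a saturated video signal.
    """
    return [min(1.0, f + decay * p) for f, p in zip(frame, previous)]

# A bright pixel flashes once, then the input goes dark;
# the output keeps a fading trail of it.
frames = [[1.0, 0.0], [0.0, 0.0], [0.0, 0.0]]
out = [0.0, 0.0]
history = []
for frame in frames:
    out = feedback_blend(frame, out, decay=0.5)
    history.append(out)
# history: [[1.0, 0.0], [0.5, 0.0], [0.25, 0.0]]
```

Pushing the decay toward 1.0 makes the echoes pile up into the chaotic smears seen in the video; pulling it toward 0 snaps the image back to a clean live feed.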