Using control data from other applications, external hardware, and internal providers like LFOs or Audio Analysis is a major part of working with VDMX: every UI element can be controlled via MIDI, OSC, DMX, and other data sources. The procedure for doing so is consistent across all UI items: you add a receiver (which listens for incoming data) to the UI element you're working with using the UI Inspector.
Typically the range of incoming numbers is automatically translated by VDMX to cover the local minimum and maximum range of a slider UI item. However, in some situations you may want to override the default number mapping behavior using the settings in the sub-inspector panel for receivers. In this tutorial we'll look at some of the common cases you may run into and how to handle them.
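That default translation amounts to a simple linear remap of the incoming range onto the UI item's local range. A minimal sketch of the idea in Python (the function name and `invert` option are illustrative, not the VDMX API):

```python
def remap(value, in_min, in_max, out_min, out_max, invert=False):
    """Linearly remap an incoming control value (e.g. a MIDI CC, 0-127)
    onto a UI item's local range (e.g. a slider's 0.0-1.0)."""
    if in_max == in_min:
        return out_min
    t = (value - in_min) / (in_max - in_min)
    t = min(max(t, 0.0), 1.0)   # clamp out-of-range input
    if invert:
        t = 1.0 - t             # flip the direction of travel
    return out_min + t * (out_max - out_min)

# A MIDI CC value of 64 lands near the middle of a 0.0-1.0 slider:
middle = remap(64, 0, 127, 0.0, 1.0)
```

Overriding the defaults in the sub-inspector is essentially a matter of picking different values for those input and output ranges, or inverting the direction.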
Read More
An ISF, or “Interactive Shader Format,” file is a GLSL fragment shader (.fs) that includes a small blob of information describing the input controls (such as the slider, button, and color picker controls in VDMX) that the host application should provide for the user when the FX is loaded for use, as well as other metadata including the authorship, category, and a description.
In this two-part tutorial we'll cover the basics of applying ISF-based FX to layers in VDMX and how to install new example ISF files you may download from the Internet, followed by a quick introduction to creating your own image processing GLSL fragment shaders.
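To make the format concrete, here is a sketch of a minimal ISF FX file: the JSON comment block at the top declares the inputs the host turns into UI controls, and the GLSL body does the per-pixel work (the effect itself is just an illustrative brightness control):

```glsl
/*{
    "DESCRIPTION": "Dims the image by a user-controlled amount",
    "CREDIT": "example",
    "CATEGORIES": [ "Color Effect" ],
    "INPUTS": [
        { "NAME": "inputImage", "TYPE": "image" },
        { "NAME": "level", "TYPE": "float", "DEFAULT": 1.0, "MIN": 0.0, "MAX": 1.0 }
    ]
}*/

void main() {
    // IMG_THIS_PIXEL samples the input image at the current fragment
    vec4 srcPixel = IMG_THIS_PIXEL(inputImage);
    // "level" is declared in the JSON blob, so it shows up as a slider in VDMX
    gl_FragColor = vec4(srcPixel.rgb * level, srcPixel.a);
}
```

The `image`-type input named `inputImage` is what marks this file as an FX rather than a generator.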
Read More
An ISF, or “Interactive Shader Format,” file is a GLSL fragment shader that includes a small blob of information describing the input controls (such as the slider, button, and color picker controls in VDMX) that the host application should provide for the user when the generator is loaded for use, as well as other metadata including the authorship, category, and description.
In this two-part tutorial we'll cover the basics of using ISF generators within VDMX as sources for layers and how to install new example ISF files you may download from the Internet, followed by a quick introduction to creating your own GLSL fragment shaders.
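A generator looks much like an FX file, except that it declares no `image`-type input and instead synthesizes its pixels from scratch, typically using the built-in `isf_FragNormCoord` and `TIME` variables. A small illustrative sketch:

```glsl
/*{
    "DESCRIPTION": "Animated horizontal gradient",
    "CREDIT": "example",
    "CATEGORIES": [ "Generator" ],
    "INPUTS": [
        { "NAME": "speed", "TYPE": "float", "DEFAULT": 0.25, "MIN": 0.0, "MAX": 2.0 }
    ]
}*/

void main() {
    // isf_FragNormCoord gives normalized 0.0-1.0 coordinates for this fragment;
    // TIME is the number of seconds since the shader started rendering
    float phase = fract(isf_FragNormCoord.x + TIME * speed);
    gl_FragColor = vec4(phase, phase, phase, 1.0);
}
```

Because `speed` is declared in the JSON blob, VDMX will automatically publish a slider for it, which can in turn be driven by any of the usual data sources.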
Read More
Among the real-time video generator and FX formats supported by VDMX is an open source plugin type called FreeFrame, along with the newer GPU-based FFGL variant, which uses the graphics card for faster image processing. In this tutorial we'll look at how to install these third-party FreeFrame plugins to use with VDMX.
Read More
For part two of our video fundamentals series we'll be taking a more in-depth look at the four main types of video sources that you'll encounter in the world of VJing and video production.
Read More
In this guest tutorial we're joined by the Rockwell Group's LAB division, an interactive design team within a larger architecture firm that focuses on projects blending physical and virtual spaces.
For a recent projection mapping installation in NYC, one of the techniques used by the LAB was to apply a real-time video FX to a specific portion of one of the pre-rendered movies, so that part of the image was left unprocessed in the main output while another section was color shifted to match the lighting effects in the room. Today we'll show you how that was accomplished.
Read More
For this technique video tutorial we'll be looking at how to use VDMX to create a multi-camera video sampler setup with the ability to record movie clips from a live feed to be immediately remixed and saved for later editing. As movie clips are sampled they are automatically added to the bin page, where they can be triggered for output, making this simple example useful either on its own or added on to an existing project.
Read More
For this guest tutorial we're joined by recipient.cc, who give us a behind-the-scenes look at the techniques used in their recent projection mapping on the Pirelli Tower in Milan for Adidas Boost, taking us from pre-production to design to the final implementation.
Read More
For today's guest post we're joined by eatyourwork, who first introduced us to the possibilities of using the OhmRGB Slim alongside VDMX in a blog post a couple of years ago. Since then we've made a few basic templates for new video performers to get started with a simple VJ video mixer setup with the Ohm, but in this video tutorial Simas shows off the extent to which you can customize your layout and MIDI mapping when making your own video performance rig.
Read More
When getting a new MIDI controller to use with VDMX or other VJ / music making software, one of the most exciting parts is finding the best way to map the sliders and buttons to the controls you want to use during performance. Along with that comes devising new ways to configure your software video generators and FX to get the most out of the layout of your instrument.
In this set of technique tutorials we'll be looking at three new example VDMX setups we've come up with for the Livid Base that take advantage of the controller in a few different ways including its multi-color LEDs and pressure sensitive pads.
Read More
In this tutorial we'll take a closer look at using automatic BPM detection to sync the timing of visual events with music, recreating the core parts of the Waveclock Demo template that is included with VDMX.
The approach we'll take is to create a virtual video instrument in the form of a Quartz Composer composition and animate its interface controls with Step Sequencer and LFO plugins. Presets for patterns in each plugin can then be saved and switched to match the energy level of the music while VJing during a live set.
Read More
By enabling the “Waveclock” beat tracking feature in the VDMX Clock plugin, the music from a microphone or line input can be analyzed to automatically adjust the BPM and measure position, ensuring that the timing of changes in your video stays perfectly in sync with the bands and DJs you are working with.
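The arithmetic behind turning a detected BPM into a measure position is straightforward; here is a hedged sketch of the kind of value a clock can publish (the function name and 4/4 default are illustrative, not the plugin's internals):

```python
def beat_clock(elapsed_seconds, bpm, beats_per_measure=4):
    """Given a detected BPM, convert elapsed time into a running beat
    count and a 0.0-1.0 position within the current measure, the kind
    of value a clock plugin can publish to drive visual changes."""
    beats = elapsed_seconds * bpm / 60.0          # 60 seconds per minute
    measure_position = (beats % beats_per_measure) / beats_per_measure
    return beats, measure_position

# At 120 BPM, 3 seconds of audio = 6 beats, halfway through a 4/4 measure:
beats, pos = beat_clock(3.0, 120.0)
```

The hard part, of course, is the beat tracking itself; once the BPM and downbeat are known, driving step sequencers and LFOs from the measure position is just this kind of bookkeeping.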
Read More
Along with being able to receive real-time control values from MIDI and OSC based instruments, VDMX provides the ability to send the local state of interface items such as sliders and buttons back out to hardware controllers whose interfaces can update dynamically.
To speed up the setup of two-way talkback with devices that support this kind of workflow, each UI item in VDMX that is receiving from a MIDI or OSC source can be set to “echo” its state back to the connected hardware controller.
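For MIDI hardware, that echo boils down to converting the UI item's normalized state back into a standard three-byte message. A minimal sketch of building a Control Change message by hand (the function name is illustrative; real setups would hand these bytes to a MIDI library or driver):

```python
def echo_to_midi_cc(normalized_value, channel, cc_number):
    """Build the 3-byte MIDI Control Change message a host would send
    to 'echo' a slider's 0.0-1.0 state back to a hardware controller,
    e.g. to move a motorized fader or update an LED ring."""
    value = round(min(max(normalized_value, 0.0), 1.0) * 127)
    status = 0xB0 | (channel & 0x0F)   # 0xB0 = Control Change status byte
    return bytes([status, cc_number & 0x7F, value])

# Echo a half-open slider back out on channel 1, CC 7:
msg = echo_to_midi_cc(0.5, channel=0, cc_number=7)
```

OSC talkback works the same way conceptually, except the value travels as a float in an OSC message instead of being quantized to 0-127.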
Read More
One of the most useful sets of open source FX plugins for Quartz Composer is the v002 collection maintained by Vade and Bangnoise, now included as an optional separate package along with VDMX. It features the v002 optimized fast blurs, “film” image filters, analog / digital glitch, and the Rutt-Etra analog video synthesizer emulator, all QC-based FX ready to use in VDMX or your own QC compositions.
Read More
For this guest tutorial we are joined by Alejandro Crawford, the visualist for MGMT (among other bands), in which he'll show us one part of the setup he uses for creating his live visuals: connecting a scene rendered in the powerful 3D gaming engine Unity to VDMX, using Syphon to pass video back and forth between the two programs.
Read More
One of the data-sources available within VDMX for controlling playback, FX, and composition parameters, is the current playhead position of each movie playing on a layer. Like an LFO or audio analysis value, you can assign this to any slider, button, or other UI item by using the UI Inspector or from the right-click contextual menu.
In this tutorial the movie “normalized time” parameter (time as a percentage, ranged 0.0 to 1.0) will specifically be used to synchronize the playback of multiple movie files – this can be a useful technique for working with batches of clips that have the same duration, and high-end projects that involve powering more displays or projectors than can be connected to a single Mac.
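The conversion at the heart of this technique is tiny: scale the master movie's normalized time by the follower's duration to get an absolute playhead position. A sketch under that assumption (the function name is illustrative):

```python
def synced_playhead(normalized_time, target_duration):
    """Convert a master movie's normalized time (0.0-1.0) into an
    absolute playhead position, in seconds, for a second movie.
    When both clips share the same duration they stay frame-locked."""
    return min(max(normalized_time, 0.0), 1.0) * target_duration

# 25% of the way through the master maps to second 30 of a 120-second clip:
position = synced_playhead(0.25, 120.0)
```

This is also why same-duration batches of clips are the sweet spot: with mismatched durations, equal percentages no longer correspond to the same moment in each clip.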
Read More
It almost goes without saying these days that posting your work online is a great way to promote yourself as a VJ or creative coder and to make new contacts for future collaborations. Along with a studio or live mix of your visual work, sharing some of the original resources used in your process for other people to learn from is another way to make your mark on the community.
For this technique tutorial we'll be looking at recording a demo reel that shows off the different ways that your generative compositions can be used in a live setting by using different sets of control data to drive its parameters, such as time based LFOs, MIDI / OSC control, and audio analysis data-source providers. Once we've finished creating the sample movie, we'll also walk through how to share the files using the videopong.net website where they can be hosted, downloaded and remixed by other video artists for free.
Read More
For this quick technique tutorial we've made two basic Quartz Composer compositions using the “Detection” object that can be loaded into VDMX to perform basic face capture and replacement FX that can be connected in a variety of ways. You can also use these example patches as starting points for your own patches that perform more complex behaviors like tracking multiple faces within a single frame or publishing additional control information.
Read More
This question comes to us from the VIDVOX forums, and is most easily explained with a quick demonstration: the goal is to have a MIDI knob that makes the video more pixellated as it is turned left or right, but acts as a regular pass-through when set to its center point.
In this tutorial video we'll show off how to use an LFO plugin in VDMX to create a lookup curve for mapping a MIDI knob to a different range of values to drive our pixellate FX being applied to a layer.
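The lookup curve we want is V-shaped: zero at the center of the knob's travel and rising toward both extremes. A sketch of the mapping (the function name is illustrative; in VDMX the curve itself is drawn in the LFO plugin's waveform editor):

```python
def center_detent_curve(knob_value):
    """Map a 0.0-1.0 MIDI knob so the FX amount is zero at the
    center and increases toward either end: a V-shaped lookup
    curve for a 'pass-through at center' pixellate control."""
    return abs(knob_value - 0.5) * 2.0

# Center = pass-through; either extreme = full pixellation amount:
center_amount = center_detent_curve(0.5)
full_amount = center_detent_curve(1.0)
```

Feeding the knob's position into this curve, and the curve's output into the pixellate FX amount, gives the behavior described above without any special-casing in the FX itself.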
Read More
In this technique tutorial we'll focus on two different ways the idea of a DJ-style low, mid, high EQ control can be interpreted in the world of video as FX in VDMX: as a means to mask out or adjust the gain level on separate, discrete parts of a video stream for the purpose of blending video layers together.
The first example swaps the low, mid, and high bands for the individual RGB channels of the image, raising or lowering the intensity of each independently. The second qcFX applies a concept closer to a true 3-band equalizer, breaking the image into three sections based on the luma (brightness) level of each pixel instead of its frequency range.
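The luma-band version can be sketched per pixel as follows. This is a hedged illustration of the idea, not the qcFX's actual implementation: the equal-thirds band boundaries and hard band edges are assumptions for clarity (a real FX would likely crossfade between bands).

```python
def luma_band_gains(pixel_rgb, low_gain, mid_gain, high_gain):
    """Apply a separate gain to a pixel depending on which luma
    'band' it falls in: the video analogue of a DJ mixer's
    low/mid/high EQ, splitting by brightness instead of frequency."""
    r, g, b = pixel_rgb
    luma = 0.299 * r + 0.587 * g + 0.114 * b   # Rec. 601 luma weights
    if luma < 1.0 / 3.0:
        gain = low_gain        # shadows
    elif luma < 2.0 / 3.0:
        gain = mid_gain        # midtones
    else:
        gain = high_gain       # highlights
    return (r * gain, g * gain, b * gain)

# Kill the dark parts of the frame, keep mids and highlights untouched:
dark = luma_band_gains((0.1, 0.1, 0.1), 0.0, 1.0, 1.0)
```

The RGB-channel variant is even simpler: multiply each of the three channels by its own gain, with no luma classification at all.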
Read More