Exploring the new Blur Faces and Face Overlay FX in VDMX

Welcome to this tutorial, where we dive into face-specific effects in VDMX powered by Apple’s Vision SDK and CoreML. We’ll explore how to blur faces, create face overlays, and experiment with pixelation to build dynamic, real-time visuals.


1. Blur Faces

Function: Automatically detects and blurs faces.

Customization:

Adjust blur intensity and radius.

Crossfade to isolate the face or invert the mask.

Example: Perfect for anonymizing faces or adding a dreamy, surreal aesthetic to your visuals.

2. Face Overlay

Function: Duplicates and stacks faces onto other layers.

Usage:

Combine with live input or pre-recorded footage.

Adjust size, position, and blend modes for unique results.

3. Pixelate Faces (Beta)

Function: Pixelates detected faces.

Note: This effect is experimental and may glitch with multiple faces.

Potential Use: Add a retro, 8-bit aesthetic or obscure identities in a stylized way.

4. Creative Stacking and Modular Effects

Layer Count: Add unlimited layers to compound effects.

Experiment: Stack effects, tweak settings, and discover unique combinations.

Wrap-Up

VDMX’s modular approach lets you craft complex visual experiences, perfect for performances, installations, or experimental art. If you create something cool, tag the team on social media (Instagram / YouTube) or share your work in the forums!

Happy experimenting! 🚀

Eurorack & Live Coding Guest Tutorial with Sarah GHP!

For this guest tutorial we are joined by Sarah GHP for a deep-dive, behind-the-scenes look at her setup connecting a variety of different video worlds using window capture, Syphon, and digital-to-analog conversion. You can also read more about her creative process, how she got into feedback loops, and more in the Interview with Sarah GHP! post on our blog.

Watch through the video here:

And follow along below for additional notes and photos of the rig and how everything fits together!

How and why to connect your VJ app output to an analog synth

In my practice — whether performing visuals live or creating footage for an edited video — I pull together a number of variously processed layers, which I want to overlay and manipulate improvisationally.

Some things computers are great for, like making complex graphics or applying effects that are more accessible digitally, in terms of both device footprint and complexity. For other aspects — tactile improvisation, working with signals from modular musicians, video timbre — an analog synth is the better choice.

My performance chain aims to make both accessible at the same time in one system.

Within this setup, VDMX plays a keystone role: adding effects, routing signals throughout the system, making previously recorded footage available for remixing, and even recording footage. It can also help fill in for modules that one has not yet been able to buy, which is the focus of another tutorial on this site.

Here I walk through how my setup works as inspiration for one of your own.

List of gear

Computer & Software

I use an Apple M1 MacBook or sometimes an M2 MacBook. I livecode SVGs using La Habra, a ClojureScript + Electron framework I wrote, and sometimes use Signal Culture's applications, especially Framebuffer and Interstream. And of course, VDMX.

Video Signal Transformation

To transform the video signal from the HDMI that exits the laptop into an analog format accepted by the Eurorack setup, I use two Blackmagic boxes: HDMI to SDI and SDI to Analog, which can output composite (everything over one wire) or component (Y, Pb, Pr). Here and there I see a converter that will do HDMI to composite directly, but having two converters can be useful for flexibility. The biggest downside is that the two are powered separately, so I end up needing a six-plug strip.

It is also possible to skip all of these and point an analog camera at a monitor to get the video format you want, but in that case, you need a separate monitor.

Eurorack

This is my case on Modular Grid. The top row is the video row and the most important. In this example, I am focused on using the LZX TBC2 and LZX Memory Palace, plus the Batumi II as an LFO.

The LZX TBC2 can work as a mixer and a gradient generator, but in this setup it is mostly converting analog video signal to the 1V RGB standard used by LZX. It can be replaced with a Syntonie Entree. Likewise, the video manipulation modules can be replaced with any you specifically like to use.

Output, Monitoring & Recording

Finally, there is the output, the monitor, and the recorder. The monitor is a cheap back-up monitor for a car (this exact model is an example only). Usually the power supply needs to be sourced separately and I recommend the Blackmagic versions, especially if you travel, because they are robust and come with interchangeable plugs.

When performing without recording, the main output can be sent through any inexpensive Composite to HDMI converter. The one I use was a gift that I think came from Amazon. Some venues used to accept composite or S-Video directly, but these days more and more projectors only take HDMI or are only wired for HDMI, even if technically the projector accepts other signals.

When recording, I format the signal back into SDI through a Blackmagic Analog to SDI converter and then send it to a Blackmagic HyperDeck Studio HD Mini. This records on one of two SD cards and can send out HDMI to a projector.

Getting the hardware set up

The purpose of the hardware setup is to convert video signals from one format to another. (More detail about how this works and various setups can be found in an earlier post I made.)

Don’t forget the cables!

The general flow here is computer > HDMI to SDI > SDI to Analog > TBC2 > Memory Palace > various outputs.

Setting up the software

Software flowchart

Those are the wires outside the computer. Inside the computer, there is a set of more implicit wires, all pulled together by VDMX.

My visuals begin with La Habra, which I live code in ClojureScript in Atom. (Even though it is dead as a project, Atom hasn't broken yet, and I wrote a number of custom code-expansion macros for La Habra, so I still use it.)

These are displayed as an Electron app.

The Electron app is the input to Layer 1 in VDMX.

In the most minimal setup, I add the Movie Recorder to capture the improvisation and I use the Fullscreen setup and Preview windows to monitor and control the output to the synth. I have the Movie Recorder set to save files to the media bin so that if I do not want to record the entire performance, I can also use the Movie Recorder to save elements from earlier in the set to be layered into the set later.

One perk of this setup, of course, is that I can apply VDMX effects to the visuals before they go into the synth or, in more minimal setups, even directly into the projector.

Sometimes it is fun to use more extreme, overall effects like the VHS Glitch, Dilate, Displace, or Toon, to give a kind of texture that pure live-coded visuals cannot really provide. I used to struggle a bit with how adding these kinds of changes with just a few button clicks sat within live code as a practice, since it values creating live. But then I remembered that live code musicians use synth sounds and samples all the time, so I stopped worrying!

Beyond making things more fun with big effects, I use VDMX to coordinate input and output among Signal Culture applications, along with more practical effects that augment the capabilities of either the analog synth or another app.

So, for example, here Layer 1 takes in the raw La Habra visuals from Electron, pipes this out of Syphon into the Signal Culture Framebuffer, and then brings in the transformed visuals on Layer 3.

I also usually have the same La Habra visuals in Layer 5, so that if I apply effects to Layer 1 before passing it down the chain, Layer 5 can work as a bypass for clean live-coded work should I want it. The same effect can be achieved with an external mixer, but using VDMX means one less box to carry. It also gives access to many more blend modes, including wipes, which are not available in cheaper mixers.

Use the UI Inspector to assign keyboard or MIDI shortcuts to the Hide / Show button for each layer.

I pair the number keys with layer Show/Hide buttons to make it easy to toggle the view when I am playing.

In this setup, I am more likely to use effects that combine well with systems that work on luma keying, like the Motion Mask, or use VDMX to add in more planar motion with the Side Scroller and Flip. Very noisy effects, such as the VHS Glitch, are also quite enjoyable when passed into other applications because they usually cause things to misbehave in interesting ways, but even a simple delay combined with layers and weird blend modes can augment a base animation.

At this point, astute readers may wonder: why make feedback using a VDMX feedback effect, a Signal Culture app, multiple VDMX layers plus a delay, AND an analog synth like the Memory Palace? The answer is simple: each kind of feedback looks different, feels different, and reacts differently. By layering and contrasting feedback types, I feel like we are able to see the grains of various machines in relationship to one another, and for me that is endlessly interesting. (Sometimes I bring in short films from other synths that cannot be part of the live setup as well, and that is usually what goes in Layers 2 and 4.)

Layers 2 & 4 in VDMX

Where and how effects are applied of course also affects how they can be tweaked. When I define effects in VDMX that benefit from a changing signal, especially the Side Scroller and Flip, I use the inbuilt LFO setup. I usually have one slow and one fast LFO, and define a few custom waveforms to use in addition to sine and cosine waves.
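As a rough model of what those LFOs are doing, here is a small Python sketch (not VDMX code, just an illustration under stated assumptions): it samples a waveform at a phase in the 0-1 range, with 'ramp' and 'pulse' standing in for the kind of custom waveforms mentioned above.

```python
import math

def lfo(phase, shape="sine"):
    """Return an LFO value in the 0-1 range for a phase in 0-1.

    'sine' and 'cosine' mirror built-in waves; 'ramp' and 'pulse'
    are stand-ins for custom waveforms.
    """
    phase = phase % 1.0
    if shape == "sine":
        return 0.5 + 0.5 * math.sin(2 * math.pi * phase)
    if shape == "cosine":
        return 0.5 + 0.5 * math.cos(2 * math.pi * phase)
    if shape == "ramp":
        return phase
    if shape == "pulse":
        return 1.0 if phase < 0.25 else 0.0
    raise ValueError(f"unknown shape: {shape}")

def sample(t, rate_hz, shape="sine"):
    """Sample an LFO running at rate_hz at time t (in seconds)."""
    return lfo(t * rate_hz, shape)

# One slow and one fast LFO, as in the setup described above.
slow = sample(0.25, rate_hz=0.1)   # a 10-second cycle
fast = sample(0.25, rate_hz=2.0)   # a half-second cycle
```

The point of having both a slow and a fast oscillator is simply that the fast one advances its phase faster; everything else about the waveform lookup is shared.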

Final setup in VDMX

The choice between computer-generated signal and analog signal is mostly decided by where the effect I am modulating lives. For effects that are available both on the synth and in the computer, the biggest difference is that waveforms from the synthesizer are easier to modulate with other signals, but harder to make precise, than computer-based signals.

Setting up the synth

Now that we have set up the software to layer live computer based images and all the converter boxes to get that video into the Eurorack, the last step is setting up that case.

Synth flowchart

Mostly I work with the LZX Memory Palace, which is a frame store and effects module. It can do quite a lot: it has two primary modes, one based around feedback and one based around writing to a paint buffer, and can work with an external signal, internal masks, or a combination of both. In this case, I am working with external signal in feedback mode.

To get signal into the Memory Palace, it needs to be converted from the composite signal coming out of the Blackmagic SDI to Analog box into 1V RGB signals. For this, I use the LZX TBC2. It also works as a mixer and a gradient generator, but here I use it to convert signals. On the back, it distributes sync to the Memory Palace.

Memory Palace + Batumi

And this is where the last bit of the performance magic happens. The Memory Palace offers color adjustment functions, spatial adjustment functions, and feedback control functions, including thresholds for which brightnesses are keyed out and what is represented in the feedback, as well as the number of frames repeated in the feedback and key softness. To dynamically change these values, LZX provides inbuilt functions; for instance, the button at the bottom of the Y-axis shift control triggers a Y scroll, and the slider then controls the speed of the scroll. However, the shape of the wave is unchangeable.

That is where the CV inputs above come in. Here I have waves from the Batumi patched into the X position, and I can use the attenuator knobs above to let the signal through.

Once everything is humming away, the Memory Palace output needs to go into a monitor and whatever the main output is. In theory, the two composite outputs on the front of the Memory Palace can be used, but one is loose, so I use one and then run the RGB 1V outputs into the Syntonie VU007B. (A splitter cable or a mult would also work, but I already had the VU007B.)

One output goes into the monitor, a cheap back-up camera monitor. The other goes into the projector directly, or into a Blackmagic Analog to SDI box and then into the HyperDeck for recording, before being passed via HDMI to the projector.

While I use one big feedback module, LZX and Syntonie, as well as some smaller producers, make video modules that are smaller and do fewer things alone. These tend to be signal generators and signal combinators and, following the software to synth section of this tutorial, you can use any of them.

What It All Looks Like Together

Now that we've connected everything up, let's see what it looks like performed live!


Enjoyed this guest tutorial from Sarah GHP? Next up you can check out the Interview with Sarah GHP! post on our blog to see even more of her work!

Using VDMX as a Step Sequencer and LFO for Euroracks

One of the most fun aspects of using Eurorack setups is the ability to quickly reroute control data and sound between different modules. Conversely, one of the most limiting parts is how difficult it can be to quickly swap different modules in and out of your rack to change the kinds of control data and sound coming and going from your system. In this tutorial we will look at how the Step Sequencer and LFO plugins in VDMX can be used alongside Eurorack setups to provide a versatile approach to generating CV values.

As Eurorack modules are also often a significant investment of money, it can also sometimes be useful to use software tools like VDMX to simulate their abilities to determine if they are a good fit for your needs before purchasing.

Overview

This tutorial is broken into three main parts:

  1. Setting up our Eurorack to convert MIDI to CV.

  2. Setting up VDMX to send MIDI to the Eurorack.

  3. Configuring step sequencer and LFOs in VDMX to control parameters on our Eurorack.



Setting Up a Eurorack To Convert MIDI to CV

Univer Inter MIDI to CV and Tiptop Audio Buchla 258t Eurorack modules.

For this initial demonstration of MIDI to CV conversion we are using the Noise Engineering Univer Inter along with a Buchla & Tiptop Audio 258t Dual Oscillator module to generate tones.

The Univer Inter has 8 CV out ports along with a USB port that can be directly connected to a computer for receiving incoming MIDI. Within applications like Audio MIDI Setup and VDMX it appears as a standard MIDI output device option. It can also be configured to use a custom MIDI mapping as needed, and can be daisy-chained with a second module for another 8 outputs.

A variety of different modules are available for taking MIDI data in one form or another and converting it to CV. As always with Eurorack setups it is prudent to spend some time looking at all of the module options and picking the best for your specific needs.
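To make the conversion concrete, here is a back-of-the-envelope Python sketch of the scaling a MIDI-to-CV module performs. The 0-5 V default range is an assumption for illustration only; real modules differ, so check your converter's manual for its actual output range.

```python
def cc_to_cv(cc_value, cv_range=(0.0, 5.0)):
    """Map a 7-bit MIDI CC value (0-127) onto a control voltage.

    The 0-5 V default is an assumed range for illustration;
    consult the module's documentation for the real one.
    """
    if not 0 <= cc_value <= 127:
        raise ValueError("MIDI CC values are 7-bit: 0-127")
    lo, hi = cv_range
    return lo + (cc_value / 127.0) * (hi - lo)

# Full-scale CC maps to the top of the voltage range.
print(cc_to_cv(127))  # 5.0
```

Because CC values are only 7-bit, the voltage moves in 128 discrete steps; that quantization is one reason dedicated CV modules sometimes add slew or support higher-resolution MIDI messages.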


Setting Up VDMX To Send MIDI Output

Most user interface controls in VDMX, such as sliders and buttons, can be configured to directly send their current value as MIDI output using the “Send” tab of the “UI Inspector” window. When configuring VDMX to drive external devices such as a Eurorack, it is often useful to add a “Control Surface” plugin with a customized set of UI elements that represent each of our individual CV outputs.

Steps:

  1. Use the “Plugins” tab of the “Workspace Inspector” to add a “Control Surface“ plugin to the project.

  2. Use the sub-inspector to add one or more UI elements (sliders, buttons, pop-up menus, etc.) to the control surface interface.

  3. Click on each UI element in the Control Surface main window to inspect it. Use the “Send“ tab of the “UI Inspector” to configure the MIDI mapping and output device.


Configuring Step Sequencer and LFOs in VDMX To Control Eurorack Parameters

Now that our Eurorack is receiving MIDI from VDMX and converting it to CV we can begin to set up our Step Sequencer and LFO plugins to drive individual parameters of our synthesizer.

A VDMX setup with a two track step sequencer, an LFO, a clock plugin, and a control surface configured to send MIDI output.

Steps:

Right-click on sliders and buttons to assign data sources.

  1. Use the “Plugins” tab of the “Workspace Inspector” to add a “Step Sequencer“ plugin and an “LFO” plugin to the project.

  2. Use the sub-inspector to customize Step Sequencer / LFO configurations as needed.

  3. Right-click on output UI elements in the Control Surface, or use the UI Inspector, to route generated control data to our MIDI outputs.

  4. Patch the MIDI module CV output to synthesizer input parameters.

  5. Use the “Clock” plugin to adjust the overall BPM.

Once we’ve created our parameter routings on the Eurorack we can also optionally further customize our Control Surface with appropriate labels and display ranges, or continue to leave them as generic 0-1 values that are commonly re-patched on the fly.
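The steps above can be sketched in Python as a looping step sequencer clocked in BPM, with the generic 0-1 values only scaled to 7-bit MIDI at the moment of sending. This is an illustrative model, not VDMX's implementation:

```python
import itertools

def step_sequencer(steps, bpm, steps_per_beat=1):
    """Yield (time_in_seconds, value) pairs for a looping step sequence.

    Values stay in the generic 0-1 range; they are only scaled to
    the 7-bit MIDI range at the moment of sending.
    """
    step_dur = 60.0 / bpm / steps_per_beat
    for i in itertools.count():
        yield i * step_dur, steps[i % len(steps)]

def to_midi(value):
    """Scale a clamped 0-1 value to a 7-bit MIDI value (0-127)."""
    return round(max(0.0, min(1.0, value)) * 127)

# A four-step sequence at 120 BPM: each step lasts 0.5 seconds.
seq = step_sequencer([0.0, 0.5, 1.0, 0.25], bpm=120)
events = [next(seq) for _ in range(4)]
```

Keeping values normalized until the final `to_midi` call mirrors the "generic 0-1 values" convention described above, which makes re-patching the same sequence to different destinations trivial.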


Akai APC40 MK II 2-Channel VJ Mixer template for VDMX

Templates are a great way to get started with VDMX and with this template you can take an out of the box APC40 MKII and jump right in!

VDMX APC40 MK II Layout Template

A few things to note about the APC40 MK II before we get started.

The APC40 MK II has three internal MIDI mapping modes.

  • Generic Mode (Default)

  • Ableton Live Mode

  • Alternate Ableton Live Mode

To use this template correctly, you’ll need your APC40 MK II to be set to the default “stock” Generic Mode. More information about these modes can be found here (PDF) Bottom of Page 10.


When you first turn on the controller, it will default to the correct button mapping. To reset the template to all defaults, it is recommended that you hit this button when you start the template to eject all clips and set everything to its default.

This button ejects all media, clears all the FX and syncs the LFO view to the LFO slider. (Warning: You’ll lose FX in Layer A and B if you don’t save them as a new FX chain.)

Not all buttons are RGB. When clips are ready to be triggered in your media bin, the 40 RGB button grid will light up blue, then yellow when the clip is selected. You can customize these colors yourself in the media bin options:

Image found on page 10, Akai communication protocol manual.

There are two versions of this template. A blank version without FX and a starter version with one layer of FX presets.

Default setup.

This template is structured to be a 2-channel video mixer. Both video layer A and B flow to a Master output (Projector, TV, etc.) The cross fader blends between both layers and each layer has its own FX chain presets.

The Master output FX are turned on and off by the top 8 rotary knobs. The first vertical slider on the right side of the controller labeled “MASTER” controls the master opacity. If it is all the way down, your screen output will be black. You can change this later to preference or disable it entirely.

Selecting clips for both layers A and B:

Both layers use the same 40 RGB button grid to trigger clips. To switch between Layer A and B when selecting clips, use the first two buttons on the top right side of the grid under the label “SCENE LAUNCH”. They will light up when selected. The top one sets the destination for Layer A, the bottom for Layer B. The two buttons beneath that (green) are page up / page down buttons for moving through your media bin. They are also linked to your Audio Analysis Filter 3 and will flicker based on your computer’s mic peaking. Beneath that (yellow) is a random clip trigger.

To trigger the next clip in the media bin or move up and down the bin, look to the “BANK SELECT” 4-button arrow keys.

The rest of the buttons should be self-explanatory based on the image above, or you can read through the “User Notes” built into the template, which explain all of this and more.


Template Tip!

If you’re adding new FX to your A and B layer FX chains, make sure to save them as a preset by clicking the + in the top of the FX window. This will save your FX chain and you can assign it to a new FX preset button. You can always disable the FX layers MIDI triggers in your project until you build out the template more to your liking!


Here’s a brief overview video of this template:

Using the OSCQuery Helper and MIDI OSCQuery Helper tools with Max

One of the most powerful tools for working with MIDI and OSC control data is Max, which is widely known for its easy-to-use interface for “patching” and working with data streams. While Max does not yet support OSCQuery natively, it is a great example of how the free OSCQuery Helper and MIDI OSCQuery Helper tools can be used to publish OSC and MIDI parameters from Max patches so that they can be remotely accessed by other software like VDMX and the OSCQuery Browser.

In this set of tutorials we’ll look at the process for adding basic MIDI and OSC inputs in a simple Max patch and then creating a JSON file that describes the routings. Once those are prepared we can see how to access these parameters using other software in the OSCQuery ecosystem.
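For orientation, here is a Python sketch of what an OSCQuery-style namespace looks like as JSON. The node names (`/brightness`, `/invert`) are hypothetical, and the exact import file format the Helper apps expect may differ from this generic namespace shape, so treat it as an illustration of the idea rather than a template:

```python
import json

# A hypothetical namespace for a patch exposing one float slider and
# one toggle. Attribute names (FULL_PATH, CONTENTS, TYPE, VALUE,
# RANGE) follow the OSCQuery proposal.
namespace = {
    "FULL_PATH": "/",
    "CONTENTS": {
        "brightness": {
            "FULL_PATH": "/brightness",
            "TYPE": "f",                      # a single float argument
            "VALUE": [0.5],
            "RANGE": [{"MIN": 0.0, "MAX": 1.0}],
        },
        "invert": {
            "FULL_PATH": "/invert",
            "TYPE": "T",                      # a boolean-style toggle
        },
    },
}

print(json.dumps(namespace, indent=2))
```

The key idea is that the whole OSC address space is described as a tree of `CONTENTS` entries, so a client can discover every controllable parameter, its type, and its range without any manual setup.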

Read More

How to control an Ableton Live project from a web browser (and other software) in about a minute

The OSCQuery Protocol is a new specification that allows live performance tools to automatically communicate their parameters for rapid setup and improvisation between performers. Along with native support within VDMX, here at VIDVOX we have developed several useful utilities that make it possible for people to take advantage of these new capabilities with software that supports MIDI and OSC.

In this introductory tutorial we’ll be looking at how to use the free (and open source!) MIDI OSCQuery Helper utility to publish parameters from an Ableton Live project so that they can be accessed as browsable OSC parameters from other software such as VDMX. The MIDI OSCQuery Helper also includes its own built-in Interactive Web Interface which can be loaded in web browsers on desktops, laptops, smartphones and tablets to remotely control any published controls.

Read More

Receiving NDI® Audio/Video Streams in VDMX

The NDI® protocol from NewTek is a way to publish and receive audio / video streams over a network as a way to share live feeds between systems. From within VDMX, any number of video streams can be both output to the network and input from other applications.

In this tutorial we'll look at capturing NDI® video streams that are published from other applications on the network and using them as the source for a layer. More information can be found in the VDMX manual in the section on video inputs.

Read More

Using OSCQuery In The Control Surface Plugin

The Control Surface is one of the most versatile plugins in VDMX, making it possible to create sets of custom interface elements that can be used to control nearly any aspect of your workspace or send MIDI / OSC / DMX to other systems. The Control Surface plugin also has the ability to publish its list of parameters over a local area network using the OSCQuery protocol so that other software can remotely browse and control almost any aspect of your VDMX project.

In this video tutorial we'll be looking at the basics of using OSCQuery protocol from within the Control Surface, and three ways that those parameters can be accessed from software running on other devices: using our free OSCQuery Browser utility, another copy of VDMX and a web browser running on an iPhone.

Read More

Using the Built-In VDMX OSCQuery Browser

The OSCQuery protocol makes it easy for software that supports OSC to access each other's parameters for remote control, without a lengthy setup process. Within VDMX there are a few ways to take advantage of this, and in this tutorial we will focus on using the built-in OSCQuery Browser window, which can be used to browse the address space of a server, send OSC messages, and add OSC sending elements to our workspace.

The built-in OSCQuery Browser Window can be opened from the Window menu or by using the cmd+5 keyboard shortcut. From this panel you can access, browse and search the namespaces of other applications. For each of the listed OSC address destinations at the remote server you can:

  • Use the provided interface control to quickly send test data.

  • Drag the listed item onto UI elements in VDMX (such as sliders, buttons, and color wheels – this also works with the list of variables in the Cue List plugin inspector) to automatically configure OSC sending to the remote hosts.

Read More

Creating a 'Falling' audio level data-source using number FX chains in VDMX

Along with the basic controls of inverting values and applying basic math equations, number FX chains can be used to adjust the values of data-sources before they are applied to sliders. In this example the 'Fall' FX will be applied to an audio analysis level to create a falling style before being applied to a VU meter generator.
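As a rough model of what a 'Fall' style does, here is a Python sketch. The decay rate is an arbitrary illustration value, not VDMX's actual implementation:

```python
def fall(levels, fall_per_frame=0.05):
    """Apply a 'falling' style to a stream of 0-1 audio levels.

    The output jumps up instantly with the input, but can only drop
    by fall_per_frame each frame, like a VU-meter needle easing back
    down. The rate constant here is an arbitrary illustration value.
    """
    out, current = [], 0.0
    for level in levels:
        if level >= current:
            current = level                                 # rise instantly
        else:
            current = max(level, current - fall_per_frame)  # decay slowly
        out.append(round(current, 3))
    return out

# A single loud frame followed by silence decays gradually.
print(fall([1.0, 0.0, 0.0, 0.0]))  # [1.0, 0.95, 0.9, 0.85]
```

This asymmetry (instant attack, slow release) is what makes the resulting VU meter readable: peaks register immediately, but the display does not flicker back to zero between frames.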

Read More