Video Fundamentals Table of Contents:
- General Overview
- Video Sources
- Video FX
In the first part of this series we looked at a general overview of the start-to-finish workflow typical of VJing and live video production: source materials are processed by visual FX, composited with other images, and finally output as a projection or rendered to a file to share online.
For part two of this series we'll look in more depth at the four main types of video sources that you'll encounter.
Lesson 2.1 – Movies and Pre-Rendered Media
Movies, still images and other pre-rendered media types are stored in files on disk. Inside these files the image data is usually compressed by an encoder known as a codec so that it can be read back efficiently by a computer. Once a movie is created, the pixel dimensions of each of its video frames (called the movie resolution) become fixed.
Whether you are using VDMX or another VJ app or media server, you can expect to find standard playback options such as setting the rate, loop-mode, in / out points, and volume level of each movie file.
In most cases movie files can be loaded into your media bin or playlist window by dragging them from the Finder or by using the built-in media loading window.
Case Study: How to Turn an Old Building into an Interactive Driving Range by Gabe Shaughnessy & Dan Cohen.
Tips & Notes:
- Standard movie playback controls include settings such as rate, current time, and volume adjustment.
- For the best quality, use your original uncompressed source movie files when exporting to a new format such as PhotoJPEG or Hap.
- The Hap Alpha video codec can be used to include a transparency layer (alpha channel) in a movie file.
- Share and download royalty free video clips with other artists on websites such as videopong and archive.org.
- The most common resolution standards used for displaying video are 640x480 (Standard Definition), 1280x720 (HD 720p), and 1920x1080 (HD 1080p), so it is typically a good idea to create movie files at these sizes.
Assignment: Watch working with panoramic footage shot with the Kogeto dot
Lesson 2.2 – Live Inputs and Camera Feeds
Live inputs are cameras and other video feeds that are captured using special hardware connected to your computer. These can include webcams (built-in or connected over USB / FireWire), DV over FireWire, as well as higher end Thunderbolt and PCI based HDMI / SDI / HD-SDI capture devices. In most cases VDMX can work with as many live inputs as the hardware you are using can support.
Within VDMX the setup for video inputs takes place in the Vid In section of the Workspace Inspector window. Inputs can also be directly accessed from the Layer Source picker or by adding them to pages as clips to be triggered like any other media type.
In this demonstration we'll quickly set up VDMX as a switcher between a built-in web camera and an HD camera connected by a Blackmagic UltraStudio Mini Recorder.
Case Study: Creating Video Feedback Loops.
Tips & Notes:
- Use the Preview Window plugin to watch live camera feeds before they are visible in the main output.
- The Movie Recorder plugin can be used to capture samples from a camera feed for live remixing or later editing.
- Connected iOS devices are available as live inputs like any other web camera.
- Similar to movie files, the most common resolutions for video capture hardware are 640x480 (Standard Definition), 1280x720 (HD 720p), and 1920x1080 (HD 1080p), but you may encounter others with webcams.
Assignment: Making a multi-camera live video sampler.
Lesson 2.3 – Interactive Generators
Interactive generators such as ISF, Quartz Composer, Core Image, FreeFrameGL, and Flash are media file types that contain instructions for generating video streams in real time.
Many of these formats allow for custom input parameters that can be adjusted during a performance, with changes taking effect immediately in the generator patch. This differs from regular movie files, which use a dedicated set of controls for playback.
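As a concrete illustration, here is a minimal sketch of what an ISF generator source can look like: a GLSL fragment shader with a JSON header that declares the custom input parameters. The specific names used here (`mixAmount`, `topColor`) are our own placeholders, not taken from the lesson; the host app automatically exposes each entry in the INPUTS array as an adjustable control and passes it to the shader as a uniform.

```glsl
/*{
    "DESCRIPTION": "Simple vertical gradient generator",
    "CATEGORIES": ["Generator"],
    "INPUTS": [
        {
            "NAME": "mixAmount",
            "TYPE": "float",
            "DEFAULT": 0.5,
            "MIN": 0.0,
            "MAX": 1.0
        },
        {
            "NAME": "topColor",
            "TYPE": "color",
            "DEFAULT": [1.0, 0.0, 0.5, 1.0]
        }
    ]
}*/

void main() {
    // isf_FragNormCoord is supplied by the ISF host:
    // (0,0) at the bottom-left of the frame, (1,1) at the top-right
    vec2 uv = isf_FragNormCoord;

    // Fade from black at the bottom of the frame up to topColor,
    // scaled by the mixAmount slider
    vec4 gradient = topColor * uv.y * mixAmount;
    gl_FragColor = vec4(gradient.rgb, 1.0);
}
```

When a file like this is loaded as a source, its published inputs show up in the UI as a slider and a color picker that can be adjusted live, in contrast to the fixed playback controls of a movie file.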
In this example we'll be using Quartz Composer, a developer tool from Apple that can be used on its own, as well as in a variety of ways within host applications.
Case Study: The Quartz Composer Valentine's Day Example.
Tips & Notes:
- Compositions can have custom controls that change the output of the generator.
- Preview windows in VDMX can pass mouse clicks into Quartz Composer compositions to interact with sprites.
Assignment: Minuek's introduction to making Quartz Composer sources.
Challenge: Make a custom text generator composition for VDMX.
Lesson 2.4 – Syphon and Window Capture
Syphon is a standard protocol that lets video environments on the Mac, such as VDMX, Jitter, and Processing, efficiently share image streams between applications. During a performance, using Syphon sources is much like using live inputs, except that the video frames are provided by other software instead of camera feeds. It can also be useful to send the output from VDMX to other tools for specialized mapping or display.
To receive Syphon feeds within VDMX, use the pop-up menu in the Layer Source Controls panel, or add them to the media bin pages as clips that can be triggered like any other media type.
Within VDMX any number of layers and groups can be sent to other applications with the Syphon Output plugin.
For this example we'll demonstrate VDMX both receiving (as a client) and publishing (as a server) Syphon feeds, alongside the two basic Syphon sample applications.
Case Study: Connecting Unity to VDMX by Syphon.
Tips & Notes:
- Read more about Syphon and other supported applications at http://syphon.v002.info/
- To send the 'Main Output / Canvas' from VDMX to other applications by Syphon, disable the 'Skip Canvas Rendering' option in the VDMX preferences.
- Syphon feeds can include alpha channels for layer transparency.
- Use the 'Window Capture' option for applications that do not support Syphon natively.
Assignment: Use the free ProjectMilkSyphon sound visualizer as an input for VDMX.
Challenge: Using the VDMX Window Capture with WebGL in Google Chrome.