Working again with Rob Sinclair and Nev Bull, my brief was to create realtime, audio-reactive visuals using only retro, modular or analog hardware. No custom software, no pre-rendered content, just real hardware patched and played live, just as the band make their music on stage. As far as I'm aware, none of the systems we used had ever been run in real time on a large-scale project like this before.
I used a modular video synth based around the LZX Industries Visual Cortex unit to create audio-reactive analog video designs. The rack is pictured below; 'patches' had to be saved by drawing on a printout of the unit fascias to record what connects to what, and which settings to use. Also in the rack was a 3trinsrgb+1c analog video synth, which I used on several songs.
Also used were two prototype Ming Micros, 8-bit digital video synthesisers that create imagery and animations reminiscent of the original Atari 2600 games console, with similar limitations. I controlled these via MIDI, using a custom VDMX layout to sequence and control their parameters. Content was designed in a pixel art package, then run through a supplied Processing sketch that converts the PNG images into compatible text files, which are read by the Ming.
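The actual Ming Micro text format and the supplied Processing sketch aren't reproduced here, but the idea of the conversion step can be sketched roughly: quantize each pixel to a small fixed palette and emit one character per pixel, one text line per image row. Everything below (the palette, the index-per-character format) is my own assumption for illustration, not the Ming's real format.

```python
# Hypothetical sketch of the PNG-to-text conversion step: quantize each
# pixel to the nearest entry in a small fixed palette, then emit one
# palette-index character per pixel so each image row becomes a text line.
# (The real Ming Micro format and palette are not documented here.)

PALETTE = [  # assumed 4-colour, Atari-2600-flavoured palette (RGB)
    (0, 0, 0),        # 0: black
    (255, 255, 255),  # 1: white
    (200, 72, 72),    # 2: red
    (84, 92, 214),    # 3: blue
]

def nearest_index(rgb):
    """Index of the palette colour closest to rgb (squared distance)."""
    return min(range(len(PALETTE)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(PALETTE[i], rgb)))

def image_to_text(pixels):
    """Convert a row-major grid of RGB tuples into lines of palette indices."""
    return "\n".join("".join(str(nearest_index(px)) for px in row)
                     for row in pixels)

# A tiny 2x4 test image: black/white top row, red/blue bottom row.
demo = [
    [(10, 10, 10), (250, 250, 250), (0, 0, 0), (255, 255, 255)],
    [(190, 70, 70), (80, 90, 210), (200, 72, 72), (84, 92, 214)],
]
print(image_to_text(demo))
# → 0101
#   2323
```

The real sketch would decode an actual PNG (Processing's `loadImage` handles that directly), but the quantize-and-serialise step is the heart of it.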
Finally, I found some nice old oscilloscopes and rendered pre-defined audio to mix in with the band's audio, creating interesting shapes and patterns with the scope in XY mode. This was captured live during the performance using a fisheye camera and put up on the LED wall.
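In XY mode the scope plots the left channel on the horizontal axis and the right on the vertical, so two sine waves at a simple frequency ratio trace a Lissajous figure. As a minimal sketch of how such pre-rendered audio could be generated (the frequencies, phase and filename here are my own assumptions, not the actual show files):

```python
import math
import struct
import wave

# Render a stereo WAV where left = sin(fx) drives X and right = sin(fy)
# drives Y on a scope in XY mode. A 2:3 frequency ratio with a 90-degree
# phase offset traces a classic Lissajous figure.
RATE = 44100
SECONDS = 2
FX, FY = 220.0, 330.0   # assumed 2:3 ratio
PHASE = math.pi / 2     # assumed phase offset; changing it reshapes the figure

def lissajous_samples(n, rate=RATE):
    """Interleaved stereo 16-bit sample pairs (left, right)."""
    out = []
    for i in range(n):
        t = i / rate
        left = math.sin(2 * math.pi * FX * t)
        right = math.sin(2 * math.pi * FY * t + PHASE)
        out.append((int(left * 32767), int(right * 32767)))
    return out

samples = lissajous_samples(RATE * SECONDS)
with wave.open("lissajous.wav", "wb") as w:
    w.setnchannels(2)   # stereo: X on left, Y on right
    w.setsampwidth(2)   # 16-bit
    w.setframerate(RATE)
    w.writeframes(b"".join(struct.pack("<hh", l, r) for l, r in samples))
```

Feed the resulting file into the scope's X and Y inputs and the figure appears; mixing in the live band audio perturbs it in time with the music.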
First four photos courtesy of Faber AV.