On February 17th, 2018, I teamed up with VJ Projectile Objects to make live visuals for Cosmic Sound’s event Altared I. We worked on the same visuals simultaneously: I live coded abstract shapes to the sound of the music, and he took that as an input and remixed it through VDMX with other visuals he had made or that were already loaded. I found this really beneficial to my flow. I didn’t need to worry about a crash and people yelling at me (yes, that has happened). If I wrote a bum line of code, Neil could transition to some of the graphics in the VJ software while I got back on my feet. His writeup is much more detailed in terms of the hardware setup.
J Butler made a great video of his whole set:
This time I decided to try out KodeLife from hexler.net. It’s new software and doesn’t have much documentation yet, but that wasn’t a hindrance because it’s extremely intuitive. Much like Veda, it passes in the input audio as volume and spectrum, along with mouse, time, textures, etc. It has an added bonus over any other live coding software I’ve used: it accepts Syphon (if you’re on Mac) / Spout (if you’re on Windows) as an input. It took a bit of playing around to find, but it was under Shader Stage -> Properties -> Add Parameter -> Constant -> Shared Image. There are a lot of other features I have yet to explore, like the shader passes.
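To give a feel for what live coding in KodeLife looks like, here’s a minimal fragment shader that reacts to the music. The uniform names (`time`, `resolution`, `spectrum`) follow KodeLife’s default template at the time of writing; if your template differs, check the bindings under Shader Stage -> Properties.

```glsl
#version 150

// Uniform names are assumptions based on KodeLife's default template.
uniform float time;       // seconds since the program started
uniform vec2 resolution;  // output size in pixels
uniform vec3 spectrum;    // audio analysis bands: x = low, y = mid, z = high

out vec4 fragColor;

void main() {
    // Center the pixel coordinate and correct for aspect ratio.
    vec2 uv = (gl_FragCoord.xy - 0.5 * resolution) / resolution.y;

    // A circle whose radius pulses with the low (bass) band.
    float radius = 0.25 + 0.2 * spectrum.x;
    float d = length(uv) - radius;

    // Soft edge, with the color cycling over time.
    float glow = smoothstep(0.02, 0.0, d);
    vec3 color = glow * (0.5 + 0.5 * cos(time + vec3(0.0, 2.0, 4.0)));

    fragColor = vec4(color, 1.0);
}
```

Swap `spectrum.x` for `.y` or `.z` to make the shape react to mids or highs instead.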
KodeLife can also enable a Syphon server in its preferences, which is how we got my visuals to Neil’s MacBook Pro, with MadMapper handling the routing. I ran KodeLife and MadMapper on an ASUS ROG Strix GL702VS; Neil grabbed my output via MadMapper on his computer and ported it into VDMX, then that signal went back out to my machine through MadMapper, and my Windows laptop output the final projection. His writeup describes the setup much better.