Welcome to the documentation page for the development of my current project.
All the photos are from March 30th at Spirit.
In this iteration I live-edited a shader that remixed a video feedback loop created by pointing a webcam at the projection.
In the next iteration I will implement a system that can support live creation, not just live editing. This will make for a much simpler and maybe less interesting visualization, but for my research, and to keep in the spirit of live coding, it needs to be done.
I also want to tone down the feedback loop, lessening the abstraction of the input data from the webcam. For the first two performers, the lighting was perfect, and the visualizations mirrored their movements and colors; unfortunately, I don’t have any footage of that part of the night, only two photos. For VIA App’s set, something went wrong and no color was being picked up from the webcam, so I couldn’t use much of the webcam’s data. Instead I let it run, which created a black-and-white static that I could manipulate.
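To illustrate what "toning down" the feedback loop means, here is a minimal sketch of the kind of blend involved. This is an illustrative assumption, not the shader I actually performed with: the function name `feedback_mix` and the linear blend are mine, and real frames would be full camera images rather than toy arrays. The idea is that a high feedback weight lets the loop dominate (heavy abstraction), while a low weight keeps the raw webcam data legible.

```python
import numpy as np

def feedback_mix(camera: np.ndarray, previous: np.ndarray, feedback: float) -> np.ndarray:
    """Blend the live webcam frame with the previous output frame.

    `feedback` near 1.0 lets the loop dominate the image (more abstraction);
    near 0.0 the webcam input stays recognizable. Hypothetical sketch only.
    """
    return (1.0 - feedback) * camera + feedback * previous

# Toy one-pixel "frames": the camera sees mid grey, the loop has drifted to white.
camera = np.array([0.5])
previous = np.array([1.0])

abstract = feedback_mix(camera, previous, feedback=0.9)  # loop-dominated output
legible = feedback_mix(camera, previous, feedback=0.2)   # webcam-dominated output
```

Lowering the feedback weight is the "toning down" I have in mind: the output tracks the performers' movements and colors more directly instead of dissolving into the loop.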
In this autonomous interactive installation, I studied the subtleties of how it responds to the audience.