Inspired by Daniel Rozin and his many curious mirror creations, but limited by hardware, I set myself the goal of creating a virtual mirror using just a webcam. I wanted to use this visualization as the base for a music video for a friend, so it includes bass beat recognition that alters the beads. I chose this project because I like a change of view: seeing the same thing, just a bit differently.
This visualisation is created in TouchDesigner, a program for building generative art from prefabricated nodes that interact with each other and can be further altered with Python expressions. This interface lets me easily create the visualisation itself, as well as a UI with sliders that change the most important variables, and a recording plugin that captures the output.
This part of the program (below) represents the two modes in which I use the webcam input. The upper branch (the most used) displays the webcam through a matrix of beads (either balls or cubes) by turning the video into pure luminance values. The values of the first node in this branch can be changed through a slider, which either makes the video pure black and white or adds gray values for a smoother projection.
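The upper branch's per-pixel logic can be sketched in plain Python. This is a hypothetical illustration, not the actual TouchDesigner node network: `bead_value` and its `smoothness` parameter are my names for the effect the slider has, and the luminance weights are the standard Rec. 709 coefficients.

```python
def luminance(r, g, b):
    """Rec. 709 luma approximation for an RGB pixel with channels in [0, 1]."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def bead_value(r, g, b, smoothness=0.0, threshold=0.5):
    """Brightness driving one bead.

    smoothness=0 snaps every pixel to pure black or white (hard threshold);
    smoothness=1 keeps the full grayscale; values in between blend the two,
    mimicking the slider described above.
    """
    lum = luminance(r, g, b)
    hard = 1.0 if lum >= threshold else 0.0
    return (1.0 - smoothness) * hard + smoothness * lum

# With the slider at 0, a bright pixel snaps to full white:
print(bead_value(0.9, 0.9, 0.9, smoothness=0.0))  # 1.0
```

Driving each bead's size or displacement from this single value is what turns the webcam image into a Rozin-style mirror.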
The lower branch represents a motion-detecting group that combines the current frame of the video with the last 20 frames and, by subtraction, displays only the change between them. This is not used in the final visualization but can still be accessed through a switch; I left it in for future work.
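The idea behind the lower branch can be sketched as follows. This is a simplified stand-in for the node group, assuming grayscale frames stored as flat lists of floats; the `MotionDetector` class and its `history` parameter are my own naming, with the 20-frame window taken from the description above.

```python
from collections import deque

class MotionDetector:
    """Keeps a rolling buffer of past frames and returns, per pixel,
    the absolute difference between the current frame and the buffer's
    average, so only moving regions survive."""

    def __init__(self, history=20):
        self.frames = deque(maxlen=history)  # rolling window of past frames

    def process(self, frame):
        if self.frames:
            # per-pixel average of the stored frames
            avg = [sum(px) / len(self.frames) for px in zip(*self.frames)]
            diff = [abs(c - a) for c, a in zip(frame, avg)]
        else:
            diff = [0.0] * len(frame)  # no history yet: nothing has "moved"
        self.frames.append(frame)
        return diff

det = MotionDetector(history=20)
det.process([0.2, 0.2, 0.2])          # first frame: output is all zeros
moved = det.process([0.2, 0.9, 0.2])  # middle pixel changed
print(moved)  # roughly [0.0, 0.7, 0.0], up to float rounding
```

Feeding this difference image into the bead matrix instead of the raw luminance would make the mirror respond only to movement, which is why the branch is worth keeping behind the switch.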