8Q SCHIZOPHONIA


NOTE: THIS VIDEO CONTAINS STROBING VISION

For my project I wanted to explore soundscapes for abstract videos. In particular I wanted to find a way to create sounds from the video image through the use of algorithms.

I went searching on the internet for a script to use in After Effects and found this expression by Todd Kopriva:

targetLayer = thisComp.layer(thisLayer.index + 1);
samplePoint = targetLayer.effect("Point Control")("Point");
sampleRadius = [1, 1];
// sampleImage returns channel values in the range 0-1; scale to 8 bpc
sampledColor_8bpc = 255 * targetLayer.sampleImage(samplePoint, sampleRadius);
R = Math.round(sampledColor_8bpc[0]);
G = Math.round(sampledColor_8bpc[1]);
B = Math.round(sampledColor_8bpc[2]);
A = Math.round(sampledColor_8bpc[3]);
outputString = "R: " + R + " G: " + G + " B: " + B + " A: " + A;

This expression uses point controls placed over the video to read the RGBA values of the pixels beneath them, and feeds the output into a text generator. I have reworked it so the values generate audio instead. By doing this I have replaced the original soundscape with an algorithmically generated alternative. The sound is still linked to the scene, but where the original audio was associated with it only by being recorded independently at the same time and place, the picture now has a direct link to the audio because it is its creator. You could say the video plays the audio.
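The rework can be sketched in the same expression language: instead of building a text string, the expression returns a single number that can drive an audio-generating property. This is an illustrative sketch, not my exact setup; the use of the red channel and the layer arrangement here are assumptions.

```javascript
// Sketch (assumptions: the video layer sits directly below this layer
// and carries a "Point Control" effect marking the sample point).
targetLayer = thisComp.layer(thisLayer.index + 1);
samplePoint = targetLayer.effect("Point Control")("Point");
sampledColor_8bpc = 255 * targetLayer.sampleImage(samplePoint, [1, 1]);
// Return the red channel (0-255) instead of an output string, so the
// property this expression sits on is driven directly by the picture.
Math.round(sampledColor_8bpc[0]);
```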

I tested the script on a sample video I had lying around (see Experiments with Sound and Video below) and found that I could attach it to the frequency parameter of the Tone effect. This set the frequency to the value of the colour sampler's output, in this case somewhere between 0 and 255. I then rescaled that value to a range of 20 – 15,000, interpreted as Hertz.

When I got close to what I wanted, I swapped the stand-in video for the final video and got to work on fine-tuning. I added delay to some of the tracks and reverb to others. After Effects struggled with some of the early compositions, so I broke them down into simpler tasks and rendered them individually.