Inception

To experiment with using the Kinect in alternative ways, I looked at what the various video outputs could be used for. Online, I found a YouTube tutorial explaining how to generate MIDI notes from the luminosity of a video. I built this myself and adapted it to work with the Kinect's camera output. These notes then cause the glitch effects placed on the video feed to react, changing the generated video, which acts as a feedback loop that alters the notes even further.
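The luminosity-to-note idea can be sketched outside Max as a simple mapping: average a frame's brightness, then bucket that value into a scale. This is a minimal illustration of the concept, not the patch itself; the scale, ranges, and function name are all assumptions.

```python
def luminosity_to_midi(frame, scale=(48, 50, 52, 55, 57, 60)):
    """Map the mean luminosity of a greyscale frame to a MIDI note.

    `frame` is a flat list of pixel values (0-255). The pentatonic-ish
    `scale` here is illustrative, not taken from the actual patch.
    """
    mean = sum(frame) / len(frame)        # average brightness, 0-255
    index = int(mean / 256 * len(scale))  # bucket into a scale degree
    index = min(index, len(scale) - 1)    # clamp the top edge
    return scale[index]
```

A darker frame lands on a lower scale degree and a brighter frame on a higher one, which is what makes the glitch feedback loop change the notes as the video changes.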

The drums are also triggered by the video: each time the note generator outputs a MIDI note, it sends a bang to a "random" object on each individual drum. That value is then compared to another constantly changing random number, creating a fully generative drum machine.
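The random-versus-random gate described above can be sketched as follows. On each incoming "bang", two independent random values are drawn and compared, and the drum fires only when the comparison succeeds; the 0-127 range and the greater-than comparison are assumptions standing in for the patch's `random` objects.

```python
import random

def drum_bang(value_range=127):
    """Decide whether a drum fires on this bang.

    Draws a fresh random value and compares it to a second,
    independently drawn random number, mimicking one `random`
    object banged against another constantly changing one.
    """
    roll = random.randint(0, value_range)
    gate = random.randint(0, value_range)
    return roll > gate  # True -> trigger this drum's sample
```

Because both sides of the comparison change on every bang, the pattern never settles into a fixed loop, which is what makes the drum machine generative rather than sequenced.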

The left hand plays a harp-like synth under full user control: the X axis sets the interval between successive notes, the Y axis plays different pitches, and the Z axis controls the velocity of each note.
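The three-axis mapping might be sketched like this, with each hand coordinate normalised to 0.0-1.0. The specific ranges (semitone intervals 1-12, MIDI notes 36-84, velocity 0-127) are illustrative assumptions, not values from the patch.

```python
def harp_note(x, y, z):
    """Sketch of the left-hand mapping for the harp-like synth.

    x -> interval between successive notes (semitones)
    y -> pitch (MIDI note number)
    z -> velocity (MIDI velocity)
    All inputs are assumed normalised to 0.0-1.0.
    """
    interval = 1 + int(x * 11)    # step between notes, 1-12 semitones
    pitch = 36 + int(y * 48)      # MIDI note, 36-84
    velocity = int(z * 127)       # MIDI velocity, 0-127
    return interval, pitch, velocity
```

Each axis maps to an independent musical parameter, so moving the hand along one axis changes one aspect of the sound without disturbing the others.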

This sub-patch takes two video inputs and mixes them together using jit.gl.pix. The amplitude of a signal input is then added onto one of the video feeds, making the effect more dramatic at higher amplitudes. The result is sent into the second glitch sub-patch.
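As a scalar stand-in for what the jit.gl.pix shader does per pixel, the mix-plus-amplitude step could look like this. The blend ratio, value range, and function name are assumptions for illustration.

```python
def mix_frames(frame_a, frame_b, amplitude, mix=0.5):
    """Blend two frames pixel-by-pixel, then let a signal's
    amplitude (0.0-1.0) boost frame A's contribution, so louder
    input makes that feed's glitches more dramatic.

    Frames are flat lists of pixel values (0-255); clamped at 255.
    """
    out = []
    for a, b in zip(frame_a, frame_b):
        blended = a * mix + b * (1 - mix)          # plain crossfade
        out.append(min(255, blended + a * amplitude))  # amplitude emphasises A
    return out
```

At zero amplitude this is an ordinary crossfade; as the amplitude rises, frame A increasingly dominates the output before it is passed on to the next glitch stage.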

I downloaded this Max patch from chrisvik.com; it modulates the video based on audio amplitude, using noise and texture generation to distort the mesh. I tweaked the parameters to produce the effect I wanted, and added automation to the rotation that I can activate by pressing the right arrow key.