Gestural Composition with Kinect

This project explores using video game controllers, primarily the Xbox 360 Kinect, as an alternative means of interacting with music production software, allowing for a freer and more adaptive composition experience than traditional MIDI controllers permit. The project involves gaining a deeper understanding of how to use Max MSP, connecting an Xbox Kinect to a computer and getting it to communicate with programs such as Max MSP and Ableton, composing with the Kinect, and creating live audio-visual elements to accompany the music.
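As a rough illustration of the Kinect-to-computer link: one common route (an assumption here, since the exact bridge used isn't specified above) is a middleware application such as Synapse or a Processing sketch that reads the Kinect's skeleton data and streams joint positions as OSC messages, which Max MSP can receive with a [udpreceive] object. The Python sketch below, using the python-osc library, shows the general shape of listening for such messages; the /righthand address pattern and port 12345 are illustrative, not taken from the project.

```python
# Minimal sketch of receiving Kinect joint data sent as OSC by a
# bridge application (e.g. Synapse). The address pattern and port
# are assumptions for illustration, not taken from the project.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_hand(address, x, y, z):
    # Each message carries the joint's x/y/z position; a Max patch
    # would receive the same values via [udpreceive 12345].
    print(f"{address}: x={x:.2f} y={y:.2f} z={z:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/righthand", on_hand)  # hypothetical address pattern

server = BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher)
server.serve_forever()  # blocks, printing joint positions as they arrive
```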

First released in 2010, the Xbox 360 Kinect is one of a line of motion-sensing input devices produced by Microsoft for video gaming. It comprises an RGB camera, an infrared laser projector, and an infrared depth camera, along with a microphone array. Together these allow the device to create a depth map of its environment, which a computer program can interpret to output anything from skeleton joint positions to point clouds. Using the data output by the Kinect, I built several instruments within Max for Live in Ableton that let me input MIDI notes and adjust different parameters in a more intuitive and free way than traditional MIDI devices allow. I then looked further into composing music within Max MSP and creating audio-reactive visuals to follow the sounds. This included building Max patches with a selection of instruments and different visual elements using Jitter, Max's set of objects for matrix-based data processing.
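To make the gesture-to-MIDI idea concrete, here is a minimal sketch of the kind of mapping such an instrument performs: a normalized hand height is scaled to a MIDI note, and hand depth to a control-change value. It uses the mido library, and the scaling ranges and CC number are invented purely for illustration; the project's actual mappings lived inside Max for Live devices.

```python
# Sketch of mapping normalized Kinect joint coordinates to MIDI,
# analogous to what a Max for Live device might do. The scaling
# ranges and CC number are illustrative assumptions.
import mido

def hand_to_midi(hand_y, hand_z):
    """Map hand height (0..1) to a note in a two-octave range and
    hand depth (0..1) to a controller value."""
    note = 48 + int(hand_y * 24)      # C3..C5
    cc_value = int(hand_z * 127)      # 0..127 controller range
    return note, cc_value

with mido.open_output() as port:      # default system MIDI output
    note, cc_value = hand_to_midi(0.5, 0.25)
    port.send(mido.Message('control_change', control=74, value=cc_value))
    port.send(mido.Message('note_on', note=note, velocity=100))
```

In practice the same scaling happens inside a Max patch with objects like [scale], but the sketch shows why gestural input feels freer: continuous body position drives both pitch and expressive parameters at once.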