This composition was made entirely in Ableton using Max for Live patches, with a Kinect controlling the note inputs and the automation. It wasn't performed in real time: I had to record each track separately and then record the automation of the effects, all driven by the position of my hands.
One of the problems I faced was that the MIDI input from the Max patch would not trigger any sound on the same track as the synth. I could not figure out a way to make it work on a single track, so any track with moving MIDI notes actually used two tracks: one holding the Max patch, and the other holding the synth instrument and receiving the MIDI data from the first.
To control the different parameters of the audio effects, I modified an XY pad patch that I found online to accept data from the Kinect; I could then map its outputs to any dial I wanted. To improve this, I would have liked to create a single patch that mapped the X, Y, and Z of both hands, instead of three separate patches.
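Under the hood, what the XY pad does with each axis is essentially a linear rescale of the incoming hand position into a dial's parameter range. A minimal Python sketch of that idea (the frame size and cutoff range below are illustrative values, not taken from my patches):

```python
def map_axis(value, in_min, in_max, out_min, out_max):
    """Linearly rescale one axis of the hand position (e.g. Kinect X)
    into a dial's parameter range, clamping to the output bounds."""
    if in_max == in_min:
        return out_min
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))  # clamp so the dial never goes out of range
    return out_min + t * (out_max - out_min)

# e.g. a hand halfway across a hypothetical 640-px-wide frame,
# mapped onto a filter-cutoff dial spanning 20 Hz to 20000 Hz
cutoff = map_axis(320, 0, 640, 20, 20000)
print(cutoff)  # 10010.0
```

A single patch for both hands would just run six of these mappings (X, Y, Z per hand) side by side.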
Another problem I found was a lot of random digital noise in the signal coming in from the Kinect, which caused horrible spikes and constantly changing values even when I remained still. To combat this, I added a slide object between the input from the Kinect and the output through Maxhole. I also included a system that let me choose the smoothness of the slide, using another Maxhole port to send a number from each patch back to the main patch, depending on whether I wanted the control to be more reactive or more stable. Below is a video comparing the two signals (left is raw, right is smoothed).
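For anyone curious why the slide object tames the spikes: it is a simple one-pole-style smoother where each output moves only a fraction of the way toward the new input, following Max's documented formula y(n) = y(n-1) + (x(n) - y(n-1)) / slide. A rough Python sketch (the sample values and slide factors here are just illustrative):

```python
def slide(samples, slide_up=1.0, slide_down=1.0):
    """Smooth a sequence the way Max's slide object does:
    y(n) = y(n-1) + (x(n) - y(n-1)) / slide,
    with separate factors for rising and falling input.
    Larger factors give a smoother, slower response; 1.0 passes through."""
    out = []
    y = samples[0] if samples else 0.0
    for x in samples:
        factor = slide_up if x > y else slide_down
        y = y + (x - y) / factor
        out.append(y)
    return out

# A single noisy spike in otherwise-still hand data gets damped
# into a short decaying tail instead of a hard jump:
print(slide([0, 0, 100, 0, 0], slide_up=4, slide_down=4))
# [0.0, 0.0, 25.0, 18.75, 14.0625]
```

The trade-off I was tuning with the send-back number is exactly this factor: small values track the hand quickly but let noise through, large values kill the spikes but make the dial lag behind the gesture.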