My initial research involved investigating projects done by others. Since the original Xbox Kinect was released in 2010, there have been many projects exploring its use with music. However, most of these works date from some years ago, and the progression of technology means that the features and programs they relied on have since been replaced by newer, more advanced alternatives.
Emily Howell is a computer program created by David Cope, an American music professor. It is an AI that builds and plays its own musical compositions from a source database, and it also incorporates feedback from its listeners to sculpt its own "personal" style. I find it fascinating to compare the AI's output, which is sometimes quite disconcerting, with my own compositions, which, although partly computer-generated, retain a human connection through the interface of the Kinect.
This was the original video that confirmed for me that it was possible to use the Kinect as an interface device. Although the software used to compose the piece has since been discontinued, it still demonstrated some of the basic principles required to make this kind of project work.
As·phyx·i·a is an experimental film created with Frederico Phillips and performed by Shiho Tanaka. It was made from 3D data of a woman dancing, captured by two Kinects placed opposite each other, then combined within a 3D application and rendered along to music. The project used the Kinect v2, whose depth sensor has twice the resolution of the Kinect v1's and captures data at 300 fps; this data is then sent to the PC and processed into a regular 30 fps output.