When we dance, we don't use only our fingers.
So why do we use finger-based keyboards to make music?
In this project, the dancer generates the music through her movement. Traditionally, live music is mixed on a MIDI keyboard; I wanted to experiment with a new form of interaction for music creation and with the role of the dancer. Through body tracking with OpenCV, her different poses let her mix different beats and sounds, allowing her movements to control the piece.
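The core idea can be sketched as a mapping from tracked body positions to sound layers. Below is a minimal, illustrative sketch of that mapping in Python: in the actual project, the (x, y) coordinates would come from OpenCV body tracking, and the zone names, thresholds, and track names here are my own assumptions, not the project's real configuration.

```python
def pose_to_track(x, y):
    """Map a normalized wrist position (0.0-1.0 per axis) to a sound layer.

    Hypothetical zones for illustration only:
    - wrist raised high (small y, since image y grows downward) -> hi-hat
    - wrist on the left side of the frame -> bassline
    - otherwise -> drum loop
    """
    if y < 0.33:
        return "hi-hat"
    if x < 0.5:
        return "bassline"
    return "drums"

# Example: a raised arm adds the hi-hat layer,
# while a low arm on the left side brings in the bassline.
print(pose_to_track(0.7, 0.2))  # hi-hat
print(pose_to_track(0.3, 0.6))  # bassline
```

In practice, a loop like this would run once per camera frame, with the tracked landmark positions feeding the mapping and the returned layer toggled in the audio engine.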
Behind the scenes:
In order to bring this project to life, I had to work very closely with my user: the dancer. It took a great deal of iteration on the functionality of my code to adapt to the dancer's needs and her process.
I also had to build up my language around the technology so I could clearly communicate what was happening "under the hood" and collaborate with her on the appropriate adjustments.