Generative Dance

In this project, the dancer generates the music through her movement. Live music is traditionally mixed on a MIDI keyboard; I wanted to experiment with a new form of interaction for music creation and with the role of the dancer. Through body tracking with OpenCV, her different poses let her mix different beats and sounds, allowing her movements to control the piece.
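As a rough illustration of how this kind of interaction can work, here is a minimal sketch, not the actual performance code: OpenCV's background subtraction isolates the dancer's silhouette, and the silhouette's centroid is mapped to zones of the stage that each toggle a loop. The zone names and the trigger_loop callback are hypothetical placeholders.

```python
# Minimal sketch: track the dancer's silhouette with background subtraction
# and map the horizontal position of its centroid to a sound "zone".
# ZONES and trigger_loop() are hypothetical stand-ins for the real audio layer.
import cv2

cap = cv2.VideoCapture(0)                        # camera pointed at the stage
subtractor = cv2.createBackgroundSubtractorMOG2()
ZONES = ["kick", "snare", "pad", "vocal"]        # placeholder loop names
last_zone = None

def trigger_loop(name):
    print(f"toggle loop: {name}")                # stand-in for the audio engine

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)               # foreground = the dancer
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        body = max(contours, key=cv2.contourArea)
        m = cv2.moments(body)
        if m["m00"] > 0:
            cx = m["m10"] / m["m00"]             # centroid's x position
            zone = min(int(cx / frame.shape[1] * len(ZONES)), len(ZONES) - 1)
            if zone != last_zone:                # only fire on a zone change
                trigger_loop(ZONES[zone])
                last_zone = zone
    if cv2.waitKey(30) & 0xFF == 27:             # Esc quits
        break

cap.release()
```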

This project explores the relationship between movement and experience. When I started this project I was fascinated by embodied cognition: the idea that our physical movements can influence our cognitive experience. Typically in dance, movement is a response to perceived sound. This piece investigates what happens when the dancer is instead given the opportunity to choose her movements first, which then create the music. How would a “motion first” performance change the dancer’s experience and the audience’s perceived experience?

In order to bring this project to life, I had to work very closely with my user: the dancer. It took a great deal of iteration on the functionality of my code to adapt it to the dancer's needs and her process.

I also had to build up my vocabulary around the technology so I could properly communicate what was happening "under the hood" and collaborate with her on the appropriate adjustments.

Behind the scenes:

When starting this project, I sampled music and edited sound bites to create the main musical elements that the dancer would mix. I then created the visuals, which I would VJ during her performance. To push myself, all the visual effects were created with JavaScript animation libraries; the images themselves were made in Adobe Illustrator.
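For a sense of how sampled loops like these can be layered live, here is a hedged sketch assuming pygame's mixer (not necessarily the audio engine I used; the file names are placeholders for the edited sound bites). Each sample owns a mixer channel, so a pose can toggle one layer without touching the others:

```python
# Sketch of a loop-mixing layer: each edited sound bite owns a mixer channel,
# so a pose can toggle that layer on or off independently.
# File names here are placeholders for the actual samples.
import pygame

pygame.mixer.init()

loops = {
    "kick": pygame.mixer.Sound("kick.ogg"),
    "snare": pygame.mixer.Sound("snare.ogg"),
    "pad": pygame.mixer.Sound("pad.ogg"),
}
channels = {name: pygame.mixer.Channel(i) for i, name in enumerate(loops)}

def trigger_loop(name):
    """Toggle a sample: start it looping on its channel, or stop it."""
    ch = channels[name]
    if ch.get_busy():
        ch.stop()
    else:
        ch.play(loops[name], loops=-1)           # loops=-1 repeats indefinitely
```

Wired in as the trigger_loop used by the tracking sketch above, each zone crossing would toggle one layer of the mix.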