OOPB OOBS will be a container for experiments in how the physical body navigates shifting environments, or the lack of an environment altogether.
I'm curious about the movement generated when the moving/dancing body interacts with software in real time, and how an imposed visual that reacts to that movement creates a partnership between the two. Below are stills from the first few tries at using Meta Spark Studio and JavaScript to create filters that attach to a live camera feed.
The filters were created with creative code and scripted to track the body (a sketch of that binding pattern follows these notes).
Lots of glitches occurred: coding and learning to script feels like a whole new language.
I learned that a clear background and neutral clothes help the software better situate itself in relation to the body.
For now, moving slowly is the way to go.
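For anyone curious about what "dancing with the software" looks like in script form, here is a minimal sketch of the pattern I've been working with: a tracked signal from the camera is bound to a scene object's transform, so the visual follows the movement frame by frame. This is an illustrative sketch, not my exact filter. The object name 'overlay0' is a placeholder, and it uses Meta Spark's face tracker to show the binding since the body-tracking setup varies by project; whichever tracker you enable binds the same way, signal to property.

```js
// Minimal Meta Spark script: glue a visual to a tracked signal
// so it follows movement in the live camera feed.
// Assumes a scene object named 'overlay0' exists in the Scene panel
// and the tracking capability is enabled under Project > Capabilities.
const Scene = require('Scene');
const FaceTracking = require('FaceTracking');
const Diagnostics = require('Diagnostics');

(async function () {
  // Find the visual that should move with the performer.
  const overlay = await Scene.root.findFirst('overlay0');

  // Tracked position signals update every frame; assigning them to the
  // object's transform keeps the visual attached to the moving body.
  const tracked = FaceTracking.face(0).cameraTransform;

  // expSmooth() softens jitter when movement gets fast; a larger
  // damping value (in ms) makes the visual trail the dancer more loosely.
  overlay.transform.x = tracked.x.expSmooth(200);
  overlay.transform.y = tracked.y.expSmooth(200);

  Diagnostics.log('overlay bound to tracked movement');
})();
```

The smoothing is part of why slow movement reads better: the tracker loses fast motion, and a damped signal at least fails gracefully instead of snapping.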
As this week nears its end, I'm excited to move toward recording a full movement sequence to see how the coded filter continues to morph and react. I will continue to ask: how can I truly dance with the software?