Digital Prototyping 05

Translating movement into an output

This digital prototyping challenge was focussed on how we could use either our created device or some other form of movement tracking to create an artistic output. Together with Marius, Oliver, and Nuria, we worked on trying to understand the way we move in our daily lives and how we could translate this into sound. Imagine if your music was determined by the speed you walked at.

Training the Model

From movement to data

The first step of the process was to take the data from the phone that determined placement in the room, GPS position, and speed, and send that to Wekinator. Once the data was in Wekinator, we were able to train our model to understand 0 = walking and 1 = running.

In the previous module Citlali had also introduced us to an app you can download on your phone called “ZIG Sim”. This app allows you to access all the different sensors your phone has, individually or together, from the accelerometer to the gyroscope, microphone, camera, etc., and send that information over OSC messaging (among other ways) to another device to process that data.
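Under the hood, an OSC message is just a small binary packet: a 4-byte-aligned address string, a type-tag string, and the arguments in big-endian form. As a rough sketch of what a receiving device has to do with ZIG Sim's stream, here is a minimal Python decoder for messages whose arguments are all floats (the address `/zigsim/accel` below is illustrative, not necessarily the app's real address):

```python
import struct

def read_padded_string(data, offset):
    """Read a null-terminated OSC string and skip its 4-byte padding."""
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    # advance past the terminator, rounded up to the next 4-byte boundary
    offset = (end + 1 + 3) & ~3
    return s, offset

def decode_osc_floats(packet):
    """Decode a single OSC message whose arguments are all 32-bit floats."""
    address, offset = read_padded_string(packet, 0)
    typetags, offset = read_padded_string(packet, offset)
    assert typetags.startswith(",")          # OSC type-tag strings begin with ","
    n = typetags.count("f")                  # one "f" per float argument
    values = struct.unpack(">%df" % n, packet[offset:offset + 4 * n])
    return address, list(values)
```

Feeding it a hand-built packet for three accelerometer values returns the address and the floats, which is all a tool like Wekinator needs from each message.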

Using this, we had Mars run/walk around the room with the phone in his pocket, but we soon determined that the room was too small to get accurate data, so we moved the training to the roof, where we had a larger straight space in which to collect more accurate data. We wanted to explore more types of movement, but due to the time constraints this is what we focussed on.

What steps were taken

The process from training to the final output was quite simple but required going through various pieces of software and interactions. However, once you understood the basics, you had quite a few opportunities. The goal of the project was to see: “What if we could change the songs you were listening to depending on whether you were running or not?”

Grabbing Data from Phone using ZIG Sim

By using the app on your phone, you were able to grab a variety of data points and send them to Wekinator. We focussed on GPS, acceleration, and placement on the Z-axis. We trained the model on various people's running/walking to account for differences in movement as much as possible.
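Wekinator's documented default is to listen for input features on UDP port 6448 at the address `/wek/inputs`. A minimal, stdlib-only sketch of packing a few feature floats into an OSC message and sending them there (the three feature names are illustrative stand-ins for the values ZIG Sim actually streamed):

```python
import socket
import struct

def osc_message(address, *floats):
    """Build a raw OSC message whose arguments are all 32-bit floats."""
    def padded(s):
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * (-len(b) % 4)   # pad to a 4-byte boundary
    return (padded(address)
            + padded("," + "f" * len(floats))
            + struct.pack(">%df" % len(floats), *floats))

def send_features(speed, accel_z, pos_z, host="127.0.0.1", port=6448):
    """Send one feature frame to Wekinator's default input address."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message("/wek/inputs", speed, accel_z, pos_z), (host, port))
    sock.close()
```

In practice ZIG Sim did this packing and sending for us; the sketch just shows how little is involved in the wire format.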

Training Model in Wekinator

While the data was being grabbed from the phone, it was also being sent to the Wekinator application on the laptop, which we could then train by labelling a person as 1 while running and 0 while walking.
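Wekinator does the actual learning internally, but conceptually the trained classifier just maps feature values to the labels you recorded. A toy, hand-rolled analogue (nearest centroid on a single speed feature; the training values below are made-up illustrations, not our recorded data):

```python
WALKING, RUNNING = 0, 1

def train(examples):
    """examples: list of (speed, label) pairs; average each class's examples."""
    centroids = {}
    for label in (WALKING, RUNNING):
        values = [speed for speed, lab in examples if lab == label]
        centroids[label] = sum(values) / len(values)
    return centroids

def classify(centroids, speed):
    """Label a new sample by the nearest class centroid."""
    return min(centroids, key=lambda label: abs(speed - centroids[label]))

# Hypothetical labelled recordings: walking around 1.4 m/s, running around 3.5 m/s.
model = train([(1.2, WALKING), (1.5, WALKING), (3.2, RUNNING), (3.8, RUNNING)])
```

Wekinator's models are of course richer than this (and take several features at once), but the input/output contract is the same: features in, a 0 or 1 out.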

Sending data to Max8

Once the training was done in Wekinator, we connected the model to Max8 and used it to send a signal of either 0 or 1.

Converting received input to sound in Max8

Using the received signal, we were able to create an easy flow of code in Max8 that turned on one sound or the other based on this signal.
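The Max patch itself is a visual program, but the gating logic it implemented can be sketched in a few lines: the incoming 0 or 1 simply selects which of two tracks should be audible (the file names are placeholders, not our actual songs):

```python
# Placeholder file names standing in for the two songs used in the patch.
TRACKS = {0: "walking_song.wav", 1: "running_song.wav"}

def select_track(signal):
    """Map the classifier output (nominally 0 or 1) to the track to play."""
    return TRACKS[int(round(signal))]
```

Rounding the signal first means a slightly noisy model output (e.g. 0.9) still resolves cleanly to one of the two tracks.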

The final output

Having 2 songs change based on movement

In the end, we managed to create a model and code that allowed us to have one person run or walk and have this translated into a song. This was fun to see happen because it could have implications for the way people interact with their own movement and what it could bring to the realm of dancing/running or other sports.

Reflecting on Digital Prototyping 5

The final digital prototyping class of the master's. It was a fun and explorative class just like the past classes, yet I felt more connected to this aspect than to the actual creation of the wearables.

This class gave us the opportunity to better understand the possibilities of data translation in a more digital way. This is something I am interested in, as I think you need to be able to reach people in a way that lets them understand various complex themes.

In the project I worked on together with Nuria, Mars, and Oliver, I had a fun time trying to understand where the thresholds between songs lay and how each body's movement would result in different outputs. This key learning point resulted in us not training the model on only one person but focussing on multiple inputs to create a more accurate result.

In the end, while this course was not directly linked to the thesis research I am doing together with Oliver, I still managed to learn a lot about such a complex and modular program (Max-MSP). It also taught me to question where data is coming from. In the future, this idea of OSC messaging seems like an interesting method to explore how different devices could communicate with each other. For example, connecting the energy monitor we are building with your phone, or even using these methods to visualize your energy data in some way.

© Copyright 2023 Carlotta Hylkema - All Rights Reserved
