The first step of the process was to take the data from the phone that determined its position in the room, along with GPS and speed readings, and send that to Wekinator. Once the data was in Wekinator we were able to train our model to classify the movement: 0 = walking and 1 = running.
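As a rough sketch of that sending step (assuming the python-osc library and Wekinator's documented defaults of listening on port 6448 at the /wek/inputs address; the position/GPS/speed feature names are placeholders for whatever the phone actually reports), the forwarding side can look like this:

```python
from pythonosc import udp_client

# Wekinator listens for input features on port 6448 by default.
wekinator = udp_client.SimpleUDPClient("127.0.0.1", 6448)

def send_features(x, y, speed):
    # Wekinator expects a single /wek/inputs message whose float arguments
    # match the number of inputs configured in the project (here: three).
    wekinator.send_message("/wek/inputs", [float(x), float(y), float(speed)])

# During training, Wekinator pairs these inputs with the class label
# set in its GUI (0 = walking, 1 = running).
send_features(2.4, 1.1, 0.8)
```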
In the previous module Citlali also introduced us to an app you can download on your phone called “ZIG Sim”. This app gives you access to all the different sensors your phone has, individually or together, from the accelerometer and gyroscope to the microphone and camera, and sends that information over OSC messaging (among other protocols) to another device for processing.
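Since ZIG Sim streams its sensor readings as OSC messages, an easy way to see what is arriving is to listen on the destination port you set in the app and print everything. This is a minimal sketch assuming python-osc; the port number (50000) and the catch-all handler are our assumptions, not anything fixed by the app:

```python
from pythonosc import dispatcher, osc_server

def print_message(address, *args):
    # Every OSC message arrives as an address pattern plus its arguments.
    print(address, args)

disp = dispatcher.Dispatcher()
disp.set_default_handler(print_message)  # catch every incoming address

# 50000 is an assumed port; ZIG Sim lets you choose the destination
# IP and port in its settings, so match whatever you entered there.
server = osc_server.ThreadingOSCUDPServer(("0.0.0.0", 50000), disp)
print("Listening for ZIG Sim OSC on port 50000...")
server.serve_forever()
```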
Using this, we had Mars run and walk around the room with the phone in his pocket, but we soon determined that the room was too small to get accurate data. We moved the training to the roof, which gave us a longer straight stretch in which to collect cleaner samples. We wanted to explore more types of movement, but due to time constraints this is what we focused on.
The process from training to the final output was quite simple, but it required moving data through several pieces of software. Once you understood the basics, though, it opened up quite a few possibilities. The goal of the project was to ask: “What if we could change the song you were listening to depending on whether you were running or not?”
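The last link in that chain is listening for Wekinator's classification and reacting to it. A minimal sketch of that output side, assuming Wekinator's default of sending /wek/outputs to port 12000 and using a hypothetical play_song() stand-in for whatever actually plays the audio:

```python
from pythonosc import dispatcher, osc_server

# Hypothetical song mapping; swap in whatever audio files you use.
SONGS = {0: "walking_song.mp3", 1: "running_song.mp3"}
current = None

def play_song(path):
    # Hypothetical stand-in: hook this up to a real media player.
    print("now playing:", path)

def on_output(address, *values):
    global current
    label = int(round(values[0]))  # the classifier output arrives as a float
    if label != current:           # only switch songs when the class changes
        current = label
        play_song(SONGS.get(label, SONGS[0]))

disp = dispatcher.Dispatcher()
disp.map("/wek/outputs", on_output)

# Wekinator sends its outputs to port 12000 by default.
server = osc_server.ThreadingOSCUDPServer(("0.0.0.0", 12000), disp)
server.serve_forever()
```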
In the end we managed to create a model and code that let one person run or walk and have that movement translated into a song choice. This was fun to see happen, because it hints at new ways people could interact with their own movement and what that could bring to dancing, running, or other sports.