I first used Clmtrackr for my Synth assignment. At that time, I only changed the BPM and envelope of a sequence and mapped them to emotion categories. After learning that different melodies can actually evoke different feelings, I wanted to play around with emotions further.
For this project, I wanted to map the user's facial expressions to related melodies. For example, when the user is smiling, a happy melody plays.
I used Clmtrackr and its emotion detection library to read the user's expressions as confidence scores for each emotion. For the sound, I used Tone.js to create a sequence for each emotion and add effects (reverb, delay, and vibrato).
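A per-emotion sequence with an effects chain might be set up roughly like this in Tone.js. This is only a sketch: the note lists, the `makeSequence` helper, and the specific effect settings are my own illustrative assumptions, not the project's actual melodies or parameters. It assumes Tone.js is loaded globally via a script tag.

```javascript
// Illustrative melodies, one per emotion (placeholder notes, not the real ones).
const melodies = {
  happy: ['C4', 'E4', 'G4', 'C5', 'G4', 'E4', 'C4', 'E4'],
  sad:   ['A3', 'C4', 'E4', 'A3', 'C4', 'E4', 'A3', 'C4'],
  angry: ['D3', 'D3', 'F3', 'D3', 'G#3', 'D3', 'F3', 'D3'],
};

// Build a synth routed through vibrato -> delay -> reverb, then wrap one
// melody in a looping Tone.Sequence. Assumes the global `Tone` object.
function makeSequence(notes) {
  const reverb = new Tone.Reverb(2).toDestination();
  const delay = new Tone.FeedbackDelay('8n', 0.3).connect(reverb);
  const vibrato = new Tone.Vibrato(5, 0.2).connect(delay);
  const synth = new Tone.Synth().connect(vibrato);

  const seq = new Tone.Sequence((time, note) => {
    synth.triggerAttackRelease(note, '8n', time);
  }, notes, '8n');
  seq.loop = true;
  return seq;
}
```

Each emotion would then get its own sequence, e.g. `makeSequence(melodies.happy)`, started and stopped as the detected emotion changes.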
The way it works is that only one sequence plays at a time. I compare the confidence scores across all emotions and pick the highest one to play its related sequence. Once another emotion's confidence becomes the highest, the player switches to that emotion's sequence.
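The selection logic above can be sketched as an argmax over the confidence scores, switching sequences only when the winning emotion changes. I'm assuming the classifier output looks like clmtrackr's example emotion classifier (an array of `{emotion, value}` pairs); the `pickEmotion` and `update` names are hypothetical.

```javascript
// Pick the emotion with the highest confidence score. `predictions` is
// assumed to look like clmtrackr's demo classifier output:
// [{ emotion: 'happy', value: 0.8 }, { emotion: 'sad', value: 0.1 }, ...]
function pickEmotion(predictions) {
  return predictions.reduce((best, p) => (p.value > best.value ? p : best));
}

// Only switch sequences when the dominant emotion actually changes,
// so the current melody keeps playing between detection frames.
let currentEmotion = null;
function update(predictions, sequences) {
  const next = pickEmotion(predictions).emotion;
  if (next !== currentEmotion) {
    if (currentEmotion) sequences[currentEmotion].stop();
    sequences[next].start();
    currentEmotion = next;
  }
}
```

Calling `update` on every detection frame keeps one sequence running at a time and crossfades control to whichever emotion currently dominates.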
Click here for the code on my GitHub.
As someone without a background in music, I definitely learned a lot in this class. It was really challenging to make something outside my comfort zone. For this project, I tried to compose the melodies on my own. But I realized that the more I learned, the more I saw that there is always another approach, in terms of interaction, to designing how people interact with music. And I guess that's what I want to focus on more in the future.