Concept
I had a curious moment of inspiration while riding the bus to campus. I was staring out the window, quietly taking in the night, and that silent appreciation was followed by the next suggestion in my music app: DIIV – Taker.
It was a weirdly melancholic song, the kind of beat I like: melancholic, but not so much that it induces sadness; rather, it carries a peaceful calmness. That is the feeling I got from listening to it: calmness. So, for this week's assignment, I wanted to share what I felt through a music visualization built on flocking systems.
Embedded sketch
Note: If the p5.js version runs slowly, try either this version or make a copy of the GitHub repo and run it locally on your computer.
Make sure to click on the canvas to start the visualization!
Full-screen: Full-screen version
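(The click on the canvas is needed because browsers will not start audio without a user interaction. Below is a minimal illustration of that start-on-click pattern; the variable and file names are placeholders, not the exact ones from my sketch.)

let song; // placeholder name for the track

function preload() {
  song = loadSound("taker.mp3"); // hypothetical file name
}

function mousePressed() {
  // Resume the audio context and start the track on the first click.
  userStartAudio();
  if (!song.isPlaying()) {
    song.play();
  }
}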
Brief explanation of the code
The flock system is divided into multiple parts, which are activated according to the playback time of the song; this is easily done with the value returned by audio.currentTime(). Here is a quick example:
Let's say I want to generate the wave effect that appears at the beginning. For this, I set the following parameters inside a custom function called startPart(value):
The wave code is located in the second if condition (value == 2).
function startPart(value) {
  if (value == 1) {
    for (let boid of flock) {
      //Reset value.
      boid.velocity.mult(0);
      boid.maxForce = 0.0;
      //Avoid boids from going out from the borders of the screen.
      boid.transparency = 1;
      boid.acceleration.add(random(-2, 2), random(-0.5, 0.5));
      boid.update_values(1); //Updates the boundaries to the first phase values.
      part = 1;
    }
  }

  //THIS IS THE WAVE CODE
  if (value == 2) {
    for (let boid of flock) {
      //Reset value.
      boid.velocity.mult(0);
      //Assign maxForce
      boid.maxForce = 0.01;
      //Avoid boids from going out from the borders of the screen.
      boid.transparency = 10;
      boid.acceleration.add(random(1, 3), random(-1, 1));
      boid.update_values(2); //Updates the boundaries to the second phase values.
      part = 2;
    }
  }
}
The specific part is then triggered by the following custom function, checkAudioTime(), at second 14 of the song:
function checkAudioTime() {
  if (round(audio.currentTime(), 0) == 6) {
    startPart(1);
  }
  if (round(audio.currentTime(), 0) == 14) {
    startPart(2);
  }
}
It is important to note that checkAudioTime() is called continuously inside draw().
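To give a sense of how this fits together, here is a simplified sketch of that loop. The boid methods shown (flock, update, show) follow the standard p5.js flocking example and are only stand-ins for my actual class, which also handles transparency and boundaries.

function draw() {
  background(0);

  // Check the playback position every frame and switch phases when needed.
  checkAudioTime();

  // Run the flocking behavior for the current phase.
  for (let boid of flock) {
    boid.flock(flock);
    boid.update();
    boid.show();
  }
}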
Highlight of some code that I am particularly proud of
I am most proud of the patterns I created for this visualization. In total, there are 11 variations, divided between two flocking systems. I will not paste all of them here, since that is almost 300 lines of code, but here are three examples:
if (value == 9) {
  for (let boid of flock_outside) {
    //Reset value.
    boid.velocity.mult(0);
    //Assign maxForce
    boid.maxForce = 0.01;
    boid.maxSpeed = 1;
    boid.perception_radius = 50;
    //Avoid boids from going out from the borders of the screen.
    boid.transparency = 50;
    boid.acceleration.add(random(-1, 1), random(-1, 1));
    boid.update_values(6); //Updates the boundaries to the values of this phase.
    part = 3;
  }
}

if (value == 10) {
  for (let boid of flock_outside) {
    //Reset value.
    boid.velocity.mult(0);
    boid.maxForce = 1;
    boid.transparency = 5;
    boid.acceleration.add(random(0), random(10));
  }
  for (let boid of flock) {
    //Reset value.
    boid.velocity.mult(0);
    boid.maxForce = 1;
    boid.transparency = 5;
    boid.acceleration.add(random(0), random(10));
  }
}

if (value == 11) {
  for (let boid of flock_outside) {
    //Reset value.
    boid.update_values(7);
  }
  for (let boid of flock) {
    //Reset value.
    boid.update_values(7);
  }
  part = 6;
}
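For context, update_values(n) is the helper that switches a boid's parameters to those of a given phase (the comments above mention boundaries). The version below is purely hypothetical, a sketch of the idea rather than the code in the repo, and the property names are made up for illustration.

class Boid {
  // ...flocking methods omitted...

  update_values(phase) {
    // Hypothetical: each phase defines its own movement boundaries.
    if (phase == 1) {
      // Confine the boids to a horizontal band for the opening wave.
      this.top_boundary = height / 2 - 100;
      this.bottom_boundary = height / 2 + 100;
    } else if (phase == 2) {
      // Let the boids use the full canvas.
      this.top_boundary = 0;
      this.bottom_boundary = height;
    }
    // ...one case per variation, up to the last phase.
  }
}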
Reflection and ideas for future work or improvements
I am quite proud of this work. It is by far the assignment I have spent the most time on (or probably the second most). It was an enjoyable challenge, and I was genuinely curious about what kind of results I could get out of the flocking systems. That said, I feel that some patterns in the visualization are rather simplistic and should have reacted to the sound.
That is not to say I did not try to implement more reactions based on the frequencies of the song; after some attempts, though, it was best to set the idea aside, since it would have consumed a lot of time.
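For future work, frequency-driven reactions in p5.js would typically go through p5.FFT, which is what the tutorial listed under the resources below covers. A minimal sketch of the idea (not code from my project) could look like this, with a band's energy feeding into a flocking parameter such as maxSpeed:

let audio; // the track, loaded in preload()
let fft;

function preload() {
  audio = loadSound("taker.mp3"); // hypothetical file name
}

function setup() {
  createCanvas(400, 400);

  // Smoothing of 0.8 and 64 frequency bins.
  fft = new p5.FFT(0.8, 64);
  fft.setInput(audio);
}

function draw() {
  background(0);

  // analyze() must run before getEnergy(); it samples the current spectrum.
  fft.analyze();

  // Energy of the bass band, from 0 to 255.
  let bass = fft.getEnergy("bass");

  // Map it onto something the boids could use, e.g. a speed boost.
  let speedBoost = map(bass, 0, 255, 0, 2);
}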
Used resources
17.11: Sound Visualization: Frequency Analysis with FFT – p5.js Sound Tutorial