Overall, I think the IM Show was interesting and fun. A lot of people came and interacted with my final project, and some of them even started dancing. Even though I tried to make it fullscreen, the system lagged a lot, so I ended up using Command and + to zoom the browser until it looked full-size.
“We are like artists or inventors uncovering meaning in the patterns of movement and connection.”– Zach Lieberman
“Shadowing Presence” is an experiment that delves into the idea of becoming physically part of the digital experience. Using particle systems that interact with the human body through a webcam, the project transforms participants into both performers and elements within the system. The particles respond in real-time, attracted by the participants’ motion, creating an immersive dialogue between the digital and the physical.
The aim is to create a playful digital space where users can experience how their physical presence influences a digital environment. The project also serves as a metaphor for how humans shape and are shaped by the digital systems they interact with. Through this project, I go on a journey to uncover what it means to be particles in the digital world interacting with other particles through a screen. This project is inspired by Zach Lieberman’s work, which focuses on making digital interactive environments that invite participants to become performers, seamlessly blending technology and human connection.
Highlight and Progress:
The program analyzes the pixels taken through a webcam and turns them into a black-and-white particle system. Then, a second particle system detects motion via the webcam and responds by moving toward the detected motion.
Pages Design and Navigation:
Main Page: The main page has an image I designed to match the program’s functionality. I moved the customization buttons closer to the center so they are easier for users to see and use. These buttons change the color and size of the interactive particle ‘object’. I also added some instructions to make the project’s function easier to understand. Buttons handle navigation between the two main pages of the program: the ‘Start’ button moves users to the interaction page after they pick the color and size of the particles they want to interact with.
Experience Page (Page 1): This page is a real-time interactive environment where users can see themselves through a webcam. Here, particles in the shape of a heart respond to the user’s motion while also reflecting their shadow in a particle-like form. There are two control buttons: ‘Reset’ and ‘Save’. ‘Reset’ returns to the main page, where users can change the color and size of the particles, while ‘Save’ saves an image of the experience.
Webcam and Interactive Particle Systems:
The webcam is the primary input for capturing user motion and creating the particle interpretation of reality by analyzing pixel brightness. The brightness of each pixel is calculated, and differences between the current frame and the previous frame are used to paint a sort of digital image of the user. For the motion, I process the video feed to detect movement and drive the particle system’s behavior. Users personalize their experience by customizing the color and size of the particles, which then playfully respond to motion. Initially, the particles are randomly positioned on the canvas; when they detect motion, they move toward it. Here, users see themselves as a mirrored reflection made of particles and interact with objects that respond to their motion, creating an immersive and interactive experience.
Progress:
I began designing the interface of the project and the basic interaction elements, which are the buttons. I had some bugs regarding when a button should appear and when it should disappear, so I had to reorganize the code to make it work better. As a result, I decided to make the size buttons () and the color buttons () arrays, which made it easier to apply them to the particle system as the project progressed. I added more functions to handle the buttons for each page: startbutton() and resetbutton(). For the main page buttons, I added a function to create them and another to remove them, and some buttons needed their own functions, such as the save button. After that, I added the particle system, which is inspired by the ASCII Text Image by The Coding Train. The particles are initially randomly placed and then move toward their target positions. Each particle’s color is based on brightness, and its size, between 2 and 7, is mapped from brightness so that darker ones are smaller and brighter ones are bigger. As for how the particles are drawn: I first load the video’s pixels, map the brightness of each pixel (which occupies four slots in the pixel array) from its RGB values, and then render it.
Additionally, I added another particle system that is interactive. The interactive particles visually represent the interaction between the user and the program. The particles’ movement is a direct response to user activity, like a feedback loop where the user influences the behavior of the particles in real time. This works through motion detection: when the video is captured, the pixel data is loaded into an RGBA array, and motion is detected by comparing the current frame’s pixel data with the previous frame’s. A threshold determines whether the brightness difference is large enough to count as motion. I made this particle system look like hearts because I wanted it to be visually distinct from the particles that reflect the user in the digital world.
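As a hedged illustration of the heart idea (the sketch’s actual show() code for the interactive particles is not reproduced here), the classic parametric heart curve yields points that could be passed to vertex() between beginShape() and endShape(CLOSE). The function name and parameters below are my own, not from the project:

```javascript
// Illustrative sketch, not the project's code: sample points along the
// classic parametric heart curve, scaled to a given size and centered at
// (cx, cy). The y-axis is flipped because canvas y grows downward.
function heartPoints(cx, cy, size, steps = 24) {
  const pts = [];
  for (let i = 0; i < steps; i++) {
    const t = (i / steps) * 2 * Math.PI;
    // parametric heart: x in [-16, 16], y roughly in [-17, 13]
    const x = 16 * Math.pow(Math.sin(t), 3);
    const y =
      13 * Math.cos(t) -
      5 * Math.cos(2 * t) -
      2 * Math.cos(3 * t) -
      Math.cos(4 * t);
    pts.push({ x: cx + (x * size) / 16, y: cy - (y * size) / 16 });
  }
  return pts;
}
```

Inside a p5.js show() method, each particle could call something like this with its own position and the user-selected size, drawing the returned points with vertex().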
let currentPage = "main";
//make the color and size buttons arrays so they are easier to use and manage
let sizeButtons = [];
let colorButtons = [];
// for interactive particles
let selectedColor = "white";
let selectedSize = 5;
// interactive particle
let interactiveParticles = [];
// for motion
let previousFrame;
//navigation buttons switch between states
let startButton;
let resetButton;
//Save Button
let saveButton;
//image load
let Img1;
//particle class array
let particles = [];
//video
let video;
function preload() {
Img1 = loadImage("/image1.png");
}
function setup() {
createCanvas(1200, 800);
//functions that handle the buttons in each page
setUpMainButtons();
//for vid
video = createCapture(VIDEO);
video.size(80, 60);
//for particles, generate a grid corresponding to the webcam pixels for better visualization
for (let y = 0; y < video.height; y++) {
for (let x = 0; x < video.width; x++) {
// scale for display
particles.push(new Particle(x * 15, y * 15));
}
}
// for interactive particles: array of particles placed randomly on the canvas
for (let i = 0; i < 400; i++) {
interactiveParticles.push(
new InteractiveParticle(random(width), random(height))
);
}
}
function draw() {
background(0);
video.loadPixels();
// update canvas depending on current page
if (currentPage === "main") {
drawMainPage();
} else if (currentPage === "page1") {
drawPage1();
drawParticleSystem();
}
}
//function for main page buttons
function setUpMainButtons() {
//Start experience button
startButton = createButton("Start");
startButton.size(150, 50);
startButton.style("font-size", "38px");
startButton.style("background-color", "rgb(250,100,190)");
startButton.position(width / 2 - 100, height / 2);
startButton.mousePressed(() => {
currentPage = "page1";
// call the remove function for main page to remove them
removeMainButtons();
//add the page 1 buttons function
setUpPage1Buttons();
});
// color buttons
colorButtons = [
createButton("Pinkish")
.style("background-color", "rgb(255,105,180)")
.mousePressed(() => (selectedColor = color(255, 105, 180))),
createButton("Blueish")
.style("background-color", "rgb(0,191,255)")
.mousePressed(() => (selectedColor = color(0, 191, 255))),
createButton("Greenish")
.style("background-color", "rgb(0,255,127)")
.mousePressed(() => (selectedColor = color(0, 255, 127))),
];
colorButtons[0].position(340, 300);
colorButtons[1].position(400, 300);
colorButtons[2].position(460, 300);
// size buttons
sizeButtons = [
createButton("Random")
.style("background-color", "rgb(205,165,200)")
.mousePressed(() => (selectedSize = random(2, 15))),
createButton("Large")
.style("background-color", "rgb(150,200,255)")
.mousePressed(() => (selectedSize = 15)),
createButton("Small")
.style("background-color", "rgb(100,150,227)")
.mousePressed(() => (selectedSize = 2)),
];
sizeButtons[0].position(610, 300);
sizeButtons[1].position(680, 300);
sizeButtons[2].position(730, 300);
}
// remove main page buttons
function removeMainButtons() {
startButton.remove();
for (let btn of colorButtons) btn.remove();
for (let btn of sizeButtons) btn.remove();
}
//function page 1 buttons
function setUpPage1Buttons() {
// save button
saveButton = createButton("Save Canvas");
saveButton.style("background-color", "rgb(100,150,227)").position(460, 10);
saveButton.mousePressed(saveCanvasImage);
// reset button
resetButton = createButton("Reset");
resetButton.style("background-color", "rgb(150,200,255)").position(590, 10);
resetButton.mousePressed(() => {
currentPage = "main";
// remove page 1 buttons
removePage1Buttons();
// add main page buttons instead
setUpMainButtons();
});
}
// remove page 1 buttons
function removePage1Buttons() {
saveButton.remove();
resetButton.remove();
}
// main page content
function drawMainPage() {
image(Img1, 0, 0, 1200, 800);
textFont("Courier New");
textSize(42);
fill(200, random(100, 250), 200);
text("Shadowing Presence", width / 2 - 260, height / 2 - 40);
// instruction text
textFont("Courier New");
textSize(16);
fill(255);
text("Welcome!", width / 2 - 50, height / 2 + 100);
text(
"Personalize your digital object by picking a color and ",
width / 2 - 260,
height / 2 + 120
);
text(
"a size. Enjoy the way these objects interact with your motion.",
width / 2 - 260,
height / 2 + 140
);
text("Press 'F' to toggle fullscreen", width / 2 - 260, height / 2 + 165);
}
// page 1 content
function drawPage1() {
// flip horizontally so that it feels like a mirror
translate(width, 0);
scale(-1, 1);
// process video for motion detection
video.loadPixels();
if (video.pixels.length > 0) {
if (!previousFrame) {
previousFrame = new Uint8Array(video.pixels);
}
let motionPixels = detectMotion(
video.pixels,
previousFrame,
video.width,
video.height
);
previousFrame = new Uint8Array(video.pixels);
// draw particles
for (let particle of interactiveParticles) {
particle.setColor(selectedColor);
particle.setSize(selectedSize);
particle.update(motionPixels, video.width, video.height);
particle.show();
}
}
}
// save image
function saveCanvasImage() {
saveCanvas("Image", "png");
}
// particle system
// to control the behavior and appearance of a particle system.
function drawParticleSystem() {
video.loadPixels();
// updating and drawing each particle based on the video feed.
for (let i = 0; i < particles.length; i++) {
// cal the x and y of the current particle by taking the modulus (x) and div (y) of the particle index and the video width and height. This way i can map the 1D particle array index to its 2D pixel position
const x = i % video.width;
const y = floor(i / video.width);
const pixelIndex = (x + y * video.width) * 4;
const r = video.pixels[pixelIndex + 0];
const g = video.pixels[pixelIndex + 1];
const b = video.pixels[pixelIndex + 2];
const brightness = (r + g + b) / 3;
particles[i].update(brightness);
particles[i].show();
}
}
function keyPressed() {
//press F for full screen
if (key === "F" || key === "f") {
// Check if 'F' is pressed
let fs = fullscreen();
fullscreen(!fs); // Toggle fullscreen mode
}
}
// update() method of the InteractiveParticle class (shown as an excerpt)
update(motionPixels, videoWidth, videoHeight) {
// detect motion and move towards it
let closestMotion = null;
let closestDist = Infinity;
//processes pixel data from a video feed to track motion and calculates the distance from position to nearest point of motion
for (let y = 0; y < videoHeight; y++) {
for (let x = 0; x < videoWidth; x++) {
let index = x + y * videoWidth;
if (motionPixels[index] > 0) {
let motionPos = createVector(
x * (width / videoWidth),
y * (height / videoHeight)
);
let distToMotion = p5.Vector.dist(this.pos, motionPos);
if (distToMotion < closestDist) {
closestDist = distToMotion;
closestMotion = motionPos;
}
}
}
}
// move towards closest motion
if (closestMotion) {
let dir = p5.Vector.sub(closestMotion, this.pos).normalize();
this.vel.add(dir).limit(12);
}
this.pos.add(this.vel);
// damping to slow down
this.vel.mult(0.8);
}
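The detectMotion() helper that this update method relies on is not shown in the excerpts. A minimal frame-differencing version, consistent with the description above (RGBA pixel arrays, per-pixel brightness, a threshold), might look like this; the threshold value is an assumption:

```javascript
// Hedged sketch of detectMotion(): compares per-pixel brightness of the
// current and previous RGBA frames and returns an array with 1 where the
// change exceeds a threshold, 0 elsewhere.
function detectMotion(currentPixels, previousPixels, videoWidth, videoHeight, threshold = 30) {
  const motion = new Array(videoWidth * videoHeight).fill(0);
  for (let i = 0; i < videoWidth * videoHeight; i++) {
    const p = i * 4; // each pixel occupies 4 slots (R, G, B, A)
    const currBright =
      (currentPixels[p] + currentPixels[p + 1] + currentPixels[p + 2]) / 3;
    const prevBright =
      (previousPixels[p] + previousPixels[p + 1] + previousPixels[p + 2]) / 3;
    if (Math.abs(currBright - prevBright) > threshold) {
      motion[i] = 1; // motion detected at this pixel
    }
  }
  return motion;
}
```

The update method then scans this array and steers each particle toward the nearest index whose value is greater than 0.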
I did user testing with 3 people. Based on the feedback, I decided to change two main things: 1) the shape of the interactive particle, to a heart, to make it more distinguishable, and 2) the placement of the customization buttons, so that users can see and click them.
There is a lot of improvement that could go into the code, mainly in the interactive particles. If I had more time, I would have added customization for what the object looks like: a heart, a star, a smiley face, a triangle, a number, etc. Additionally, I would give the interactive particle system agency of its own by adding gravity and an attraction force between the particles, and maybe even make them attract to or steer away from the user depending on the nature of the motion detected.
This project is a journey of exploring the concept of digital presence. The idea of the project is being physically part of the work. To do this, I explore particle system interaction with the human body through a web camera where both the element of interaction and the body are made of particles. This project is inspired by Zach Lieberman’s work, which focuses on making digital interactive environments that invite participants to become performers.
Progress:
I initially began designing the interface of the project and the basic interaction elements, which are the buttons. I had some bugs regarding when a button should appear and when it should disappear, so I had to reorganize the code to make it work better. As a result, I decided to make the size buttons () and the color buttons () arrays, which made it easier to apply them to the particle system as the project progressed. I added more functions to handle the buttons for each page: startbutton() and resetbutton(). For the main page buttons, I added a function to create them and another to remove them, and some buttons needed their own functions, such as the save button. After that, I added the particle system, which is inspired by the ASCII Text Image by The Coding Train. The particles are initially randomly placed and then move toward their target positions. Each particle’s color is based on brightness, and its size, between 2 and 7, is mapped from brightness so that darker ones are smaller and brighter ones are bigger. As for how the particles are drawn: I first load the video’s pixels, map the brightness of each pixel (which occupies four slots in the pixel array) from its RGB values, and then render it.
// particle system
function drawParticleSystem() {
video.loadPixels();
for (let i = 0; i < particles.length; i++) {
const x = i % video.width;
const y = floor(i / video.width);
const pixelIndex = (x + y * video.width) * 4;
const r = video.pixels[pixelIndex + 0];
const g = video.pixels[pixelIndex + 1];
const b = video.pixels[pixelIndex + 2];
const brightness = (r + g + b) / 3;
particles[i].update(brightness);
particles[i].show();
}
}
// particle class
class Particle {
constructor(x, y) {
this.pos = createVector(random(width), random(height));
this.target = createVector(x, y);
this.vel = p5.Vector.random2D().mult(3);
//size and color of particles
this.size = 2;
this.color = color(255);
}
update(brightness) {
// https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Conditional_operator
// conditional (ternary) operator:
// if the brightness value is less than 130 (a darker region), the particle's color is set to black; otherwise white
this.color = brightness < 130 ? color(0) : color(255);
// Smooth return to target position
let dir = p5.Vector.sub(this.target, this.pos);
this.vel.add(dir.mult(0.09));
this.vel.limit(2);
this.pos.add(this.vel);
// Adjust particle size dynamically
this.size = map(brightness, 0, 255, 2, 7);
}
show() {
noStroke();
fill(this.color);
circle(this.pos.x, this.pos.y, this.size / 2);
}
}
Sketch:
Future Work:
There is still a lot of work to be done for the final project. I want (1) to work more on the interface of the project in terms of design. Additionally, I want (2) to add another particle system, which will be the second body that interacts with the body in the digital world. These particles will take the color and the size the user picks before starting the experience. These particles will be able to detect motion in the video and follow it.
In this project, I wanted to explore the dynamic between digital presence and absence. What does it mean to be in the digital world digitally? Does it mean you are physically there, or does it mean you are participating in it? This project explores the idea of being physically part of the work. To do this, I explore particle system interaction with the human body through a web camera. The particle system acts as a responsive object orbiting and/or avoiding the participant. I think for the whole semester I have been interested in the particle system and their behavior and wanted to do more with them. As a result, I am bringing it back and building on its concepts for this final project.
Interaction methodology:
The sketch has two stages: the initial stage is customizing the environment, and the second is experiencing it. I want the interface to be simple, so by clicking the buttons the audience can pick the colors of the particles: shades of red, shades of blue, or shades of green, also they can choose the size of the particles they want to interact with: random, small, or large. Then, they click start to begin the experience, where they can also click buttons to reset it or save an image.
Design of canvas interaction and buttons:
Base Sketch:
There is a lot to work on in terms of the interface and functionality of the whole experience. I need to figure out how to make the buttons in the initial state disappear while the others appear. I also need to write the color function and the logic for how the particles take the desired color and shape, and finally make the webcam work the way I want it to.
Throughout the semester, my coding skills have improved. Looking back at my first assignment and this one, I see a lot of improvement in making my ideas come to life. For this week’s project, I used a one-dimensional Cellular Automaton to create a fabric-like pattern. My project is an interactive cellular automaton inspired by the RGB Elementary Cellular Automaton project. The randomized ruleset and adjustable cell size let users visualize infinite visual possibilities. In this project, each layer of cells builds on the previous one to create a unique pattern.
Highlight and Progress:
I began this journey by searching for inspiration because I felt the algorithmic design of the cellular automaton imposed a lot of constraints. I started experimenting in hopes of creating a system where the color of the cells with a state of 1 would change around the mouse; however, I faced so many issues that the system would stop working all the time. I tried to fix it, but I had limited time to finish this assignment. As a result, I began experimenting with different shapes and sizes and even tried to create a new ruleset each time the mouse was pressed.
Throughout the process, I kept improving the project by adding layers to it; as you can see from the interface, I initially wanted it to feel like a website. What I like most about the project is the visualization: it is interesting and engaging to see and experiment with. I am proud that I was able to make the ruleset changeable. This is done by pressing the mouse. When the mouse is pressed, a for() loop goes through each index of the cells and sets the value to 0, resetting all cells; for each cell, x and y positions are calculated with trigonometric functions to draw the decorative circles. After resetting the grid, the center cell is set to 1, making it the active starting point for the new pattern.
// start at generation 0
let generation = 0;
// cell size
let w = 3;
let slider;
//state
let isComplete = false;
// starting rule
let ruleset = [0, 1, 1, 0, 1, 0, 1, 1];
function setup() {
createCanvas(windowWidth, windowHeight);
background(255);
textFont("Arial");
// make slider to change cell size
slider = createSlider(2, 20, w);
slider.position(25, 25);
slider.style("width", "100px");
// save button
saveButton = createButton("Save Canvas");
saveButton.position(160, 5);
saveButton.mousePressed(saveCanvasImage);
// reset button
resetButton = createButton("Reset");
resetButton.position(290, 5); // Place it next to the save button
resetButton.mousePressed(resetCanvas);
//array of 0s and 1s
cells = new Array(floor(width / w));
for (let i = 0; i < cells.length; i++) {
cells[i] = 0;
}
cells[floor(cells.length / 2)] = 1;
}
function draw() {
//slider w value
w = slider.value();
fill(240);
stroke(0);
// for slider text
rect(25, 5, 104, 20, 2);
fill(0);
stroke(255);
textSize(13);
textFont("Arial");
text("Adjust Cell Size:", 30, 20);
// for resetting ruleset text
fill(240);
stroke(0);
rect(385, 5, 240, 20, 2);
fill(0);
stroke(255);
textSize(13);
textFont("Arial");
text("Press mouse to generate a new pattern", 390, 20);
for (let i = 1; i < cells.length - 1; i++) {
//drawing the cells with a state of 1
if (cells[i] == 1) {
noFill();
stroke(random(180), random(180), random(250));
//y-position according to the generation
square(i * w, generation * w, random(w));
}
}
//next generation.
let nextgen = cells.slice();
for (let i = 1; i < cells.length - 1; i++) {
let left = cells[i - 1];
let me = cells[i];
let right = cells[i + 1];
nextgen[i] = rules(left, me, right);
}
cells = nextgen;
// gen + 1
generation++;
// stop when it reaches bottom of the canvas
if (generation * w > height) {
noLoop();
isComplete = true;
}
}
// a new state from the ruleset.
function rules(a, b, c) {
let s = "" + a + b + c;
let index = parseInt(s, 2);
return ruleset[7 - index];
}
// mouse is pressed
function mousePressed() {
// make new ruleset with random 0s and 1s
ruleset = [];
for (let i = 0; i < 8; i++) {
ruleset.push(floor(random(2))); // random 0 or 1
// fill(255);
// square(i *w, generation * w, w);
}
//https://p5js.org/reference/p5/console/ for debugging
console.log("New Ruleset: ", ruleset);
// restart with the new ruleset
cells = new Array(floor(width / w));
for (let i = 0; i < cells.length; i++) {
cells[i] = 0;
noStroke();
let x1 = 5 * i * cos(i);
let y1 = i * sin(i);
fill(205, 220, random(90, 155), 10);
circle(x1, y1, w * random(150));
}
cells[floor(cells.length / 2)] = 1;
generation = 0;
loop();
}
// save image
function saveCanvasImage() {
saveCanvas("cellularAutomata", "png");
}
// restart
function resetCanvas() {
background(255);
cells = new Array(floor(width / w));
for (let i = 0; i < cells.length; i++) {
cells[i] = 0;
}
cells[floor(cells.length / 2)] = 1;
generation = 0;
loop();
}
Embedded Sketch:
Future Work:
To improve this project in the future, it would be interesting to see how with a web camera, audiences can interrupt the pattern by making noise. Additionally, adding a system to choose a set of colors and maybe a layout for the different patterns would also be interesting to visualize. Improving this project to become a website where people can experiment with the endless possibilities using Cellular Automaton in terms of shape, size, color, and layout would be interesting to see. Perhaps this project would evolve into an educational tool, teaching concepts behind Cellular Automaton and mathematical systems while reflecting the intersection of art, technology, and science in an interactive format.
Resources:
Wolfram, Stephen. “A New Kind of Science.” Wolfram Science, www.wolframscience.com/gallery-of-art/RGB-Elementary-Cellular-Automaton.html.
For this project, I was inspired by the Nature of Code book examples, especially the ones about attraction forces and movers. I decided to create a particle system that is affected by attraction and repulsion forces. Through this class, I realized I am obsessed with particles and small shapes working together to create different artworks. I imagined that the particles would slowly get attracted to an attractor but then be repulsed quickly, creating a dynamic of contradiction.
Highlight and Process:
The Matter.js library is very new to me. Because it is a physics-based engine, it has its own approach to defining bodies, forces, and interactions, so it felt unfamiliar at first. I had to take some time to understand the syntax, and more time to explore its possibilities. It is a very interesting library to delve into, but I faced many limitations in terms of learning the library and its implementation: I did not have enough time to explore it deeply enough to comfortably use its full potential in this assignment.
For the sketch, I had an Attractor class and a Mover class. The main functionality of the Attractor class is to pull the movers toward a circular body via forces. Within the Attractor class there is a setPosition function, which moves the attractor to a new position when the mouse is pressed. The Mover class creates a bouncy object made of 4 circles that is affected by air resistance. Making the movers respond to forces the way I was imagining was challenging, as I had to play around with a lot of the parameters. In terms of interactivity, the attractor moves to wherever the mouse is, and the movers follow. Additionally, I made a slider to increase or decrease the strength of the force toward the attractor.
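As a sketch of the attraction idea, here is a minimal, plain-JavaScript version of an attract() force using the usual Nature-of-Code distance clamp. The names, constants, and clamp bounds are illustrative assumptions, not the project’s actual Matter.js code:

```javascript
// Illustrative sketch (hypothetical names): a gravitational-style pull
// whose strength falls off with distance squared, with the squared distance
// clamped so the force neither explodes up close nor vanishes far away.
function attract(attractorPos, moverPos, G = 0.05, moverMass = 1) {
  const dx = attractorPos.x - moverPos.x;
  const dy = attractorPos.y - moverPos.y;
  let dSq = dx * dx + dy * dy;
  dSq = Math.min(Math.max(dSq, 25), 2500); // clamp to [5^2, 50^2]
  const d = Math.sqrt(dx * dx + dy * dy) || 1; // avoid divide-by-zero
  const strength = (G * moverMass) / dSq;
  // unit direction toward the attractor, scaled by the strength
  return { x: (dx / d) * strength, y: (dy / d) * strength };
}
```

In a Matter.js sketch, a force like this would be passed to Body.applyForce on each mover every frame, with G driven by the slider.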
// inspo http://natureofcode.com
const { Engine, Bodies, Body, Composite, Vector } = Matter;
// A Mover and an Attractor
let movers = [];
let attractor;
let engine;
function setup() {
createCanvas(700, 650);
// matter engine
engine = Engine.create();
engine.gravity = Vector.create(0, 0);
// slider to control the gravitational constant, to change the attraction strength
gravitySlider = createSlider(0.001, 1, 0.05, 0.001);
gravitySlider.position(20, height - 40);
gravitySlider.style("width", "90px");
for (let i = 0; i < 800; i++) {
movers[i] = new Mover(random(width), random(height), random(0.01, 0.5));
}
attractor = new Attractor(width / 2, height / 2);
// attractor[2] = new Attractor(width /8 , height / 8, 50);
//attractor[3] = new Attractor(width , height , 50);
}
function draw() {
background(0, 60);
Engine.update(engine);
// show slider on canvas
// Display the current value of the slider
fill(255);
stroke(255);
textFont('Courier');
textSize(20);
text("Change strength " + gravitySlider.value().toFixed(5), 25, height - 60);
// if mouse is pressed, update the attractor position to follow the mouse
if (mouseIsPressed) {
attractor.setPosition(mouseX, mouseY);
}
// Apply the attractor force to all movers
for (let mover of movers) {
let force = attractor.attract(mover);
mover.applyForce(force);
mover.show();
}
//attractor.show();
}
Embedded Sketch:
Future Work:
In terms of this project, a lot can be improved: for instance, adding more attractors, making the movers more dynamic, and maybe having them react to one another, as well as adding more forces and constraints to the sketch. There is a lot to learn and discover about Matter.js and other libraries. I think exploring and experimenting more is the best way to learn.
For this project, I decided to make a visualization inspired by tension and release, with an interactive element as the mouse hovers around the canvas. Initially, the flock is strictly constructed around an attractor in a circular shape, but over time it changes and moves around the canvas. The mouse interaction lets the boids transform in shape and size, adding layers to the sketch. The shape of the boid is inspired by kite-like crystal shapes in a shade of light, transparent blue, symbolizing shattered glass. Rendering the sketch this way makes sense to me because it shows fragility and the boids’ collective movement while each remains individual, mirroring how sensitively one boid’s movement influences the others.
Highlight of Code:
For this project, I reviewed the presentation and watched the videos to understand the concept of flocking. I wanted to bring into the sketch things we learned prior to the class, and I decided that having an attractor would be interesting. I initially started playing around with the code we did in class. I played a lot with different classes and ways to make it work. I added three main classes: an attractor, a Boid, and a flock. Even though I had so much debugging to do, I think I am close enough to the visualization I had in my head.
The Attractor class pulls the boids toward it through a force, similar to orbiting. It was hard to figure out how to make it work, but as I played with the numbers, it started making sense. The orbiting effect comes from rotating the attraction force. The Boid class is affected by both flocking and attraction, which helps the boids move in groups while responding to the attraction force. The methods in the Boid class are applyForce, flock, applyAttraction, update, borders, and show.
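The "rotating the force" trick can be sketched in plain JavaScript: compute the pull toward the attractor, then rotate it a quarter turn so boids swirl around the center instead of collapsing onto it. Everything here (the names, the 90° angle, the strength constant) is an illustrative assumption, not the project’s exact code:

```javascript
// Illustrative sketch: an attraction force rotated toward the tangent
// so that boids orbit the attractor rather than falling straight into it.
function orbitForce(attractor, boid, strength = 0.05) {
  // vector from the boid to the attractor
  const dx = attractor.x - boid.x;
  const dy = attractor.y - boid.y;
  const d = Math.max(Math.sqrt(dx * dx + dy * dy), 1); // avoid divide-by-zero
  // normalized pull toward the attractor, scaled by strength
  const fx = (dx / d) * strength;
  const fy = (dy / d) * strength;
  // rotate the force 90 degrees so the pull becomes a swirl
  const angle = Math.PI / 2;
  return {
    x: fx * Math.cos(angle) - fy * Math.sin(angle),
    y: fx * Math.sin(angle) + fy * Math.cos(angle),
  };
}
```

In p5.js the same rotation could be done with p5.Vector's rotate() on the force before applying it to the boid.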
for (let i = 0; i < 400; i++) {
//changing this part effects how the overall visulaization is
let angle = map(i, 0, 400, 0, TWO_PI);
let boid = new Boid(
width/1.7 + radious * cos(angle),
height/1.7 + radious * sin(angle)
);
flock.addBoid(boid);
}
The above code is the main driver of the visualization. It took me some time to create a visualization I thought was interesting; it was inspired by my midterm from this semester.
class Boid {
constructor(x, y) {
this.acceleration = createVector(0, 0);
this.velocity = createVector(random(-0.01, 0.001), random(-0.01, 0.01));
this.position = createVector(x, y);
this.r = 3;
//mass
this.mass = 2;
// Maximum speed
this.maxSpeed = 0.7;
// Maximum steering force
this.maxForce = 5;
}
applyForce(force) {
let f = p5.Vector.div(force, this.mass);
//changed how it moves; it looks nicer
this.acceleration.sub(f);
this.acceleration.mult(10);
this.acceleration.mult(f);
}
run(boids, attractor) {
this.flock(boids);
this.applyAttraction(attractor);
this.update();
this.borders();
this.show();
}
applyAttraction(attractor) {
//pull boids
attractor.attract(this);
}
// accumulate acceleration each time based on three rules
flock(boids) {
let sep = this.separate(boids); // Separation
let ali = this.align(boids); // Alignment
let coh = this.cohere(boids); // Cohesion
// Arbitrarily weight these forces
sep.mult(1);
ali.mult(1);
coh.mult(1);
// Add the force vectors to acceleration
this.applyForce(sep);
this.applyForce(ali);
this.applyForce(coh);
}
// update location
update() {
// velocity
this.velocity.add(this.acceleration);
// limit speed
this.velocity.limit(this.maxSpeed);
this.position.add(this.velocity);
// reset acceleration to 0 each cycle
this.acceleration.mult(0);
}
// calculate and apply a steering force towards a target
// STEER = DESIRED MINUS VELOCITY
seek(target) {
// A vector pointing from the location to the target
let desired = p5.Vector.sub(target, this.position);
// Normalize desired and scale to maximum speed
desired.normalize();
desired.mult(this.maxSpeed);
// Steering = Desired minus Velocity
let steer = p5.Vector.sub(desired, this.velocity);
steer.limit(this.maxForce); // Limit to maximum steering force
return steer;
}
show() {
// draw a kite shape
let angle = this.velocity.heading();
fill(random(90, 127), random(150, 200), random(180, 255), 150);
stroke(random(100, 127), random(100, 200), random(200, 255));
push();
translate(this.position.x, this.position.y);
rotate(angle);
let freqX = map(mouseX, 0, width, 1, 15);
let freqY = map(mouseY, 0, height, 1, 15);
beginShape();
let x = this.r * cos(10 * this.r) * freqX;
let y = this.r * cos(10 * this.r) * freqY;
//crystal like shape
//right
vertex(x * 2, 0);
//top right
vertex(x * 0.7, -y);
//top left
vertex(-x * 0.7, -y);
//left
vertex(-x * 2, 0);
//bottom
vertex(0, y);
endShape(CLOSE);
pop();
}
// wraparound
borders() {
if (this.position.x < -this.r) this.position.x = width + this.r;
if (this.position.y < -this.r) this.position.y = height + this.r;
if (this.position.x > width + this.r) this.position.x = -this.r;
if (this.position.y > height + this.r) this.position.y = -this.r;
}
// separation: check for nearby boids and steer away
separate(boids) {
let desiredSeparation = 15;
let steer = createVector(0, 0);
let count = 0;
// for every boid in the system, check if it's too close
for (let i = 0; i < boids.length; i++) {
let d = p5.Vector.dist(this.position, boids[i].position);
// if the distance is greater than 0 and less than an arbitrary amount (d is 0 when a boid compares itself)
if (d > 0 && d < desiredSeparation) {
// calculate vector pointing away from neighbor
let diff = p5.Vector.sub(this.position, boids[i].position);
diff.normalize();
// weight by distance
diff.div(d);
steer.add(diff);
// keep track of how many
count++;
}
}
// average -- divide by how many
if (count > 0) {
steer.div(count);
}
// As long as the vector is greater than 0
if (steer.mag() > 0) {
// Implement Reynolds: Steering = Desired - Velocity
steer.normalize();
steer.mult(this.maxSpeed);
steer.sub(this.velocity);
steer.limit(this.maxForce);
}
return steer;
}
// Alignment
// For every nearby boid in the system, calculate the average velocity
align(boids) {
let neighborDistance = 40;
let sum = createVector(0, 0);
let count = 0;
for (let i = 0; i < boids.length; i++) {
let d = p5.Vector.dist(this.position, boids[i].position);
if (d > 0 && d < neighborDistance) {
sum.add(boids[i].velocity);
count++;
}
}
if (count > 0) {
sum.div(count);
sum.normalize();
sum.mult(this.maxSpeed);
let steer = p5.Vector.sub(sum, this.velocity);
steer.limit(this.maxForce);
return steer;
} else {
return createVector(0, 0);
}
}
// Cohesion
// For the average location (i.e. center) of all nearby boids, calculate steering vector towards that location
cohere(boids) {
let neighborDistance = 40;
let sum = createVector(0, 0); // Start with empty vector to accumulate all locations
let count = 0;
for (let i = 0; i < boids.length; i++) {
let d = p5.Vector.dist(this.position, boids[i].position);
if (d > 0 && d < neighborDistance) {
sum.add(boids[i].position); // Add location
count++;
}
}
if (count > 0) {
sum.div(count);
return this.seek(sum); // Steer towards the location
} else {
return createVector(0, 0);
}
}
}
The Boid class was the most complicated part. I had to do a lot of debugging to bring the attractor into play so that the boids are properly affected by it.
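The Attractor class itself is not shown here, so below is a rough plain-JavaScript sketch of the kind of force its attract() call could apply to a boid. The function name, gravity constant, and distance clamp are my assumptions for illustration, not the project's actual code; the real sketch would use p5.Vector throughout.

```javascript
// Hypothetical stand-in for the Attractor's force calculation.
// Plain objects instead of p5.Vector keep the math explicit.
function attractForce(attractorPos, attractorMass, boid, G = 1) {
  // direction from the boid toward the attractor
  const dx = attractorPos.x - boid.position.x;
  const dy = attractorPos.y - boid.position.y;
  const len = Math.sqrt(dx * dx + dy * dy) || 1;
  // clamp the distance so the force stays finite near the attractor
  const d = Math.min(Math.max(len, 5), 25);
  // gravitational-style magnitude: G * m1 * m2 / d^2
  const strength = (G * attractorMass * boid.mass) / (d * d);
  // unit direction scaled by strength
  return { x: (dx / len) * strength, y: (dy / len) * strength };
}
```

The returned force would then be fed into the boid's applyForce(), where the mass division and the intentional sub/mult quirks take over.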
Embedded Sketch:
Future work:
I am satisfied with how the sketch turned out. One thing I want to experiment with further is how the flock moves around the canvas: creating a path for the boids to follow so they form different shapes and interactions driven by tension and release. I also think syncing audio into the work would be interesting; the audio could respond to the acceleration and speed of the flock's movement. Finally, if this were translated into an interactive installation, I could add a motion sensor so that the audience influences the behavior of the flock through that data.
For this project, I wanted to explore a difficult reality in my life, reimagining it in a way that feels more approachable and symbolic. For the past year and a half I have been living through the most painful reality of my life: the loss of my family's home and loved ones to the brutality of war. It is a huge wound that I carry with me every day, yet I try to keep moving. Initially, I considered expressing this darkness without censorship, using a visual of rockets aimed toward a target with an explosive effect, but I decided on a subtler approach that relies on symbolism. To bring the concept together, I used warm, fiery colors to create a cohesive and evocative experience.
Highlights:
I began by reviewing the slides, reading the chapter, and watching the Coding Train videos to refresh my memory and better understand autonomous agents. I then looked at the code we wrote in class and began to experiment with it. I decided to focus mainly on seek and flee rather than pursue and evade because I thought they worked best with my idea.
I created an array of 3 targets spaced at equal distances from one another along a diagonal, and another array of 23 vehicles, keeping the 0 index in mind. To divide the vehicles among their targets, I computed a target index by dividing 'i' by 8 (the number of vehicles each target gets) and passed the target at that index into each vehicle. The program defaults to seeking a target; when the mouse is pressed, the vehicles flee instead. I also added some transparency to the background, enhancing the flow of the points and the overall aesthetics. The vehicle class is similar to the flee and seek examples from class; I changed them a little to work best for my visualization. For the flee functionality, I did not want the points to leave the canvas, so I created conditions to check the x and y edges of the canvas.
let vehicles = [];
let targets = [];
function setup() {
createCanvas(650, 650);
// Initialize 3 main targets and place them near each other with equal distance
targets.push(createVector(width * 0.25, height / 2.5));
targets.push(createVector(width * 0.5, height / 2));
targets.push(createVector(width * 0.75, height / 1.5));
// Create an array of vehicles that are grouped to each target
for (let i = 0; i < 23; i++) {
// every 8 vehicles share the same target: i = 0-7 -> target 0, 8-15 -> target 1, 16-22 -> target 2
let targetIndex = floor(i / 8);
// each vehicle has a random initial position and a target
let vehicle = new Vehicle(random(width), random(height), targets[targetIndex]);
vehicles.push(vehicle);
}
}
function draw() {
background(0,15);
for (let target of targets) {
// draw each target as a circle for now
// yellowish, flame-like color
fill(255,232,8);
stroke(255,154,0);
strokeWeight(2);
circle(target.x, target.y, 10);
}
// Update and show vehicles
for (let vehicle of vehicles) {
if (mouseIsPressed){
vehicle.flee(vehicle.target);
vehicle.update();
vehicle.show();
}else{
vehicle.seek(vehicle.target);
vehicle.update();
vehicle.show();
}
}
}
class Vehicle {
constructor(x, y, target) {
// x & y are the initial position of the vehicle
this.position = createVector(x, y);
this.velocity = createVector(0, 0);
this.acceleration = createVector(0, 0);
this.r = 10;
this.maxspeed = 4;
this.maxforce = 0.3;
// (idea) buffer to move away from the target once reached
//this.buffer = 100;
// the target (one of 3) this vehicle will seek
this.target = target;
// // change the size as an indicator of whether it is moving away from or toward the target
// this.size= 15;
}
update() {
this.velocity.add(this.acceleration);
this.velocity.limit(this.maxspeed);
this.position.add(this.velocity);
//check edges for flee x
if (this.position.x > width) {
this.position.x = 0;
} else if (this.position.x < 0) {
this.position.x = width;
}
//check edges for flee y
if (this.position.y > height) {
this.position.y = 0;
} else if (this.position.y < 0) {
this.position.y = height;
}
// reset acc
this.acceleration.mult(0);
}
applyForce(force) {
// affects speed and direction
this.acceleration.add(force);
}
seek(target) {
let desired = p5.Vector.sub(target, this.position);
desired.setMag(this.maxspeed);
let steer = p5.Vector.sub(desired, this.velocity);
steer.limit(this.maxforce);
this.applyForce(steer);
}
flee(target) {
// Flee away when reaching buffer
let desired = p5.Vector.sub(this.position, target); // Reverse direction
desired.setMag(this.maxspeed);
let steer = p5.Vector.sub(desired, this.velocity);
steer.limit(this.maxforce);
this.applyForce(steer);
}
show() {
// rotate in the direction of motion
let angle = this.velocity.heading();
fill(255, random(0, 100), 0);
stroke(255, random(0, 100), random(0, 8));
strokeWeight(5);
push();
translate(this.position.x, this.position.y);
rotate(angle);
// arrow-like triangle pointing along the velocity
// (previously point() calls, which do not add vertices inside beginShape())
beginShape();
vertex(-this.r * 4, -this.r * 2);
vertex(this.r * 4, 0);
vertex(-this.r * 4, this.r * 2);
endShape(CLOSE);
pop();
}
}
Sketch:
Future work:
I think there is a lot that can be done with this code, like adding more vehicles and targets and experimenting with the different shapes and styles that could be implemented. It would also be nice to give users more agency over the program, letting them play around more and interrupt the decisions of the system. I would also like to try pursue and evade, where a buffer area around the target determines whether the vehicles evade or pursue it.
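As a sketch of that pursue/evade idea: the vehicle could aim at the target's predicted future position and flip to evading once it enters the buffer radius. The helper names and the lookahead constant below are hypothetical, not part of the project.

```javascript
// pursue aims at where the target will be, not where it is now
function predictedTarget(targetPos, targetVel, lookahead = 10) {
  return {
    x: targetPos.x + targetVel.x * lookahead,
    y: targetPos.y + targetVel.y * lookahead,
  };
}

// inside the buffer radius the vehicle evades; outside it, it pursues
function steerMode(vehiclePos, targetPos, buffer = 100) {
  const dx = targetPos.x - vehiclePos.x;
  const dy = targetPos.y - vehiclePos.y;
  return Math.sqrt(dx * dx + dy * dy) < buffer ? "evade" : "pursue";
}
```

In the draw loop, steerMode() would pick between the existing seek() and flee(), with predictedTarget() as the aim point instead of the raw target.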
Resources:
The Coding Train. “5.2 Seeking A Target – the Nature of Code.” YouTube, 16 June 2021, www.youtube.com/watch?v=p1Ws1ZhG36g.
Listening to the artists talk about MUJO gave me another perspective on their work. The artist talk was helpful in two ways: 1) it got me thinking about the work process from a professional perspective, not just as projects for classes, and gave me insight into how the workflow goes and how exciting it can be; 2) it opened my eyes to the artwork itself, the meaning behind each part, and how it all comes together to support and push a concept; each part harmonizes to drive the concept forward. Additionally, the installation got me thinking about how concepts such as assembling and disassembling can be symbolically expressed through dance, sound, the desert, and the sand.
Further, the curation of the project was interesting to me because it was site-specific and brought together different ideas that all worked well toward the concept. I think artists often have to make decisions to change how parts of an artwork are presented to adapt to the atmosphere and environment surrounding it.
For this project, I wanted to use what we learned in data visualization and particle systems to visualize Palestinian folklore. I want to visualize sound waves through the lens of Palestinian embroidery.
The song I picked has an Arabic version and an English remix. Palestinian folklore is known as one of the most trusted tools for recording events and phenomena. Historically, it is women who sing these songs; they are a mirror of history and society and a contribution to crystallizing the higher goals of Palestinian society. For this project, I wanted to use Tarweeda (a type of Palestinian folklore): slow singing governed by a specific rhythm that depends primarily on recitation, a melodious recitation that carries its own music. This type of music became vibrant in Palestine during the Ottoman occupation, when women would sing these songs after encoding the words, delivering secret messages to each other and to resistance fighters. The encryption works by adding 'lam' to the ends or final syllables of words. Through this lens, I wanted to combine the traditional visual language of Palestinian embroidery with Palestinian Tarweeda in an attempt to visualize its rhythms.
For the visualization aspect of the project, I looked into Palestinian embroidery, which usually consists of geometric patterns of shapes, flowers, and lines. The most famous embroidery colors are red threads on black fabric.
Code and Process:
I have been inspired to work with sound waves for a while and have been watching videos of how sound-wave data can be visualized. Initially, it seemed very complex to me; however, as I watched more videos, I got the gist of it. I had to think a few things through before taking this step because I was not certain I could do it. As a first step, I checked whether p5.js can visualize sound data. I found p5.FFT (Fast Fourier Transform), an analysis algorithm that can isolate audio frequencies within a waveform. Its waveform() method returns an array of amplitude values between -1 and 1, each representing the amplitude at a specific moment in time; analyze() computes the amplitude along the frequency domain, from low to high, as values between 0 and 255; and getEnergy() measures the amplitude at specific frequencies or frequency ranges.
I began by preloading a small part of the song into p5.js; I tested the given p5.js example before doing my own iteration because the example sketch was confusing at first. I then created a variable that holds the p5.FFT object and a wave-data variable for the waveform. Then, over an array of index numbers, I mapped the wave data for visual purposes and drew it as a shape similar to what we did in class. Once this part was working, I added a particle-system class that responds to the sound. To do this, I analyzed the FFT data and read its amplitude to find the numbers I needed to pass into getEnergy() so the particles respond properly. This part took some time because I had to read a little more about sound-wave data and how to make it responsive.
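The index-mapping step described above can be isolated into a small helper. This is a plain-JavaScript re-implementation of p5's map() and floor() for illustration (mapRange and waveIndex are my own names, not the sketch's): an angle from 0 to 180 degrees is spread across the whole waveform buffer.

```javascript
// stand-in for p5's map(): linearly rescale v from one range to another
function mapRange(v, inMin, inMax, outMin, outMax) {
  return outMin + ((v - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// spread angles 0..180 across the waveform array of the given length
function waveIndex(angle, waveLength) {
  return Math.floor(mapRange(angle, 0, 180, 0, waveLength - 1));
}
```

Each vertex of the shape then reads waveData[waveIndex(i, waveData.length)] and maps that sample into a radius.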
After everything was working, I began experimenting with the shape and how I wanted it to look. My goal was to make it look similar to the embroidery patterns, which are usually floral. I realized that I needed to take the sine of i for both x and y; initially I was using cos(i) for x and sin(i) for y. Then I added interaction by creating frequency values for x and y, mapping them from the mouse position, and folding them into the x and y formulas for the shape.
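In isolation, that point formula looks like the sketch below. It converts degrees to radians by hand, since the p5 sketch runs with angleMode(DEGREES); the function name is mine, and the parameters mirror the sketch's r1, freqX, freqY, and t.

```javascript
// one point of the embroidery-like rose curve, in degree-based trig
function rosePoint(i, r, freqX, freqY, t) {
  const rad = (deg) => (deg * Math.PI) / 180;
  // the sine of i drives both axes, which keeps the petals symmetric
  const x = r * Math.sin(rad(15 * i)) * t * Math.cos(rad(i * freqX));
  const y = r * Math.sin(rad(15 * i)) * Math.sin(rad(i * freqY));
  return { x, y };
}
```

In the real sketch, r comes from the mapped waveform sample, so the petals breathe with the song's amplitude.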
Highlight:
let song;
let currentSong = 1; // tracks which version of the song is playing
let song1;
let hasPlayed = false;
let hasPlayed1 = false;
let fft; //this is to generate the wave form Fast Fourier Transform
let r1;
// For printing: https://gokcetaskan.com/artofcode/high-quality-export
// According to this, the size of my full-screen sketch is good for high quality,
// but the particles were not printing. I first tried drawing them into a buffer,
// then realized the save condition ran before the particles were drawn.
let particles = [];
//button to start sketch
let button;
let started = false;
function preload() {
song = loadSound("/sound.mp3");
song1 = loadSound("/sound1.mp3");
}
function setup() {
createCanvas(windowWidth, windowHeight);
// createCanvas(windowWidth, windowHeight, SVG);
angleMode(DEGREES);
fft = new p5.FFT();
// button
button = createButton("Start");
button.position(width / 2, 530);
button.mousePressed(startSketch);
}
function draw() {
background(0);
//added this before translate to position correctly
if (!started) {
// set the text style before drawing (originally set after, so the first frame was unstyled)
fill(255);
textSize(15);
text("1- Move the cursor to change the shape.", width / 2 - 130, 450);
text("2- Press A or a to listen to the Arabic version", width / 2 - 130, 470);
text("3- Press E or e to listen to the English Remix", width / 2 - 130, 490);
text("4- Press R or r to Reset & S or s to Save", width / 2 - 130, 510);
}
translate(width / 2, height / 2);
// particles respond to frequencies
fft.analyze();
let amp = fft.getEnergy(20, 200); // energy in the 20-200 Hz (low-frequency) range
// shape
let waveData = fft.waveform();
if (started) {
for (let t = -1; t <= 1; t += 0.09) {
let freqX = map(mouseX, 0, width, 1, 10);
let freqY = map(mouseY, 0, height, 1, 10);
let offSet = map(mouseX, 0, width, 0, 150);
beginShape();
for (let i = 0; i <= 180; i++) { // the map() below covers 0-180 degrees
stroke(150, 0, 0);
fill(0);
strokeWeight(1);
let index = floor(map(i, 0, 180, 0, waveData.length - 1));
r1 = map(waveData[index], -1, 1, 100, 300);
let x1 = r1 * sin(15 * i) * t * cos(i * freqX);
let y1 = r1 * sin(15 * i) * sin(i * freqY);
vertex(x1, y1);
}
endShape(CLOSE);
// earlier SVG-saving attempt, kept for reference (it stopped the sound):
// if (key == 's') {
//   save("mySVG.svg");
//   noLoop();
// }
}
let p = new particle();
particles.push(p);
for (let i = particles.length - 1; i >= 0; i--) {
if (!particles[i].edges()) {
particles[i].update(amp > 190);
particles[i].show();
} else {
particles.splice(i, 1); // remove particles that have left the canvas
}
}
}
}
// switch between the Arabic and English versions of the song, reset, or save
function keyPressed() {
if (key === "a" || key === "A") {
switchToSong1(); // play song 1 when 'a' is pressed
} else if (key === "e" || key === "E") {
switchToSong2(); // play song 2 when 'e' is pressed
} else if (key === "r" || key === "R") {
resetSongs(); // reset the songs when 'r' or 'R' is pressed
} else if (key === "s" || key === "S") {
// Save the canvas as PNG without stopping the sound
save("myPNG.png");
}
}
// without this check, pressing 'a' sometimes fails to play the song
function switchToSong1() {
if (!song.isPlaying()) {
song1.pause(); // pause song 2 if it's playing
song.play(); // play song 1
currentSong = 1;
}
}
function switchToSong2() {
if (!song1.isPlaying()) {
song.pause(); // pause song 1 if it's playing
song1.play(); // play song 2
currentSong = 2;
}
}
function resetSongs() {
// stop both songs
song.stop();
song1.stop();
// Reset to the first song
currentSong = 1;
}
function startSketch() {
started = true;
button.hide(); // hide the button after starting the sketch
}
In this project, I faced a couple of challenges. Initially, the shape I created was rendered as particles, because that is how waveforms are printed out. For some reason, even when I drew lines or other shapes to map the data, it did not look right; I kept experimenting with the numbers within the intended shape until I figured it out. Another challenge was saving an SVG file and a PNG file. Whenever I tried to save a file, the whole system lagged. According to the sources linked below, it might have been because a) the file was saved before the whole sketch was drawn, which made the particles disappear, and b) the noLoop() call was causing issues. Even when I had a dedicated function for saving, the sound stopped playing; as a result, I moved the saving code into the keyPressed() function, which magically solved the problem.
I am mostly proud that I was able to add another version of the song into the system and allow users to switch between songs, reset the whole system, and start again. I did this using conditions and key presses instead of cursor interaction. Further, I created a start button (a state), styled in CSS, to start the sketch.
Pen plotting:
As mentioned above, I had challenges saving SVG files. The p5.js version I was using was the most compatible with sound, and despite my attempts to integrate SVG export into the file, it kept lagging. Apparently, sound is computationally heavy on the system, and adding SVG on top was not working, but I managed to get some images and then commented out the SVG code. Another issue was that when pen plotting, the pen passed through a specific point so many times that the center of the paper got a little damaged; I think it adds to the aesthetics of the design. I tried pen plotting twice. The first attempt was a disaster, but the second was good, considering that it was a new system for me.
I am satisfied with how my project turned out. However, I think it is always good to keep experimenting and trying new things. For future improvements, I would add more user interactions where users can play with the colors, more complex shapes, and even the particle system. I would also like to create an array of songs that can be played individually or together to create a chaotic experience.
A3 print:
Resources:
“Creating a Sound Wave Using P5.js.” Stack Overflow, stackoverflow.com/questions/55943459/creating-a-sound-wave-using-p5-js.
Colorful Coding. “Code an Audio Visualizer in P5js (From Scratch) | Coding Project #17.” YouTube, 28 Feb. 2021, www.youtube.com/watch?v=uk96O7N1Yo0.
Kazuki Umeda. “Beautiful Sound Visualization Using Polar Coordinates.” YouTube, 3 June 2021, www.youtube.com/watch?v=CY5aGEXsGDo.
Moussa, Ahmad. “Working With SVGs in P5JS.” Gorilla Sun, 3 May 2023, www.gorillasun.de/blog/working-with-svgs-in-p5js.
p5.FFT. p5js.org/reference/p5.sound/p5.FFT.
“Palestinian Embroidery.” Interactive Encyclopedia of the Palestine Question – Palquest, www.palquest.org/en/highlight/14497/palestinian-embroidery.
Arab Music Magazine (مجلة الموسيقى العربية). “The Palestinian Tarweeda ‘Mawlalah’: A Heritage That Has Reappeared.” 31 Dec. 2023, www.arabmusicmagazine.org/item/1538-2023-12-31-10-16-38.
W3Schools. “CSS Tutorial.” www.w3schools.com/css (referenced from CommLab class for styling the start button).
Taskan, Gokce. Art of Code – Gokce Taskan. gokcetaskan.com/artofcode/high-quality-export.