Saeed Lootah – Final Project

Project Overview

This project explores a hand-tracked 3D interactive environment where a topology grid and boids react to user gestures.
The core concept combines three ideas I enjoyed most during the course: simple rule-based systems, satisfying emergent movement, and 3D simulation.
The final direction uses open-palm and clenched-fist gestures to attract/repel boids while the hand also influences a dynamic topology surface.

The design goal was to keep the visual language minimal and angular: mesh-line hands, a dot-based topology map, and color-coded boid states.
Instead of using gestures to rotate the world, gestures directly affect behavior inside the world.

Inspiration

I was initially unsure where to start, so I researched interactive installations and reflected on previous class work.
The project direction became clearer after revisiting hand recognition from our ml5.js session and combining it with topology and boids.

Pulse Topology by Rafael Lozano-Hemmer inspired the topology-grid idea.

Pulse Topology reference

Pulse Island I by the same artist inspired a calmer movement pacing.

Pulse Island I reference

Gesture examples from the Hand-Tracked Particle Simulator helped frame interaction possibilities, but also clarified what not to do
(gesture as pure camera/object rotation felt less natural for this project).

Hand tracked particle simulator reference 1
Hand tracked particle simulator reference 2

Seeing virtual hands in 3D (similar to headset experiences) became an important target for immersion.

Meta Quest hand reference

 

Video Documentation

Final interaction and visual output demo:

Github Repository:
https://github.com/ssl9619/Decoding-Nature-Hand-Gesture-Final-Project/

Implementation Details

Milestone 1: ml5.js hand tracking in 3D (flat result)

Milestone 1 ml5 result

First attempt used ml5.js with depth estimation to push landmarks in z-space.
While it partially worked for forward/back movement, rotation and depth consistency were limited.

Milestone 2: Switching to MediaPipe Hand Landmarker

Milestone 2 MediaPipe result

I moved to MediaPipe Hand Landmarker for more reliable xyz output. This enabled more convincing 3D hand behavior while retaining similar landmark structure.
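A minimal setup sketch of how the landmarker can be created and queried (assuming the @mediapipe/tasks-vision ES module build; the CDN path, model path, and option values here are placeholders rather than the project's actual configuration):

import { FilesetResolver, HandLandmarker } from "@mediapipe/tasks-vision";

// Hypothetical setup; paths and options are illustrative only.
let handLandmarker;

async function initHandLandmarker() {
  const vision = await FilesetResolver.forVisionTasks(
    "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision/wasm"
  );
  handLandmarker = await HandLandmarker.createFromOptions(vision, {
    baseOptions: { modelAssetPath: "hand_landmarker.task" },
    runningMode: "VIDEO",
    numHands: 2
  });
}

// Each frame: results.landmarks is an array of detected hands,
// each hand an array of 21 {x, y, z} landmarks in normalized coordinates.
function detectHands(videoElement) {
  return handLandmarker.detectForVideo(videoElement, performance.now());
}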

Milestone 3: Topology grid and shader optimization

Milestone 3 topology map

The topology points are mapped to simplex noise in shader space. Height is mapped from noise values (-1 to 1), and brightness changes with height.
A shader implementation was required to resolve performance issues from CPU-heavy updates.
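As a rough CPU-side illustration of that mapping (a minimal sketch only: the real project does this per point inside a shader, and uses simplex noise rather than p5's built-in noise()):

// Illustrative only: map a noise value to a height and a brightness per grid point.
function topologyPoint(gx, gy, t) {
  const n = noise(gx * 0.05, gy * 0.05, t);   // p5 noise() returns 0..1
  const h = map(n, 0, 1, -1, 1);              // treat as the -1..1 noise range
  const y = h * 60;                           // vertical displacement of the dot
  const brightness = map(h, -1, 1, 40, 255);  // higher points drawn brighter
  return { y, brightness };
}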

Milestone 4: Combining hand interaction and topology response

Milestone 4 integration result

After integration, hand positions created indentation-like effects in the topology map (similar to pin-screen behavior).
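A minimal sketch of how such an indentation can be produced, assuming the hand's projected position and a falloff radius (the names and numbers below are illustrative, not the project's actual values):

// Illustrative: lower grid points near the hand, with a smooth falloff.
function indentHeight(baseHeight, pointX, pointZ, handX, handZ, radius = 120, depth = 50) {
  const d = dist(pointX, pointZ, handX, handZ);
  if (d >= radius) return baseHeight;
  const falloff = 1 - d / radius;                 // 1 at the hand, 0 at the edge
  return baseHeight - depth * falloff * falloff;  // squared for a rounded dent
}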

 

Milestone 5: Adding boids interaction logic

Boids were integrated as a behavior system driven by gesture state and local interactions.
Final tuning focused on attract/repel responsiveness and state transitions.

 

Milestone 6: Hand Gesture recognition

For this milestone I focused on mapping hand poses into reliable interaction modes.
An open palm triggers attraction/follow behavior, while a clenched fist triggers repel/escape behavior.
The main challenge was keeping gesture classification stable frame-to-frame so boids did not rapidly flicker between states.
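One simple way to keep the classification stable is to only commit to a new gesture after it has been seen for several consecutive frames. A minimal sketch of that idea (the threshold and function names are illustrative, not the project's actual code):

// Illustrative hysteresis: the mode only changes after N consistent frames.
const REQUIRED_FRAMES = 6;
let currentMode = "none";
let candidateMode = "none";
let candidateFrames = 0;

function updateGestureMode(rawGesture) {   // e.g. "attract", "repel", "none"
  if (rawGesture === currentMode) {
    candidateFrames = 0;
    return currentMode;
  }
  if (rawGesture === candidateMode) {
    candidateFrames += 1;
  } else {
    candidateMode = rawGesture;
    candidateFrames = 1;
  }
  if (candidateFrames >= REQUIRED_FRAMES) {
    currentMode = candidateMode;
    candidateFrames = 0;
  }
  return currentMode;
}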

 

Milestone 7: Creating a design for the hand

I tried several visual designs for the hand and ultimately returned to a style close to the original skeleton, but built from 3D forms.

First, I attempted an outline-style hand, but I was not satisfied with the overall look.

Second, I tried ellipses with a billboard approach, but visibility and depth readability were weak.

Finally, I switched to spheres and cylinders with mesh lines. It is not perfect, but it gave the best balance of clarity and style among the three approaches, and it remains an area for future improvement.

Boids Behavior and State Machine

  • Wander (yellow): default movement state.
  • Follow (blue): triggered by open palm when boids are within interaction distance.
  • Escape (red): triggered by clenched fist; boids return to wander after enough distance.
  • Contagion behavior: wandering can spread to nearby boids after prolonged following.
  • Close-range behavior: boids can orbit around the hand during attraction.

 

Hand Recognition Pipeline

The project started with ml5.js but switched to MediaPipe for stronger 3D landmark behavior.
AI assistance was used for migration and integration details.

Topology Grid

  • Simplex noise drives surface displacement over time.
  • Point brightness changes according to displacement height.
  • Shader implementation improved runtime performance significantly.
  • The grid is circular by default, with a boolean option for rectangular mode.

3D Rendering and Navigation

  • Rendering uses WEBGL canvas.
  • orbitControl() allows scene inspection and confirms true 3D behavior.

 

Code Highlights

Highlight 1: Orientation Gizmo Drawing

function drawOrientationGizmo(center, orientation) {
  const right = p5.Vector.mult(orientation.right, GIZMO_SCALE);
  const up = p5.Vector.mult(orientation.up, GIZMO_SCALE);
  const normal = p5.Vector.mult(orientation.normal, GIZMO_SCALE);

  // For some reason (unknown to man) when commenting this out the topology grid stops working
  // so instead I just made strokeWeight(0) and won't ask any more questions
  strokeWeight(0);
  stroke(255, 90, 90);
  line(center.x, center.y, center.z, center.x + right.x, center.y + right.y, center.z + right.z);
  stroke(90, 255, 120);
  line(center.x, center.y, center.z, center.x + up.x, center.y + up.y, center.z + up.z);
  stroke(90, 150, 255);
  line(center.x, center.y, center.z, center.x + normal.x, center.y + normal.y, center.z + normal.z);
}

This is a highlight for all the wrong reasons. So really, it's more of a lowlight. When changing from ml5.js to MediaPipe Hand Landmarker I relied heavily on AI to make the switch for me, given it was my first time. In doing so it created this function. All it does is show three lines perpendicular to each other, one color per axis (x, y, z), centered on the hand. Why? Not sure, but when I tried removing the function from the update loop the topology grid suddenly stopped working. Weird, I thought; I tried just commenting out the lines that draw the gizmo, but I had the same issue. I realized at this point that it was probably related to the topology grid being rendered with a shader, which I didn't really understand. I don't know exactly why it breaks, and I didn't have the time or patience to find out. Lazily, I set the strokeWeight to zero and moved on.

Highlight 2: updateState() in boids3d.js

updateState(connectedWanderCount, mode, target) {
  const config = this.config;
  const stateConfig = config.state;
  const distToTarget = p5.Vector.dist(this.pos, target);

  if (this.wanderCooldown > 0) {
    this.wanderCooldown -= 1;
  }

  if (mode === "repel") {
    if (this.state !== "escape" && distToTarget < stateConfig.escapeTriggerDistance) {
      this.enterEscapeState();
    }
    if (this.state === "escape") {
      if (distToTarget >= stateConfig.escapeToWanderDistance) {
        this.enterWanderState();
      }
      return;
    }
  } else if (this.state === "escape") {
    this.enterFollowState();
  }

  if (this.state === "wander") {
    this.stateTimer += 1;
    this.wanderDurationLeft -= 1;
    if (this.wanderDurationLeft <= 0) {
      this.enterFollowState();
    }
    return;
  }

  this.stateTimer += 1;
  if (this.stateTimer < stateConfig.followBeforeWanderFrames || this.wanderCooldown > 0) {
    return;
  }

  const chance =
    stateConfig.baseWanderChancePerFrame +
    connectedWanderCount * stateConfig.neighborWanderChanceBonus;
  const cappedChance = min(chance, stateConfig.maxWanderChancePerFrame);
  if (random() < cappedChance) {
    this.enterWanderState();
  }
}

The boids and the hands are perhaps my favourite part of the simulation and in large part due to their interaction with each other. Central to their interaction is this state machine which determines whether a boid should wander, escape, or follow.

Reflection

I am very happy with the final result and how the interaction reads visually.
The topology-grid performance issue was resolved through shader-based optimization, while boids still have unresolved performance costs in denser scenarios.

One known bug remains: if the simulation starts while the palm is already open, boid state updates can behave unexpectedly until performing a clench-then-open reset.

References

AI Disclosure: AI assistance was used for shader-based topology optimization and migration from ml5.js hand tracking to MediaPipe Hand Landmarker integration.

Final Project – Progress Update

Introduction

I was without any ideas for a while. I originally thought about doing something that would replicate something around me in nature, then I thought about the previous features that I enjoyed most such as: Conway’s Game of Life, because of the simplicity; Boids, because I found the movement satisfying; and creating 3D simulations. The problem was I didn’t know where to start so I began searching online for other interactive installations. Perhaps I might find something I liked such as other installations in teamLab or some of the previous installations we viewed in class.

Then when I began to think about the features more deeply I remembered a class we did learning ml5.js to create a sketch with hand recognition. I immediately knew I wanted to use that with whatever I would end up creating.

Inspiration

During my search I found an installation called Pulse Topology by the artist Rafael Lozano-Hemmer. It's a large array of pulsing LED lightbulbs hanging from the ceiling, each at a different height, so that the whole array resembles a noise map. In some of the videos of the piece I did not like the rate at which the bulbs pulsed; it felt too aggressive, when visually the curves suggest it should be calmer.

Made by the same artist was Pulse Island I (I had no idea this was made by the same person until I began searching for an image to write this). I liked the rate of pulsing from that installation more.

In my search for hand gesture recognition examples I stumbled upon this GitHub repo. It wasn’t made in p5; instead it was made using Three.js and something else. It used gesture recognition to rotate objects, zoom in and out, and cycle between different variations.

Seeing gesture recognition used to rotate an object reminded me of this video from SpaceX from 2013 showing gesture-based design. However, after seeing this video and interacting with the GitHub repo I saw before, I decided I wouldn’t use gestures as just another way to rotate an object because it felt unnatural, ironically. This was because I, and I imagine most people, are so used to using a keyboard or mouse/trackpad to rotate something in 3D that using anything else felt slower and less precise.

At the end I decided I liked the topology installation and wanted something like it in my sketch, but rather than lights from the ceiling I wanted it to be more minimalistic with dots, and have them be at ground level. Rather than pulsing, they would move up and down and their brightness would change accordingly. Secondly, gesture recognition is nice at first, but an effect I found much more exciting was to see your hands in 3D space in the sketch. There are a couple of VR headsets such as the Meta Quest 3 which have gesture recognition where you can see your hands in a virtual simulation, and I wanted to get as close to that as I could using the webcam, hoping that maybe the ml5.js library could be enough.

Meta Quest 3 Hands

Sketch

Watch the sketch demo on YouTube

I couldn’t get the embedded sketch to work properly because of a bug when putting it into the web editor.

Milestones

Milestone 1: Programming hands with ml5.js in a 3D space, but the result was flat.

I tried to use the depth estimation method in ml5.js to have the hands move back and forth. This got close to the result I wanted, but it couldn't handle rotation of the hand.

I needed to find an alternative library that could handle this. I needed help from AI to program it since it was my first time, but I finally found something that worked.

Milestone 2: MediaPipe Hand Landmarker

It's hard to tell from an image alone, but with orbitControl() you'll be able to see that the hand does move back and forth.

Milestone 3: Topology Map

I found that, once the hands and the topology map were combined, there were performance issues. I realized it was because rendering and calculations were done on the CPU, and that a shader would perform better.

Got help from AI to create a shader that would achieve the same effect without affecting performance.

Milestone 4: Bringing both together

The hands now cause indentations in the topology, like those pin-screen toys. Again, from an image alone it's hard to tell.

Pending Features

So far I’ve completed what seems like most of the technical implementation, but there is one more feature I want to add: boids. I want it so that with a clenched fist the boids will be repelled, but if you open your hand and extend your fingers some will be attracted and will go between your fingers and maybe twirl around them.

I will also need to change the hand from a skeleton to a mesh or some kind of model more resembling a hand, like the ones seen when using the VR headset shown earlier. Something semi-transparent, but I want the lighting from the topology at the bottom to have an effect on the hand.

Reflection

I’m happy with how things have turned out but there’s still much more work to be done. I did need to use AI for a couple of things: optimizing the topology grid and switching from using ml5.js to using MediaPipe Hand Landmarker, since it could output an estimation of the hand in the xyz coordinate system whereas ml5.js could only do x and y.

Assignment 11 – Saeed Lootah

Concept

I’ve always liked Conway’s Game of Life as a way to represent real cellular life using simple rules. But whenever I looked at the end result, I couldn’t fully see the resemblance to living systems. There was something about it that felt off, even if I couldn’t explain it at first.

After thinking about it more, I realized what bothered me about the classic aesthetic:

  • The pixelated style felt too rigid and angular to resemble the organic curves we see in real life.
  • Motion was hard to perceive; it felt more like watching stop-motion frames.
  • The black-and-white palette didn’t offer enough visual variety.

At the same time, I didn’t want to overcomplicate my version. Like the original, I wanted to keep the code simple while addressing these issues.

Code Highlight

push();
noStroke();
colorMode(HSL);
let saturation = map(count2, 0, 24, 0, 150);
let lightness = map(count2, 0, 24, 50, 100);

// Map the (x, y) position to a hue gradient from top-left (0,0) to bottom-right (cols-1, rows-1)
// Project (x, y) onto the main diagonal from (0,0) to (cols-1, rows-1)
let diagonalProgress = ((x / (cols - 1)) + (y / (rows - 1))) / 2;
let hue = map(diagonalProgress, 0, 1, 0, 360);
fill(hue, saturation, lightness);
let coefficient = expoMap(count, 0, 24, 0.1, 4.5);
rectMode(CENTER);
rect(x * cellSize, y * cellSize, cellSize * coefficient, cellSize * coefficient);
pop();

It's probably the simplest piece of code in the sketch, but I find it to be the most important because of the map functions. I explain why later in the milestones.

Embedded Sketch

 

Milestones

Stage 1 – Conway’s Game of Life

I first rebuilt Conway’s Game of Life in p5.js. The main difference was making it responsive to the canvas instead of using a fixed-size grid. I also added randomization: pressing R randomizes the grid, and the sketch starts with a randomized state.

Stage 2 – Experimenting with Opacity

 

I quickly found that opacity was an easy and effective way to show motion. By adjusting background opacity over time, movement became visible through trails. During testing, I found a “walker,” and this screenshot made it clear: without opacity, you probably wouldn’t notice that it was moving at all.
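The trail effect comes from clearing with a semi-transparent background instead of a fully opaque one; a minimal sketch of the idea (the alpha value here is illustrative):

// Illustrative: a low-alpha background leaves fading trails of previous frames.
function draw() {
  background(0, 0, 0, 25);   // alpha 25 of 255: old frames fade out gradually
  drawCells();               // placeholder for the Game of Life rendering
}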

Stage 3 – Experimenting with Saturation and Lightness (using colorMode(HSL))

After exploring opacity, I tried a heatmap-like idea: cells with more neighbors become more saturated. I started with immediate neighbors (a 3×3 region), then expanded to a 5×5 neighborhood. I also tested different hues, but saturation alone still didn’t give me the look I wanted.

Stage 4 – Dynamic Cell Size

This was actually one of my earliest ideas, but I initially overcomplicated it in my head. My first concept was that cells with more neighbors should not only get larger, but “bulge,” almost like pushing up on paper from underneath a grid cell.

I didn’t know how to implement that cleanly, so I simplified. Instead, I used an exponential mapping function (rather than p5’s linear map() for a more natural effect), considered neighbors in a 5×5 area, and made low-neighbor cells smaller than their grid square.
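My expoMap function isn't shown here, but an exponential remapping in the same spirit (a hypothetical implementation, not necessarily the one in the sketch) could look like:

// Hypothetical exponential mapping: like map(), but eased so low neighbor
// counts stay small and high counts grow quickly toward the upper bound.
function expoMap(value, inMin, inMax, outMin, outMax, exponent = 2) {
  const t = constrain((value - inMin) / (inMax - inMin), 0, 1);
  return outMin + (outMax - outMin) * pow(t, exponent);
}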

That small detail made a huge difference. It defined the edges of each “organism” more clearly, and it reduced visual emphasis on small, self-sustaining shapes (like plus signs or small ovals), while giving larger moving organisms more presence. It’s difficult to fully capture in words – the images, and especially the live sketch, tell the story best.

Reflection

There are some things I would do differently or try out. I want to see what the bulging effect might look like. Also, I want to try adding a function where, if there doesn’t appear to be enough movement, the grid would randomize and start all over again.

I used AI to create the exponential map function. I also wanted to make hue change along both the x and y axes and only knew how to do it one way, so I got help from AI.

Saeed Lootah – Assignment 10

Concept

My concept is inspired by the red explosive barrels from the Half-Life video game series. I wanted to recreate that mechanic of an exploding barrel using matter.js, where objects can collide with a barrel, build up damage, and trigger an explosion. The idea of explosive barrels spread to other video games, with objects that explode either on impact or as they accumulate damage, and growing up I always enjoyed playing games with destructible elements.

Code Highlight

explode(allBoxes) {
    if (this.isExploded) {
      return;
    }

    const center = this.body.position;

    for (let i = 0; i < allBoxes.length; i++) {
      const currentBox = allBoxes[i];
      if (!currentBox || currentBox.body.isStatic) {
        continue;
      }

      const boxPos = currentBox.body.position;
      const delta = Vector.sub(boxPos, center);
      const distance = Vector.magnitude(delta);

      if (distance <= 0 || distance > EXPLOSION_RADIUS) {
        continue;
      }

      const direction = Vector.normalise(delta);
      const falloff = 1 - distance / EXPLOSION_RADIUS;
      const totalForce = max(EXPLOSION_MIN_FORCE, EXPLOSION_FORCE * falloff);
      const force = Vector.mult(direction, totalForce);
      Body.applyForce(currentBox.body, boxPos, force);

      // Add a direct velocity impulse so the explosion is always visible.
      Body.setVelocity(currentBox.body, {
        x: currentBox.body.velocity.x + direction.x * EXPLOSION_VELOCITY_BOOST,
        y: currentBox.body.velocity.y + direction.y * EXPLOSION_VELOCITY_BOOST
      });
    }

    Composite.remove(world, this.body);
    this.isExploded = true;
  }

The part of the code I am most proud of is the explosion function. I track the number of collisions for the barrel, change the barrel color from gray toward red as damage increases, and once it reaches the hit limit (which I defined as a global variable) I remove the barrel body, apply an outward force, and boost the initial velocity of nearby boxes.
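The damage tracking itself isn't shown above; a sketch of how it could be wired up with Matter.js collision events and a gray-to-red blend (constants and property names here are illustrative, not the actual sketch code):

// Illustrative damage tracking using Matter.js collision events.
const BARREL_HIT_LIMIT = 10;   // hypothetical global hit limit

Matter.Events.on(engine, "collisionStart", (event) => {
  for (const pair of event.pairs) {
    if (pair.bodyA === barrel.body || pair.bodyB === barrel.body) {
      barrel.hits = min(barrel.hits + 1, BARREL_HIT_LIMIT);
      if (barrel.hits >= BARREL_HIT_LIMIT) {
        barrel.explode(boxes);
      }
    }
  }
});

// In the barrel's draw call: blend from gray toward red as damage accumulates.
function barrelColor(hits) {
  const t = hits / BARREL_HIT_LIMIT;
  return lerpColor(color(120), color(200, 40, 40), t);
}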

Embedded Sketch

Milestones and Challenges

Stage 1:

Stage 1 screenshot

 

Focused on getting the core interaction working by spawning rectangles with the mouse, and getting used to coding with matter.js.

Stage 2:

Stage 2 screenshot

Added a grounded barrel in the center area to establish the main object of the simulation and changed the background color to something I liked. At this point I began thinking about aesthetics.

Stage 3:

Stage 3 screenshot 1

Stage 3 screenshot 2

Introduced the damage mechanic where the barrel changes from gray toward red with each hit. A key challenge was collision handling, especially making sure hits were counted reliably and visual feedback was clear.

Stage 4:

Stage 4 screenshot

Finally I combined everything and added the final explosion behavior after the hit threshold. I had to tune radius, force, and velocity boost values so the explosion effect was strong and readable without becoming too chaotic. I found that the force value was not as important as the initial velocity boost because of air resistance. Then I added text to show the controls, allowed the user to drag their mouse to spawn squares rather than only clicking, and made the boundary cover the sides and not only the ground.

Reflection and Future Improvements

If I continue this project, I want to replace simple shapes with sprite-based visuals so the barrel and debris feel more game-like. I also want to explore whether Matter.js can be combined with a 3D workflow or whether a different engine would be better for a full 3D version. Future improvements could include sound effects, particle debris, and different barrel types with unique blast strengths.

Saeed Lootah – Assignment 9

Concept

When I thought about tension over time, the visual that came to mind was that of a ball being squeezed until it ruptured. I originally wasn't sure how I was going to fit that into the simulation and how I would incorporate flocking behavior, but as I went through the work of Robert Hodgin I had an idea.

I liked the look of the flocking behaviour in three dimensions as well as the red lighting. I thought about having the flock contained inside the ball I had in mind; upon rupture, the boids inside would escape rapidly and slow down over time.

Sketch

(orbit control is enabled, click and drag your mouse to view from different angles when viewing the sketch)

 

Highlight

function releaseBoids() {
  confinementActive = false;
  released = true;
  releaseFrame = frameCount;

  for (const boid of boids) {
    let outward = p5.Vector.sub(boid.pos, centerPoint);
    if (outward.magSq() < 0.0001) {
      outward = p5.Vector.random3D();
    }
    outward.setMag(random(5.5, 8.2));
    boid.vel.add(outward);
    boid.maxSpeed = RELEASE_SPEED_START + boid.speedBias;
  }
}

Once the circle reaches a certain threshold radius it vanishes and the function above is called. It’s only about 15 lines but in my opinion it creates the most interesting effect in the entire simulation.

What it does is simply give each boid an outward force of a random magnitude between 5.5 and 8.2. In my opinion that range makes the effect just right: not so strong that it's overwhelming, not so weak that it's underwhelming, but instead satisfying.

Milestones

The Humble Sphere

My first step after creating the WEBGL canvas was to make the sphere which decreases in size over time. I chose a black background to give the most contrast for any effects I would add later on. For the sphere I chose a light-bluish color, mostly out of personal preference. The color of the sphere itself is not meant to carry much meaning; instead I want the attention to be on the boids and the decreasing size of the sphere.

Boids

These are the boids travelling in three dimensions. At this point they did not have any special effects. I wanted them to be red since I imagined the effect of squeezing to be like that of increasing heat. In the real world, as particles are trapped in a shrinking volume they move and collide faster, and that motion is what we perceive as temperature. It's how combustion occurs in diesel engines: the volume decreases as the cylinder compresses the fuel-air mixture, and unlike gasoline engines there is no spark plug; the compression alone is enough to ignite the mixture (this is known as compression ignition).

Boids + Sphere

The way I kept the boids inside the sphere at this point was by hard-limiting the position of the boids rather than also manipulating the steering direction (I implemented that later on).
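A minimal sketch of that hard limit, assuming a center point and sphere radius (names are illustrative):

// Illustrative: clamp a boid back onto the sphere surface if it escapes.
function confineToSphere(boid, centerPoint, sphereRadius) {
  const offset = p5.Vector.sub(boid.pos, centerPoint);
  if (offset.mag() > sphereRadius) {
    offset.setMag(sphereRadius);
    boid.pos = p5.Vector.add(centerPoint, offset);
  }
}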

 

Final

In the final version I made a few key improvements. I added multiple gradients and used blendMode(ADD): the color of the boids changes based on their speed (purple at their fastest, red at their slowest); the radius of the boids changes based on how close they are to the center of the sketch (bigger towards the center, smaller towards the edges); and blendMode(ADD) means that when more boids overlap their colors add up toward white, which makes it easier to see when boids are clumped together, especially at the beginning.
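Roughly, the per-boid styling described above can be expressed like this (a sketch with assumed ranges and colors, not the exact values used):

// Illustrative per-boid styling: speed drives color, distance to center drives size.
blendMode(ADD);   // overlapping boids add toward white
for (const boid of boids) {
  const speedT = constrain(boid.vel.mag() / boid.maxSpeed, 0, 1);
  const c = lerpColor(color(255, 40, 40), color(170, 60, 255), speedT); // red -> purple
  const distT = constrain(boid.pos.dist(centerPoint) / sphereRadius, 0, 1);
  const r = lerp(8, 3, distT);   // bigger near the center, smaller near the edge
  push();
  translate(boid.pos.x, boid.pos.y, boid.pos.z);
  noStroke();
  fill(c);
  sphere(r);
  pop();
}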

Reflection

I’m happy with how the effect turned out and I was surprised by the swirling effect that came about in the beginning. As the circle decreased the boids still moved as they would normally albeit constrained and this made it look like they were swirling around the circle.

Hodgin's work was made in Houdini, which utilizes the GPU and makes point lights feasible with large numbers of flying objects, whereas in p5 I don't think the same would be possible while retaining a high framerate. I found that for my simulation around 600 boids was the limit while still running fairly well, and that was without any lighting effects; the best I could do was add the blendMode(ADD) line to make things more aesthetically interesting.

Going forward I would like to try out different software, maybe openFrameworks, to see what is possible. I also want to try out audio cues for when the sphere vanishes, and for boids travelling close to the camera, based on how fast they are going.

Saeed Lootah – Assignment 7

Concept

When looking through the teamLab installations the one I found the most visually striking was the Koi fish installation:

I unfortunately wasn’t able to attend the trip to teamLab but regardless I found this image from which I took inspiration.

Milestones

Unfortunately because of the dark background it is hard to see some of the smaller details I’m referring to but it should be possible to see.

Stage 1

Although this step is fairly basic, in my opinion it is still worth mentioning. At the very beginning I made the canvas render using WEBGL and added orbitControl(); this would be useful for all of the following steps.

Stage 2 – Koi Fish (Random Motion)

At this point I created the Koi fish class. It’s called Koi but really the logic is like the vehicles we discussed in class. At this point in time the motion was fairly random but later the motion would become more fluid and they would travel in flocks using boid logic.

I thought about using more intricate fish models rather than simple dots and trails but I felt that strayed too far from the original installation since I didn’t get the impression that it was about the aesthetics of the fish but rather the movement and colors.

Stage 3 – Trails + Boid Movement

Now each ‘fish’ had a trail to make its movement more apparent. This also came with a surprising feature. To make sure that the fish would not go too far, I had them wrap back around to the opposite side at a certain distance, but this had the unintended effect of drawing a line across the canvas (one can be seen at the back). I liked the look of it, so I decided to keep it.

Boid movement refers to movement that replicates the flocking of birds. Although it would normally be used to simulate birds moving in flocks, I decided it could work in this sketch since I wanted the fish to move in flocks too. In this screenshot the cohesion was set to a lower value, but I would later try out different settings.

Stage 4 – 3D Player + Light Interaction + Avoidance

Afterwards I decided to add the ‘player’. In the original picture there are two people, and I'm unsure how the fish act around them (whether they follow them or always circle them), but I decided to do something similar by having the fish avoid the player. In addition, the color from the fish illuminates the player accordingly.
The player can move around using either WASD or the arrow keys, with shift/control to move up and down respectively.
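The movement input can be read each frame with keyIsDown(); a minimal sketch (key codes, axes, and speed are illustrative):

// Illustrative: WASD / arrow keys for horizontal motion, shift/control for vertical.
function updatePlayer(player, speed = 2) {
  if (keyIsDown(87) || keyIsDown(UP_ARROW))    player.pos.z -= speed; // W
  if (keyIsDown(83) || keyIsDown(DOWN_ARROW))  player.pos.z += speed; // S
  if (keyIsDown(65) || keyIsDown(LEFT_ARROW))  player.pos.x -= speed; // A
  if (keyIsDown(68) || keyIsDown(RIGHT_ARROW)) player.pos.x += speed; // D
  if (keyIsDown(SHIFT))   player.pos.y -= speed; // move up
  if (keyIsDown(CONTROL)) player.pos.y += speed; // move down
}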

Stage 5 – HTML UI Controls

Lastly, I added UI controls. You can edit the settings in real time by adjusting the sliders. I coded this part using HTML so that the UI would stay 2D and would not move with the camera of the 3D WEBGL sketch. I am sure there is a way to achieve the same effect using p5 alone, but I think this turned out well regardless.

Code Highlight

avoidPlayer(actor) {
    const off = wrappedOffset(this.pos, actor.pos);
    const d = off.mag();
    if (d > actor.fleeRadius || d < 0.0001) return;

    const strength = map(d, actor.fleeRadius, actor.radius, 0.0, 1.0, true);
    const flee = off.mult(-1);
    flee.normalize();
    flee.mult(this.maxSpeed * (0.8 + strength));
    flee.sub(this.vel);
    flee.limit(this.maxForce * (1.6 + strength));
    this.applyForce(flee);
  }

I highlighted this code because it makes each koi fish avoid the player in an organic way. When the player enters the fish’s flee radius, the fish measures that distance, converts it into a stronger or weaker escape response, and then applies a limited steering force so it turns away smoothly instead of abruptly.

Embedded Sketch

Viewing the sketch directly through the web editor and in window size achieves the best effect.

Reflection and Ideas for Future Work

I’m happy with the movement of the fish and the minimalistic aesthetic but there’s still room for improvement in the future. The following is what I would add in the future if I were to continue on this project or do something similar.

  • Add optional shader-based water caustics and soft volumetric fog.
  • Introduce seasonal color palettes as a toggleable mode.

Saeed Lootah – Assignment 8

Concept and Inspiration

My concept started from noticing that steering behavior looked very similar to moving ants. That gave me the idea to build an ant colony cross section simulation where ants travel between underground burrows and the surface like a terrarium. The simulation uses seek behavior and path following to create ant-like movement patterns.

Code Highlight

The part I am most proud of is the state machine in my Ant class. It coordinates each ant going to the surface, roaming, returning to the path, going back to the burrow, waiting, and repeating. Adding the returnToPath state was especially important because it fixed the issue where ants were cutting through the ground instead of reconnecting to the path at the surface first.

if (this.state === "roamSurface") {
    if (!this.roamTarget || p5.Vector.dist(this.pos, this.roamTarget) < 16) {
      this.pickNewRoamTarget();
    }
    this.seek(this.roamTarget);
    if (millis() >= this.roamUntil) {
      this.state = "returnToPath";
    }
    return;
  }
if (this.state === "returnToPath") {
    const overgroundJoin = this.path.getEnd();
    this.seek(overgroundJoin);
    if (dist(this.pos.x, this.pos.y, overgroundJoin.x, overgroundJoin.y) < 20) {
      this.state = "returnToBurrow";
    }
    return;
  }

Embedded Sketch

Milestones and Challenges

Stage 1: One path, one ant

 I started with one ant following one path to verify that seek and basic path following were working correctly.

 

Stage 2: One path, multiple ants


After confirming the basic behavior, I added multiple ants on the same path to test how the movement looked in a colony-like flow.

 

Stage 3: Multiple paths, multiple ants per path


I expanded the colony to multiple burrows and assigned ants to specific paths so each group had a consistent route to and from the surface.

 

Stage 4: Ground/grass visuals and roaming fix

I added the dirt and grass cross-section layout and introduced a roamSurface state for surface movement. A key challenge was that ants sometimes traveled through the ground when returning. I fixed this by adding a returnToPath state so ants first reconnect at ground level and then follow the tunnel path back down.

 

Reflection and Future Improvements

Right now, over ground, the ants still look like they are flying rather than staying fully attached to the ground surface. A future improvement would be to constrain or project overground motion to a ground contour so movement feels more realistic. I would also like to continue the ant colony idea further, or expand this into a larger ecosystem simulation with multiple interacting species and behaviors.

Midterm Project – Saeed

Project Overview

For my project I wanted to replicate the movement of birds between trees and their pathing around obstacles, in this case mountains. To best visualize this I started with a top-down view and then to make the sketch more visually pleasing I used contour lines to replicate a topographic map.

I was first inspired to do a simulation with birds by the garden outside my house in Dubai. Put simply, there are a lot of birds; many are pigeons or sometimes crows, as far as I can tell. From what I've seen, most birds travel in packs and follow a similar direction to each other. At our house they often travel from tree to tree within the garden, or between the trees lining the walls of my neighbours' houses; they may stay at a tree for a while before moving again, but always follow a similar path in the air. Similarly, I see some birds stop at our pool to drink water, often all together. I wanted to replicate this behaviour in my project.

In addition, after deciding to do the simulation from a top-down view, I decided to add contour lines and make it appear to be a topographic map, because it's an idea I have been exploring for a while. I first learned how to make basic contour lines in Photoshop using noise, and have since wanted to find places to use the technique, since I feel it doesn't get as much use as it should.

Implementation Details

This simulation is built around trees as emitters, birds as moving agents, a flow field for natural motion, mountains as obstacles that affect the vector field, contour lines for terrain, and a UI panel that lets you change parameters in real time.

Bird class

The Bird class represents one moving agent that travels between trees. Each bird stores position, destination, destination tree index, arrival/wait state, size, trail history, and a day-only behavior flag.

  • A destination tree is chosen randomly.
  • The bird travels to its destination.
  • When they arrive, they wait for a random duration.
  • After waiting, they pick a new destination tree at random.
  • Their movement is not only direct-to-target; a vector field is used for more natural movement.
class Bird {
  constructor(originX, originY, destinationX, destinationY, destTreeIndex, diameter = 10) {
    this.pos = createVector(originX, originY);
    this.des = createVector(destinationX, destinationY);
    this.destTreeIndex = destTreeIndex;
    this.arrived = false;
    this.trail = [];
  }
}
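A sketch of the arrive-wait-repick loop described in the list above (wait times and helper names are illustrative, not the actual sketch code):

// Illustrative update loop inside the Bird class:
// travel to the destination tree, wait a while, then pick a new one.
update(trees) {
  if (this.arrived) {
    if (millis() >= this.waitUntil) {
      const next = this.pickDifferentTree(trees);      // hypothetical helper
      this.des = createVector(next.x, next.y);
      this.destTreeIndex = trees.indexOf(next);
      this.arrived = false;
    }
    return;
  }
  this.moveTowardDestination();                         // direct vector + flow field
  if (p5.Vector.dist(this.pos, this.des) < 5) {
    this.arrived = true;
    this.waitUntil = millis() + random(1000, 3000);     // wait for a random duration
  }
}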

 

Tree class

The Tree class is both a visual node and an emitter for birds. Each tree has x and y coordinates, a diameter, and its own bird array.

  • A tree can initialize a number of birds.
  • Each spawned bird gets this tree as origin and a different tree as destination.
  • The tree updates and draws all birds in its own array.
class Tree {
  constructor(x, y, diameter = 40) {
    this.x = x;
    this.y = y;
    this.diameter = diameter;
    this.birdArray = [];
  }
}

The pathing starts with a direct vector from bird position to destination. The code then samples the flow field at the bird's position and, using vector addition, adjusts the bird's heading.

 

let toDest = p5.Vector.sub(this.des, this.pos);
let desired = toDest.copy().setMag(birdSpeed);
let flow = getFlowAt(this.pos.x, this.pos.y);
let steer = desired.add(flow.mult(0.6));
steer.limit(birdSpeed + 0.5);
this.pos.add(steer);

function updateFlowField() {
  let noiseScale = 0.02;
  let time = frameCount * 0.005;

  for (let y = 0; y < flowFieldRows; y++) {
    for (let x = 0; x < flowFieldCols; x++) {
      let angle = noise(x * noiseScale, y * noiseScale, time) * TWO_PI * 2;
      flowField[y][x] = p5.Vector.fromAngle(angle).mult(0.8);
    }
  }
}

Mountain generation

Mountains are generated by sampling terrain elevation candidates from noise (to make spawning more natural as opposed to complete randomness), sorting candidates by highest elevation first, then placing mountains with spacing constraints.

const candidates = buildMountainCandidates();

for (let i = 0; i < candidates.length && mountains.length < quantity; i++) {
  const c = candidates[i];
  const baseRadius = map(c.elevation, 0, 1, mountainMinRadius, mountainMaxRadius, true);
  const radius = baseRadius * random(0.9, 1.12);
  const x = c.x;
  const y = c.y;

  if (isMountainPlacementValid(x, y, radius, 1, true)) {
    mountains.push({ x, y, radius });
  }
}
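The buildMountainCandidates() helper is not shown above; a hypothetical version that samples noise elevations and sorts them highest-first might look like this:

// Hypothetical: sample candidate points from the terrain noise and sort by elevation.
function buildMountainCandidates(samples = 400) {
  const candidates = [];
  for (let i = 0; i < samples; i++) {
    const x = random(width);
    const y = random(height);
    const elevation = noise(x * 0.005, y * 0.005);   // same noise used for the terrain
    candidates.push({ x, y, elevation });
  }
  candidates.sort((a, b) => b.elevation - a.elevation); // highest elevation first
  return candidates;
}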

Mountains do not rewrite the global flow field grid directly. Instead, there’s an additional tangential force that gets added to the bird’s steer direction, not the vector field.

const repelStrength = map(edgeDistance, 0, 140, 2.8, 0.2, true);
const repel = away.copy().mult(repelStrength);
const tangent = createVector(-away.y, away.x);

if (tangent.dot(desiredDirection) < 0) {
  tangent.mult(-1);
}

 

Contour lines and how they work

Contours are generated from a hybrid elevation function that combines base noise terrain with mountain influence.

  • Build contour levels between minimum and maximum elevation.
  • For each cell, compute corner elevations.
  • Build a case mask and map it to edge segments.
  • Interpolate exact crossing points on edges.
  • Store line segments and then stitch/smooth them into polylines.
  • Draw major and minor contours with different stroke weights and alpha.
if (cell.tl.v >= iso) mask |= 8;
if (cell.tr.v >= iso) mask |= 4;
if (cell.br.v >= iso) mask |= 2;
if (cell.bl.v >= iso) mask |= 1;

const segments = caseToSegments[mask];
const a = getCellEdgePoint(edgeA, cell, iso);
const b = getCellEdgePoint(edgeB, cell, iso);
contourSegmentsByLevel[levelIndex].push({ a, b });
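The crossing-point interpolation (getCellEdgePoint in the snippet) is the standard marching-squares step; a sketch of how it can be computed for one edge, assuming each corner stores a position and a value v:

// Illustrative linear interpolation of where the iso level crosses one cell edge.
function interpolateCrossing(cornerA, cornerB, iso) {
  const t = (iso - cornerA.v) / (cornerB.v - cornerA.v);   // 0 at A, 1 at B
  return {
    x: lerp(cornerA.x, cornerB.x, t),
    y: lerp(cornerA.y, cornerB.y, t)
  };
}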

 

How the UI works

The UI is created in setupUIControls() as a fixed bottom panel.

  • Scene and simulation control (refresh noise seed, pause/resume).
  • Canvas preview and scale.
  • Mountains enabled and mountain count.
  • Tree count and birds per tree.
  • Bird speed and trail length.
  • Day/night cycle toggle, manual time selection, and day duration.
  • Contour smoothing iterations.

 

When values change, callback handlers rebuild affected systems. For example:

  • Tree count change regenerates trees and birds, then mountains.
  • Birds per tree change regenerates birds only.
  • Mountain toggle/count updates mountains and rebuilds contours.
  • Contour smooth input rebuilds contour polylines.

 

Example:

ui.birdCountInput.changed(() => {
  birdsPerTree = parseBoundedNumber(ui.birdCountInput.value(), birdsPerTree, 1, 20);
  ui.birdCountInput.value(String(birdsPerTree));
  regenerateBirdsForTrees();
});

States

Last but not least, there are multiple states in the program: day and night, and with or without mountains. I decided to keep the states simple for this project. Day and night have visual changes, and mountains affect the movement of the birds.

Milestones

Version 1

This was one of my first versions of the sketch. It consisted of trees randomly spawned around the canvas (later I would switch to placing them with noise) and birds that travelled in between, and the vector field was already there, although it is hard to notice in this image. There are some small details I implemented as well: birds would stop at the edge of a tree, and I made the opacity of the background low so that I could see the paths of the birds more clearly.

Version 2

At this point I created the Mountain class. Mountains would be different sizes, spawned randomly around the canvas, and I could change how many would spawn. I hadn't implemented the avoidance yet, as you can see by the trail of the bird at the bottom that phased through a mountain.

Version 3

From my perspective, at this point I had overcome the most important technical hurdles. I now had birds that travelled between trees, moved organically, and could avoid mountains. I was happy with the effect, but I knew there was potential to make it more aesthetically pleasing, and I didn't yet know what my next step would be.

Version 4 (Daytime and Nighttime)

At this point I had simply implemented two states, day and night. Later I added a version of the birds that would only move during the day and stay at their trees during the night, plus a transition effect from day to night.

Version 5 (User Interface)

I took all the features I had before and added a user interface in the form of sliders at the bottom of the canvas. Because it changed the canvas in real time, it allowed me to see different variations, and it led to the idea of making the mountains spawn using noise but only at points of ‘highest elevation’. This later led to the idea of the topographic map.

Version 6

I started with just topographic lines from the noise function alone (not taking the placement of mountains into consideration). After tweaking the strokeWeight to make them more visible, I added the mountains in the form of contour lines. Then I added colors based on elevation, opting for a sand color, a blue color for areas of low elevation to represent water, and a gray color for the mountains; I later tweaked the gray and sand colors to make them more prominent.

Version 7

This is the latest version of my sketch. From the last version to this one I added a few more elements that can be changed from the UI (not shown in this screenshot).

(Looks better when viewing it on a larger canvas)

Video Documentation

Reflection

I'm happy with the movement of the birds, especially the avoidance of the mountains. However, I wasn't able to get an A3-sized screenshot of the simulation because it is too slow when scaled to A3 size. I'm not sure exactly why; I haven't taken the time to sit down and figure it out, but I suspect it's because of the size of the vector field and all the vector calculations. I'm just guessing.

I would also want to make improvements on the aesthetics. I thought about shadows and maybe having more detailed models for the birds and trees but I wasn’t sure how to without going against the topographic map aesthetic.

References

  • p5.js library — p5js.org
  • Daniel Shiffman, The Nature of Code: vector movement and flow fields (natureofcode.com)
  • AI disclosure: AI assistance was used for the mountain repulsion force implementation and for the contour line generation using the Marching Squares algorithm.

Saeed Lootah – Midterm Progress

Inspiration

At home in Dubai there are many palm trees around where I live. Since I was young I’ve always noticed birds travelling between trees and staying at some trees for a short while only to move again later. I always found their movement very calming so I wanted to replicate this in code and hopefully evoke the same emotions.

Core Concept and Design

The sketch simulates a flock of birds moving between trees. Trees are placed using the noise function where at the highest points in the noise function a tree is placed. Birds spawn at trees, fly toward a randomly chosen tree, and upon arrival wait 1–3 seconds before selecting a new destination. Their movement is influenced by a flow field (vector field) to create more organic movement.

 

Technical Implementation

This was my first significant milestone. By this point I had created the tree class and the bird class. The bird class has functions which I copied from the Mover class we have used/replicated in previous lectures. At first I used the random() function for the placement of the trees, but I would later implement the noise function I mentioned earlier.

This is the current stage that I'm at. I added the vector field and changed the color of the birds to brown without any stroke. The deviations in movement from the vector field were small at first; it was only after some time, and by taking advantage of the lowered background opacity, that I was able to notice the subtleties.

Future Improvements

Planned extensions to expand the system into multiple distinct modes:

1. No Mountains — Current baseline state
2. Mountains with Repulsion — Randomly placed mountains that birds navigate around using repulsion forces
3. Central Tree — A focal tree where birds travel back and forth in a more structured pattern
4. Nighttime / Daytime — Different visual states (e.g., color palette, lighting) to produce varied aesthetic outputs

These modes will increase the variety of visual outputs and support the requirement for multiple distinct operating states.

References

p5.js — p5js.org for the creative coding environment
Perlin Noise — Ken Perlin’s noise algorithm for smooth, natural-looking randomness
Flow Fields — Technique commonly used in generative art and particle systems

(Although the noise() function in p5.js is not exactly Perlin's original algorithm, it's still worth referencing.)

Saeed Lootah – Assignment 4

Concept

For this Assignment I was unsure where to start for a long time. I knew I was going to use sound in some way and that of course there would be harmonic motion. I also wanted to make my animation centered around a circle in some way.

Whilst reading through Memo Akten's work on Simple Harmonic Motion I noticed that at the end he mentioned "the Fourier series". For a physics experiment back in high school we used a technique called the Fast Fourier Transform (FFT) to analyze audio by breaking it up into individual sine waves of multiple frequencies which, when added together, reproduce the sound we were hearing. After some searching I realized that p5.js has the FFT feature built in, and so I decided I would use it.

Sketch

Unfortunately there are some small issues: the microphone doesn't work unless you give permission through your browser, and as for the sound, because I had to compress it to less than 5 MB it doesn't have the full range of frequencies it's supposed to.

Highlight
fft.analyze();
fftArray = fft.linAverages(resolution);

let x = width / 2;
let y = height / 2;
let radius = 180;

for (let index = 0; index < resolution; index++) {
  let progress = index / resolution;
  let theta = TWO_PI * progress;

  push();
  let selected = frequencyArray[index];
  translate(x, y);
  rotate(theta);
  noStroke();
  // fill(0, 0, 255);
  selected.show(radius, 0);
  selected.updateFFT(fftArray[index]);
  pop();
}

This is my personal favorite part of the code. What it does is analyze the incoming audio with the FFT, and then, for each frequency bin, rotate around the circle and draw the matching FreqLine while feeding it that bin's amplitude.
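The FreqLine class itself isn't shown; a hypothetical minimal version consistent with how it's called above (show(radius, 0) and updateFFT(amplitude)) could be:

// Hypothetical FreqLine: one rectangle whose length follows its FFT bin's amplitude.
class FreqLine {
  constructor(baseWidth = 6) {
    this.baseWidth = baseWidth;
    this.amplitude = 0;
  }

  updateFFT(binValue) {
    // binValue is 0..255 from fft.linAverages(); smooth it a little between frames.
    this.amplitude = lerp(this.amplitude, binValue, 0.3);
  }

  show(radius, offset) {
    const len = map(this.amplitude, 0, 255, 2, 120);
    rectMode(CORNER);
    // Drawn after translate()/rotate() in the main loop; the negative width
    // makes the bar grow toward the center of the circle.
    rect(radius + offset, -this.baseWidth / 2, -len, this.baseWidth);
  }
}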

Milestones

This was my initial experiment with using harmonic motion for a rectangle. I created a class called FreqLine (which I ended up using at the end), and all it did was use a sine wave to change the height of the rectangle accordingly.

This is my second milestone: after getting the rectangles to work I made multiple and arranged them into a circle shape. Each rectangle followed a different frequency, which created an interesting effect as the rectangles would start in phase with each other, then go out of phase, and come back.

At this point I experimented with different rectMode()’s trying CENTER, TOP, BOTTOM. Top and bottom gave the same result (which is what you see in the screenshot) whereas for CENTER it wasn’t as interesting in my opinion.

In the end I decided to use TOP, but I made the widths negative so that the rectangles would grow towards the center, as I enjoyed that look the most.

Reflection

I wish I hadn't stopped myself from starting. In the beginning I wasn't sure what to do, and I felt that if I didn't have an idea that was good enough I wouldn't start. Looking back, and for upcoming assignments, I plan on starting with any idea and being willing to do multiple sketches if I have to. It wasn't until I started with the circle idea that I thought to use the FFT, and it wasn't until I started the FFT that I decided to include the option to alternate between a song and a microphone. While these seem simple, I think with the midterm coming up it's important that I start experimenting.