Final Project – Presentation

Where It Started

During a class field trip to teamLab, I picked up a pencil drawing of a butterfly, colored it yellow, and slid it under a scanner. Seconds later it appeared on the floor, glowing and drifting between hundreds of other visitors’ creatures. Every other room at teamLab was something you walked through. This one was something you contributed to.

For Assignment 7, I tried to recreate that world in code. It looked alive. But it was passive. Nothing responded to the user. This project fixes that.

What It Became

The single butterfly becomes a flock of mixed creatures that chase the mouse, flock together, and trail light in night mode. The floor grows coral wherever the flock lingers and slowly recedes when they leave. The world shifts between warm dusk and deep night based on how spread out the flock is. And it listens through the microphone: a clap or any loud sound scatters the flock and darkens the world instantly.

How to Interact

Move the mouse to attract the flock. Left click to drop food. Right click to release a disturbance. Press S to scatter, C to calm, D to toggle night mode, and N to bloom coral at the mouse position. Three sliders in the top left control flock size, coral speed, and current strength.

Video Presentation

https://www.loom.com/share/c1f8f08127154339aac409b17878e322

Embedded Sketch

Final Project Progress

Overview

This post documents the development progress on my final project, a direct and substantial expansion of Assignment 7. The core idea is to take the passive, decorative floor world from that assignment and transform it into a living ecosystem the user can shape in real time. The flock moves through the same visual world as the original sketch, the floor creatures react to what passes above them, coral grows and recedes based on where the flock lingers, and the entire mood of the world shifts between warm dusk and deep night based on how the flock behaves. On top of all of that, the microphone listens to the room, and sound directly disturbs the ecosystem.

At this stage, all the major systems are built and running together. What remains is the UI sliders panel and a final visual polish pass.

Milestone 1 — Environment

The first thing built was the environment with no agents at all. The goal was to get the visual foundation right before anything moved.

The background is a gradient that runs from warm amber at the top to deep indigo at the bottom. Unlike Assignment 7 where the background cycled through palettes on a fixed timer, this one responds to the flock’s behavior — but at this stage it was set to dusk to verify the colors. Perlin noise generates god-ray light shafts that taper downward from the top of the canvas, shifting slowly each frame. Five large ambient light blobs drift across the scene, carried directly from Assignment 7. The perspective vanishing-point floor grid sits at the bottom third of the canvas with a warm glow rising from below.

The first challenge here was getting the god-rays to feel like light through water rather than painted stripes. The fix was using noise to vary the width and brightness of each ray independently rather than giving them a uniform appearance, so some are wide and bright and others are thin and barely visible. The result reads as diffused underwater light.
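In sketch form, the idea looks something like this (rayCount, the noise offsets, and the taper geometry are illustrative, not the exact project code):

// Each ray samples noise at its own offsets, so width and brightness
// vary independently: some rays wide and bright, others thin and faint
for (let i = 0; i < rayCount; i++) {
  let w = map(noise(i * 7.3, frameCount * 0.002), 0, 1, 5, 60);
  let b = map(noise(i * 7.3 + 500, frameCount * 0.002), 0, 1, 3, 28);
  let x = (i + 0.5) * (width / rayCount);
  noStroke();
  fill(255, 235, 190, b);
  // Wide at the top, tapering as the shaft descends
  quad(x - w, 0, x + w, 0, x + w * 0.2, height * 0.7, x - w * 0.2, height * 0.7);
}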

Milestone 2 — Floor Creatures

With the environment stable, the floor creatures from Assignment 7 were brought in next. All four designs — flowers, fish, lizards, and swirls — wander across the floor using the same wander steering from the original sketch: tiny random velocity nudges each frame produce organic, unpredictable paths that feel alive rather than mechanical.
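The nudge itself is only a few lines. A minimal version of the idea (the 0.05 magnitude and field names are illustrative):

// Inside a floor creature's update: a tiny random steering nudge
// each frame keeps the path organic rather than mechanical
this.vel.add(p5.Vector.random2D().mult(0.05));
this.vel.limit(this.maxSpeed);
this.pos.add(this.vel);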

At this point the sketch looks almost identical to Assignment 7. That is intentional. The point of this milestone was to confirm the visual foundation was intact before adding the new behavioral layer on top of it. Everything from the original — the wobble animation offsets, the soft vertical boundaries keeping creatures on the floor, the creature drawing code — carried forward without modification.

Milestone 3 — Flock and Flow Field

This is where the project becomes something genuinely new. The single butterfly from Assignment 7 is replaced by a flock of 65 agents, each randomly assigned one of the four creature designs. They are all the same behavioral system underneath but look completely different from one another, so a tight cluster contains spinning flowers next to fish next to swirling orbs, all moving as one body.

The flock uses three steering forces running simultaneously: separation keeps agents from crowding each other, alignment steers each agent toward the average heading of its neighbors, and cohesion pulls each agent toward the average position of its neighbors. On top of those, a Perlin noise flow field applies a gentle base current to every agent, and the mouse acts as a continuous attractor — the flock swims toward the cursor at all times. Moving the mouse slowly produces a calm trailing formation. Moving it quickly makes the flock chase and spread.

The floor creatures gained awareness at this milestone too. Each one finds the nearest flock agent every frame and, if it is close enough, turns toward it and glows. The glow fades smoothly when the flock moves away.

The most technically interesting part of this milestone was blending the flocking forces with the mouse seek force without the mouse completely dominating. The solution was limiting the seek force to a lower multiplier than separation and cohesion, so the flock follows the mouse as a group rather than every agent rushing toward the cursor independently.
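The shape of that weighting, sketched with illustrative multipliers and method names (the project's actual values differ):

// Seek gets a deliberately lower multiplier than the flocking forces,
// so the group follows the mouse without every agent rushing the cursor
let sep  = this.separation(flock).mult(1.8);
let ali  = this.alignment(flock).mult(1.0);
let coh  = this.cohesion(flock).mult(1.2);
let seek = this.seek(createVector(mouseX, mouseY)).mult(0.6);
this.applyForce(sep);
this.applyForce(ali);
this.applyForce(coh);
this.applyForce(seek);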

Milestone 4 — Coral Cellular Automata and Food

The coral system is what gives the floor memory. A grid of cells covers the floor zone. Each cell tracks how long the flock has spent above it. Sustained presence causes cells to grow through three visible stages: a small dim polyp, a growing mid-stage coral with layered orange and amber ellipses, and a full bloom with a glowing halo that intensifies in night mode.

Left-clicking drops a food source at that point. The flock detects it and converges tightly, clustering directly above it. A dense cluster above a cell accelerates its growth noticeably. The food dissolves after a few seconds and the flock reforms. Clicking several times in the same area produces a visible coral patch that persists even after the flock moves away, slowly receding over time.

The mouse also interacts directly with the coral independent of the flock. Cells near the cursor pulse with a slow breathing glow driven by a sine wave, and cells directly beneath a stationary mouse grow faster than cells the flock is simply passing over.

The challenge in this system was performance. Checking every flock agent against every coral cell every frame is expensive. The fix was sampling only agents within a fixed radius of each cell rather than checking the full flock, which kept the frame rate stable even with 65 agents and a dense grid.
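One way to structure that sampling (a sketch with illustrative names, not the project's exact code) is to invert the loop, so each agent deposits presence only into the few cells within its radius instead of every cell scanning the whole flock:

// Each frame, agents bump a presence counter on nearby cells only
for (let agent of flock) {
  let ci = floor(agent.pos.x / CELL_SIZE);
  let cj = floor(agent.pos.y / CELL_SIZE);
  for (let di = -RADIUS; di <= RADIUS; di++) {
    for (let dj = -RADIUS; dj <= RADIUS; dj++) {
      let i = ci + di, j = cj + dj;
      if (i >= 0 && i < gridCols && j >= 0 && j < gridRows) {
        presence[i][j]++;   // sustained presence pushes the cell toward bloom
      }
    }
  }
}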

Embedded Sketch

What Still Needs to Be Built

A UI sliders panel in the top right corner giving the user direct control over flock size, coral growth speed, and current strength. This is the last remaining interactive layer before the project is complete.

A final visual polish pass on the creature drawing, the coral bloom stages, and the transition smoothness between dusk and night.

Reflection on Progress

The most surprising result of building this was how much the mood system changed the feeling of the sketch. In early tests without it, the world was visually interesting but static in character — it always looked the same regardless of what the user did. Once the mood system was connected, the world started responding with a personality. A chaotic session where the user keeps pressing S and right-clicking feels genuinely different from a calm session where the flock clusters quietly and coral builds up slowly on the floor. The same code produces two completely different emotional experiences depending on how the user interacts.

The microphone input was the most technically uncertain part of the build. p5.sound requires the library to be loaded separately in the p5.js editor and the browser must grant microphone permissions before any level data is available. Once running, the effect is immediate and visceral — speaking or clapping near the laptop produces a visible shockwave through the flock that no mouse interaction can replicate.
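The core p5.sound pattern is small (the threshold and scatterFlock() are illustrative, not the project's exact code):

let mic;

function setup() {
  createCanvas(800, 600);
  mic = new p5.AudioIn();
  mic.start();                 // browser asks for microphone permission here
}

function draw() {
  let level = mic.getLevel(); // smoothed amplitude, roughly 0.0 to 1.0
  if (level > 0.25) scatterFlock(level); // loud sound disturbs the ecosystem
}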

References

Shiffman, D. (2024). The Nature of Code, v.2. https://natureofcode.com

Reynolds, C. (1999). Steering behaviors for autonomous characters. Game Developers Conference.

Wolfram, S. (2002). A New Kind of Science. Wolfram Media.

Assignment 7, teamLab “Color Your World” recreation (Bismark Buernortey Buer) — direct visual foundation.

Assignment 8, The Shoal (Bismark Buernortey Buer) — flocking system reference.

p5.sound library documentation. https://p5js.org/reference/p5.sound

AI disclosure: Claude (Anthropic) was used to help structure and articulate this progress documentation. 

Buernortey – Final Project Proposal

Concept

During a class field trip to teamLab, visitors could pick up a pencil drawing of a butterfly, flower, or lizard, color it in, and slide it under a scanner. Seconds later, their drawing appeared on the floor, glowing, animated, and moving freely through all the other visitors’ creations. I chose a butterfly and colored it yellow. Watching that specific butterfly appear on the floor and drift between everyone else’s drawings was unlike anything else in the installation. Every other room at teamLab was something you walked through. This one was something you contributed to. The floor felt like a collective painting that no single person made, a shared canvas where hundreds of people’s choices coexisted at once.

That feeling is what I tried to recreate in Assignment 7. The sketch built a glowing, color-shifting floor world populated by wandering creatures: flowers rotating their petals, fish with tails and fins, lizards with wagging tails and four legs, and swirling orbs of orbiting light, all drawn entirely in code with no images. A single yellow butterfly entered from the left edge and drifted organically through the crowd. The background cycled through deep blue, purple, and magenta gradients, with large soft blobs of colored ambient light drifting across the floor to simulate the projected pools of color from the real installation.

It looked alive. But it was passive. Nothing responded to the user. Nothing changed based on what anyone did. The world was the same at the end of a session as it was at the beginning. In my own reflection on that assignment I noted exactly what was missing: flocking behavior, user control, convincing depth, and a sense that the world had memory. This project delivers all of it.

The single butterfly becomes a flock of dozens of mixed creatures, all drawn from the same visual vocabulary as before but now governed by separation, alignment, and cohesion forces. The flock swims directly toward the user’s cursor at all times, so the mouse becomes a living presence in the water rather than a control dial. The floor creatures react to the flock above them. Where the flock lingers, coral grows on the floor through a cellular automata system, pulsing and glowing when the mouse hovers nearby and blooming faster when the mouse stays still. The entire world’s lighting mood shifts continuously between warm dusk and deep night based on how dense and calm the flock is. And the world listens: the microphone picks up sound in the room, and loud sounds scatter the flock like a shockwave while quiet lets the coral bloom undisturbed.

The user is no longer watching. They are conducting.

What Carries Forward

The visual foundation is preserved entirely. The perspective vanishing-point floor, the gradient background that cycles through color palettes, the five ambient light blobs drifting across the scene, and all four creature designs carry forward without modification. The hand-coded creature drawing system, the wobble animation offsets, the bezier butterfly wings, and the soft boundary system that keeps creatures on the floor plane all remain. The additive blending glow technique used for the butterfly trail becomes the basis for the coral bioluminescence and the mood lighting system.

What changes is everything above the visual layer: behavior, interaction, memory, and mood.

The World

The environment is the same floor world, now reframed as a deep-sea scene at dusk. The background gradient runs from warm amber at the top to deep indigo at the bottom and responds to flock behavior rather than cycling on a fixed timer. Perlin noise generates slow-moving god-ray light shafts that cut through the scene each frame, their intensity tied to the current mood state.

The flock occupies the mid-canvas zone. Each agent is randomly assigned one of the existing creature designs, so the flock is visually diverse: a tight cluster might contain spinning flowers next to fish next to swirling orbs, all moving as one body under the same steering forces. They are all the same behavioral system wearing different costumes.

The coral cellular automata grid covers the floor. Each cell tracks how long the flock has spent above it. Sustained presence causes cells to grow through stages from bare floor to young polyp to full coral bloom, drawn using layered colored ellipses with additive blending. Cells the flock abandons slowly regress. The floor is a living record of the session. Coral cells near the mouse cursor pulse with a slow breathing glow, and cells directly beneath a stationary mouse grow noticeably faster.

The mood system reads the average spacing between flock agents every frame and produces a single value between 0 and 1, where 0 is fully calm and dense and 1 is fully scattered and disturbed. This value drives the background palette interpolation, the intensity of the god-ray shafts, the brightness of the ambient blobs, and the prominence of the coral glow. A calm flock means warm amber dominates. A scattered flock means the scene goes dark and the coral becomes the primary light source. The transition is continuous and never snaps abruptly.
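A sketch of that mapping, with illustrative bounds and palette names (duskAmber and nightIndigo stand in for the real colors):

// Average pairwise spacing across the flock, normalized to 0..1
function computeMood(flock) {
  let total = 0, pairs = 0;
  for (let a of flock) {
    for (let b of flock) {
      if (a !== b) { total += p5.Vector.dist(a.pos, b.pos); pairs++; }
    }
  }
  let avgSpacing = total / pairs;
  return constrain(map(avgSpacing, 60, 300, 0, 1), 0, 1); // 0 calm, 1 scattered
}

// The single mood value then drives every visual layer, e.g. the palette:
let sky = lerpColor(duskAmber, nightIndigo, mood);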

Interaction

The project uses four simultaneous input channels. All four work at the same time and all four produce visible consequences in the world.

The mouse is the primary presence in the water. The flock swims toward the cursor continuously, so wherever the mouse moves the school follows. Moving slowly produces a calm, trailing formation. Moving quickly makes the flock chase frantically and spread out. Hovering still over the floor makes the coral beneath pulse and grow faster.

The microphone listens to the room through p5.sound. Quiet input lets the world stay warm and the coral bloom undisturbed. A loud sound, whether a clap, a voice, or music playing nearby, sends a shockwave through the flock, scattering every agent outward and darkening the world toward deep night instantly. Sustained loud input keeps the world in night mode. When the room goes quiet again the flock slowly reforms and the light returns. The amplitude of sound maps directly and continuously to the disturbance force on every agent.

The keyboard gives the user four dramatic state changes. Pressing S fires an explosive scatter force outward from the center of the flock. Pressing C activates a calm force that pulls every agent toward the flock’s center and slows them down, shifting the mood toward dusk. Pressing D forces the world into deep night immediately, making the coral glow at full intensity and dimming everything else to near black. Pressing N drops an instant coral bloom at the current mouse position.

Left-clicking drops a food source that the flock converges on, triggering rapid coral growth and a bioluminescent pulse spreading outward across the floor creatures. Right-clicking releases a disturbance ring that scatters the flock and darkens the world.

Course Concepts

  • Perlin noise drives the flow field the flock steers through and the god-ray light shafts shifting above.
  • Vector forces govern the seek force pulling agents toward the cursor, the sound-driven scatter shockwave, the food attraction force, and the keyboard scatter and calm impulses.
  • Oscillation animates the creature wobble, the coral pulse rhythm, and the breathing quality of the mood lighting transitions.
  • Particle system techniques handle the bioluminescent glow on the coral and the bloom effects using additive blending, carried directly from the butterfly trail in the previous assignment.
  • Autonomous agent steering powers the independent wander behavior of the floor creatures and their seek response toward the flock.
  • Flocking is the core behavioral engine governing the flock’s separation, alignment, and cohesion.
  • Cellular automata drives the coral growth and regression on the floor grid, with mouse proximity and sound amplitude both feeding into the growth rate.

Every system depends on at least one other. The flocking depends on the flow field. The coral depends on the flock. The mood depends on the flock density. The god-rays depend on the mood. The microphone feeds into the scatter force, the mood, and the coral suppression simultaneously. Nothing runs in isolation.

References

Shiffman, D. (2024). The Nature of Code, v.2. https://natureofcode.com

Reynolds, C. (1999). Steering behaviors for autonomous characters. Game Developers Conference.

Wolfram, S. (2002). A New Kind of Science. Wolfram Media.

Gardner, M. (1970). Mathematical games: The fantastic combinations of John Conway’s new solitaire game “Life.” Scientific American, 223(4), 120–123.

Assignment 7, teamLab “Color Your World” recreation (Bismark Buernortey Buer): https://decodingnature.nyuadim.com/2026/03/24/buernortey-assignment-7/

Kurokawa, R. — tension and release in generative audiovisual systems.

Hodgin, R. — flocking as visual composition.

p5.sound library documentation. https://p5js.org/reference/p5.sound

AI disclosure: Claude (Anthropic) was used to help develop and articulate the project concept and structure this proposal documentation.

Buernortey – Assignment 11

Concept

I grew up in a fishing community in Ghana. Every morning, fishermen would cast their nets into the water. The net would fan out in a wide circle, hang in the air for a moment, then hit the surface and sink. That image stayed with me.

This sketch simulates that moment using cellular automata. Clicking the canvas casts a net from that point. The net expands outward as a glowing ring. Cells are born at the edge of the ring, age as it passes through them, and fade back into dark water. Multiple nets can be cast at once. Their rings spread, overlap, and dissolve.

The water is dark. The threads glow warm white, then cool to teal, then disappear. It is meant to feel like watching from the shore at dawn.

Code Highlight

This is the castNet() function. It is the part I am most proud of.

function castNet(mx, my) {
  // Convert the mouse position to grid coordinates
  let ci     = floor(mx / CELL_SIZE);
  let cj     = floor(my / CELL_SIZE);
  let radius = 2;

  // Scan a small square around the click point...
  for (let di = -radius - 1; di <= radius + 1; di++) {
    for (let dj = -radius - 1; dj <= radius + 1; dj++) {
      let dist = sqrt(di * di + dj * dj);
      // ...but plant cells only in a thin band around the radius,
      // so the seed is a hollow ring, not a filled blob
      if (dist >= radius - 0.5 && dist <= radius + 0.5) {
        // Toroidal wrap keeps nets cast near an edge intact
        let ni = (ci + di + cols) % cols;
        let nj = (cj + dj + rows) % rows;
        grid[ni][nj] = 1;   // state 1 = young, freshly cast thread
      }
    }
  }
}

When a user clicks, a small ring of cells is planted. The ring shape is important. A single dot would expand as a filled blob. A ring seed gives the CA the right structure to start expanding as a hollow circle. The dist check selects only cells that fall on the boundary of that radius. This is what makes it look like a cast net rather than a spreading stain.

The rules in stepGrid() do the rest:

if (cur === 0) {
  // Dead water: a cell is born if 2 or 3 young neighbors touch it
  if (youngNeighbors === 2 || youngNeighbors === 3) next = 1;

} else if (cur === 1) {
  // Young thread: survives with 1 or 2 young neighbors, else starts aging
  if (youngNeighbors === 1 || youngNeighbors === 2) next = 1;
  else next = 2;

} else {
  // Aging thread: advance one state per step, then fade back to water
  next = cur + 1;
  if (next >= NUM_STATES) next = 0;
}

Cells on the outer edge of the ring always have enough young neighbors to birth new cells one step further out. Cells on the inside do not. This is what causes the ring to expand rather than fill.

Embedded Sketch

Controls:

  • CLICK to cast a net
  • SPACE to play or pause
  • C to clear the water
  • UP / DOWN to change thread length
  • [ and ] to change speed

Milestones and Challenges

Starting with the ring seed. The first version planted a single cell on click. It expanded as a diamond-shaped blob. It looked nothing like a net. Switching to a ring seed of radius 2 changed the behavior immediately. The CA had enough structure to expand in a circle.


Getting the birth rule right. Using exactly 3 young neighbors caused the ring to break apart into fragments. Changing to 2 or 3 young neighbors kept the ring solid as it expanded. That one number made the difference between scattered dots and a clean circle.

The color palette. The colors needed to feel like water. I started with bright saturated colors and they felt wrong. I moved to warm white for fresh threads, cooling through teal, then fading back to dark water. The result feels like something glowing underwater.

Multiple overlapping nets. Casting two nets close together causes their rings to collide. Where they overlap, the birth rules create unexpected new patterns. This was not planned. It came from the rules themselves.

Challenges. The biggest issue was performance on a large canvas. Switching from a regular JavaScript array to a Uint8Array improved frame rates noticeably. The toroidal wrapping also caused edge bugs early on; the modulo trick (ci + di + cols) % cols handles cells at the canvas boundary cleanly.

Reflection

The moment the ring seed worked was when the project finally felt like something. Before that it was just another cellular automata sketch. Once the net started expanding from a click, it became connected to something real.

The overlap behavior surprised me. Two rings colliding was not designed. It came from the rules. The CA did something I did not ask it to do, and it looked right. That is what makes cellular automata interesting.

The sketch is quiet. Dark water, glowing threads, and a click. That is enough.

Ideas for Future Work

Add sound. The moment of casting could trigger a soft water sound. The expanding ring could carry a faint ambient tone that fades with the threads.

Add wind. A directional force could skew the ring as it expands so the net drifts the way a real net does in current.

Add retraction. After the ring reaches a certain size, it could collapse back toward the origin. This would complete the full cast and pull motion.

Add drag. Holding and dragging before releasing could control the direction and force of the cast.

References and Inspiration

Conway’s Game of Life — birth and survival logic.

Nature of Code, Chapter 7 — grid structure and neighbor counting.

Zach Lieberman — finding personal meaning in computational work.

Growing up in a fishing community in Ghana — the real inspiration.

Buernortey – Assignment 10

Concept

Erosion is a physics simulation of a rocky mesa breaking apart under falling debris. The idea came from thinking about how landscapes that look permanent are always being shaped by small, repeated impacts over time. I wanted to use Matter.js not just to show objects falling but to model a process. The terrain starts whole and breaks down based on how hard things hit it.

Code I Am Proud Of

The collision event handler is the center of the piece. It does not react to every contact. It reads the speed of the incoming rock and only shatters a terrain block when the impact crosses a threshold. This makes the simulation feel grounded. Slow rocks bounce off. Fast rocks break things.


// Assumes the usual Matter.js module aliases, e.g.:
// const { Events, Vector } = Matter;
Events.on(engine, 'collisionStart', function(event) {
  for (let pair of event.pairs) {
    let { bodyA, bodyB } = pair;
    let rock = null, ter = null;

    // A collision pair can arrive in either order, so check both
    if (bodyA.label === 'rock' && bodyB.label === 'terrain') { rock = bodyA; ter = bodyB; }
    if (bodyB.label === 'rock' && bodyA.label === 'terrain') { rock = bodyB; ter = bodyA; }

    if (rock && ter) {
      // Only impacts above the speed threshold shatter terrain;
      // slower rocks simply bounce off
      let impactSpeed = Vector.magnitude(rock.velocity);
      if (impactSpeed > 3.5) {
        let idx = terrain.findIndex(t => t.body === ter && !t.broken);
        if (idx !== -1) shatterTerrain(idx, impactSpeed);
      }
    }
  }
});


I am also proud of the shatterTerrain function because it does three things at once when a block breaks. It triggers a screen shake scaled to the impact force, spawns a flash at the contact point, and sends out a burst of dust particles that drift upward and fade out. Each of those effects runs from the same impact speed value, so they all feel connected.


Embedded Sketch

Milestones and Challenges

The first milestone was drawing the terrain. I wrote a heightmap function that builds a mesa shape that peaks in the center and slopes toward the edges, with small random offsets per column to roughen the silhouette. I added a sky gradient and a highlight strip on the top face of each block so the structure looked like real layered rock before any physics was involved.
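A minimal sketch of that heightmap idea (the column count, peak height, and jitter range are illustrative):

// Mesa profile: peaks in the center, slopes toward the edges,
// with a small random offset per column to roughen the silhouette
function buildHeightmap(cols, maxH) {
  let heights = [];
  for (let i = 0; i < cols; i++) {
    let t = 1 - abs(i - cols / 2) / (cols / 2); // 1 at center, 0 at edges
    let h = t * maxH + random(-8, 8);
    heights.push(max(h, 0));
  }
  return heights;
}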

The second milestone was loading that terrain into Matter.js as static bodies. Each block became a rectangle body placed at the correct position. Drawing them back from their body positions confirmed the physics world and the visual layer matched.

The third milestone was getting rocks to fall with physical variety. Rocks now vary in size, fall speed, bounciness, friction, and spin. Density scales with size so large boulders hit harder. Wind drifts slowly over time toward a random target every three seconds, and the visual rain streaks angle to match. This step exposed the first real challenge: collision filters. Without explicit category and mask values set on each body type, rocks passed straight through the terrain. Setting category: 0x0002 and mask: 0x0001 on the terrain bodies fixed which bodies could interact with which, as sketched below.
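Those filter values go into the Matter.js body options (positions and sizes here are illustrative):

const { Bodies } = Matter;

// Terrain: category 0x0002, collides only with category 0x0001 bodies
let block = Bodies.rectangle(200, 400, 40, 40, {
  isStatic: true,
  label: 'terrain',
  collisionFilter: { category: 0x0002, mask: 0x0001 }
});

// Rocks keep Matter's default category 0x0001; including the terrain
// bit in their mask lets them hit terrain as well as each other
let rock = Bodies.circle(220, 0, 12, {
  label: 'rock',
  collisionFilter: { category: 0x0001, mask: 0x0002 | 0x0001 }
});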

The fourth milestone was the full shattering system with visual feedback. When a collision event fires and the impact speed clears the threshold, the terrain block is removed, fragments spawn and tumble, dust particles burst outward, a flash appears at the contact point, and the canvas shakes. The main challenge here was performance. Without cleanup, thousands of fragment, dust, and rock bodies built up below the canvas over time and slowed the simulation. Adding a per-frame filter that removes any body or particle whose position exceeds the canvas height solved the problem completely.
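A sketch of that cleanup pass, assuming the dust particles live in a plain array alongside the physics bodies (dust and the 200-pixel margin are illustrative):

const { Composite } = Matter;

function cleanupOffscreen() {
  // Remove any dynamic body that has fallen well below the canvas
  for (let body of Composite.allBodies(engine.world)) {
    if (!body.isStatic && body.position.y > height + 200) {
      Composite.remove(engine.world, body);
    }
  }
  // Drop visual-only particles the same way
  dust = dust.filter(p => p.y < height + 200);
}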

Reflection

The piece works well as a slow process. Dropping one rock at a time and watching the terrain wear down has a satisfying quality. The storm mode makes the erosion visible in seconds and shows how the system handles high load. The wind system adds unpredictability without feeling random because it drifts gradually rather than jumping between values.

For future work I want to add water. Fragments that collect at the bottom could be slowly submerged as a rising water level fills the valleys left behind by the erosion. I also want to track how many blocks have been destroyed and show it as a live counter so the viewer has a sense of scale over the life of the simulation.

References

The Nature of Code, Chapter 6: Physics Libraries.

Matter.js documentation on collision events and body properties.

Buernortey – Assignment 9

Concept

In the deep ocean, bioluminescence is not just decoration. Many creatures glow brighter when threatened. Firefly squid, siphonophores, certain jellyfish species all do this. The light is involuntary. It is a biological fear signal, and every creature nearby can read it.

That is the idea behind this sketch. It is a flocking system where light and movement share the same variable. Each boid’s proximity to a descending predator controls both how hard it steers away and how brightly it glows. Fear is brightness. The swarm illuminates itself at the exact moment it is most in danger.

The piece moves through three acts. First, creatures drift in near-total darkness, sparse, slow, barely visible. Then a shadow descends from above. Not fast, not aggressive, just a pressure. The boids closest to it sense it first and begin moving away, the fear-glow spreading outward through the swarm. Finally the predator arrives fully and the swarm explodes outward in a burst of light. Then silence. The survivors scatter into the dark, dimmer than before.

Embedded Sketch

Code I’m Proud Of

The piece’s central idea lives in about ten lines inside the Boid class. this.fear is a single float between 0 and 1 computed from distance to the predator. It does two jobs at once: it scales the steering force pushing the boid away, and it is read by draw() to amplify the glow radius and opacity.

// Inside applyPhaseForces(), descend phase:
let toPred   = p5.Vector.sub(this.pos, createVector(pred.x, pred.y));
let d        = toPred.mag();
let fearZone = 280;

if (d < fearZone) {
  this.fear = map(d, 0, fearZone, 1.0, 0);
  toPred.normalize().mult(this.fear * 0.38);
  this.acc.add(toPred);
} else {
  this.fear = 0;
}

// Inside draw(), where fearGlow is the boid's fear value read by the renderer:
const baseAlpha = 0.60 + fearGlow * 0.40;
const glowR     = this.size * 2.2 + fearGlow * 5;

The same number that moves the creature also lights it up. I did not need a separate brightness system. That reduction felt like the sketch finding its own logic rather than me imposing one.

Milestones and Challenges

Milestone 1: Basic flocking

The first working version was just the three steering forces running on a plain black background. No phases, no predator, no glow. Getting alignment, cohesion, and separation balanced took more time than expected. At equal weights the boids collapsed into a tight unmoving ball. I had to bring cohesion down significantly and give separation more authority before the swarm started feeling alive.

Milestone 2: Adding bioluminescent glow

Once the flocking was stable I replaced the flat ellipse with a radial gradient halo drawn through drawingContext. This is where the creatures started feeling like they lived underwater rather than on a screen. I also added the teardrop body shape and started assigning each boid a random hue in the cyan-to-violet range.
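The halo itself is a standard Canvas radial gradient reached through p5's drawingContext (radii and color stops here are illustrative):

function drawGlow(x, y, r, col) {
  // col should be a CSS color string, e.g. 'rgba(120, 220, 255, 0.8)'
  let g = drawingContext.createRadialGradient(x, y, 0, x, y, r);
  g.addColorStop(0, col);                  // bright core
  g.addColorStop(1, 'rgba(0, 0, 0, 0)');   // fades to nothing at the edge
  drawingContext.fillStyle = g;
  drawingContext.beginPath();
  drawingContext.arc(x, y, r, 0, TWO_PI);
  drawingContext.fill();
}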

Milestone 3: The predator as pressure, not shape

My first predator was a solid dark ellipse and it looked like a game obstacle. The fix was removing any hard edge entirely and replacing it with a radial gradient that fades to nothing — an absence of light rather than a presence of shape. This one change made the whole sketch feel more like an environment and less like a simulation.

Milestone 4: Fear driving both movement and light

This was the central technical challenge. Once the predator was descending I needed the boids to respond to it — not just steer away, but glow brighter as they got closer. The insight was that these could be the same number. I computed this.fear as a map() of distance and fed it into both the physics and the renderer simultaneously.

Reflection

The fear-as-light mechanism worked the way I hoped. Watching the swarm light up at the moment of greatest danger, because of the danger, gave the piece a logic that felt biological rather than programmed.

A few directions I would take this further. Sound is the most obvious missing layer. The panic phase has a visual density that feels like it needs a corresponding audio response, something close to how Kurokawa synchronizes brightness and sound intensity. The predator could also be made reactive rather than scripted, hunting the brightest cluster in the swarm. This would create a feedback loop where fear-glow attracts the very thing the swarm is afraid of. Individual boid memory would also add depth. Creatures that were nearly caught could stay darker and more erratic for longer, while those that escaped early return to calm faster. Trauma as a behavioral variable rather than just a visual one.

References

Ryoichi Kurokawa’s audiovisual works were a direct influence, specifically how he treats light density as a rhythmic and emotional variable rather than aesthetic decoration. Robert Hodgin’s fluid creature systems shaped how I wanted the boids to feel: biological rather than mechanical. 

Buernortey – Assignment 8: The Shoal

Concept

Growing up in a fishing community in Ghana, I watched fishermen read the water. Not just the tides, but the fish themselves. A school of fish moves like a single organism, splitting around rocks, scattering from shadows, reforming behind the boat. No one fish is giving orders. The pattern emerges from each individual following a few simple rules about its neighbors.

That memory became the concept for this sketch. The Shoal is a system of 60 fish navigating an ocean current, staying together as a group, and reacting to a predator that tracks your mouse cursor. The behaviors are inspired directly by Craig Reynolds’ steering model: each fish knows nothing about the whole. It only senses what is close to it. Yet the group produces complex, lifelike motion.

The color palette is drawn from the Ghanaian flag (red, gold, and green), and each color encodes a live behavioral state. Green means a fish is flocking normally. Red means it is fleeing the predator. Gold means it has drifted away from the group and is wandering on its own.

A Highlight of Code I’m Proud Of

The piece of code I kept returning to is the flock() method, the behavior composer that decides frame by frame what each fish should be doing. What I love about it is how it uses the same underlying steering primitives in completely different combinations depending on context.

flock(others, pred, flow) {
  let d = dist(this.pos.x, this.pos.y, pred.pos.x, pred.pos.y);
  let fleeing = d < FLEE_RADIUS;

  if (fleeing) {
    // Run directly away from the predator at boosted speed
    let flee = this._flee(pred.pos);
    flee.mult(2.5);
    this.applyForce(flee);
    this.maxSpeed = FISH_MAX_SPEED * 1.5;
    this.bodyColor = PAL.red;
  } else {
    this.maxSpeed = FISH_MAX_SPEED;   // reset any leftover flee speed boost
    // Normal flocking: separation keeps spacing, alignment matches heading,
    // cohesion pulls toward the group center using arrive()
    let sep = this._separation(others);
    let ali = this._alignment(others);
    let coh = this._cohesion(others);

    sep.mult(1.8);
    ali.mult(1.0);
    coh.mult(1.2);

    this.applyForce(sep);
    this.applyForce(ali);
    this.applyForce(coh);

    // Flow field gives the fish a subtle current to drift with
    let flowForce = flow.lookup(this.pos);
    flowForce.setMag(this.maxSpeed * 0.6);
    let flowSteer = p5.Vector.sub(flowForce, this.vel);
    flowSteer.limit(this.maxForce * 0.5);
    this.applyForce(flowSteer);

    // Isolated fish switch to wander
    let neighbors = this._countNeighbors(others, COH_RADIUS);
    if (neighbors < 3) {
      this.bodyColor = PAL.gold;
      this.applyForce(this._wander());
    } else {
      this.bodyColor = PAL.green;
    }
  }
}


What made this click for me is that _arrive() is called inside _cohesion(), so even the group behavior uses the same arrive logic I first learned as a single vehicle seeking a target. One method, three behavioral contexts: cohesion toward the group center, the predator arriving at the cursor, and the wander behavior projecting a circle ahead of itself. Reusing the same primitive in different combinations was the most satisfying part of this assignment.

Embedded Sketch


Move your cursor to control the predator. Watch the shoal react. Fish nearest the predator turn red and scatter, isolated fish turn gold and wander, and the rest stay green and hold formation together.

Milestones and Process

Phase 1 — One fish, seek and arrive

I started with a single fish vehicle following the mouse. The goal was to make sure the arrive behavior felt right before scaling up, because arrive is the foundation everything else builds on. At this stage it was just one ellipse with a triangle tail, but the deceleration as it approached the cursor already felt organic.

The main challenge here was getting the arrive slowing to feel natural rather than mechanical. My first attempt applied the speed reduction too early, so the fish would crawl painfully slowly from far away before even reaching the target zone. I had to narrow the slowdown window to only the final 100 pixels, leaving full speed everywhere outside that range, before the motion started to feel right.
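The fix, sketched (the 100-pixel window is from the text; method and field names are illustrative):

_arrive(target) {
  let desired = p5.Vector.sub(target, this.pos);
  let d = desired.mag();
  // Full speed everywhere except the final 100 pixels,
  // where speed scales down linearly to zero at the target
  let speed = d < 100 ? map(d, 0, 100, 0, this.maxSpeed) : this.maxSpeed;
  desired.setMag(speed);
  let steer = p5.Vector.sub(desired, this.vel);
  steer.limit(this.maxForce);
  return steer;
}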

Phase 2 — Scaling to a shoal, adding flocking

I scaled from one fish to 40 using a loop and implemented separation, alignment, and cohesion as three separate steering forces composed together.

Tuning the force multipliers was the hardest part of this phase and took the most iteration. My first set of values had separation too weak, so fish constantly clipped through each other and the group looked like a blob rather than a school. Raising it too far in the other direction made the shoal explode outward and never reform. I also had alignment weighted too heavily early on, which made every fish lock into the same heading so rigidly that the group moved like a marching band rather than a living thing. Getting to sep × 1.8, ali × 1.0, coh × 1.2 took many small adjustments and a lot of watching the sketch run.

Phase 3 — Predator, flee, wander, and color states

The predator changed the whole feel of the sketch. Once a dark fish that tracked the mouse was introduced, the shoal became reactive rather than passive. Flee was implemented as a reversed seek: instead of steering toward the target, the fish steers directly away from it with a force multiplied at 2.5× for urgency.

The flee force caused a problem I did not anticipate. With the multiplier too high, fish near the predator would accelerate so violently that they shot across the entire canvas in a single frame and wrapped around to the other side, which looked completely wrong. I had to pair the force multiplier with a capped maxSpeed boost rather than an uncapped acceleration, so the urgency comes through in the speed increase without the motion becoming physically implausible. Getting the flee radius right also took several attempts. Too large and fish were permanently panicking even when the predator was nowhere near them. Too small and the reaction looked delayed and unconvincing.

Phase 4 — Flow field, ocean background, full fish anatomy

The final layer was a Perlin noise flow field that gives the whole canvas a gentle ocean current. Each fish looks up the flow vector at its position and applies it as a weak additional force.

The challenge here was finding the right weight for the flow force relative to the flocking forces. In early versions I had it too strong, and it completely overrode the cohesion and alignment behavior. Fish stopped schooling and just drifted in the same direction like debris, which defeated the whole point. Pulling it back to maxSpeed × 0.6 with a force limit at half of maxForce made it feel like an environmental influence rather than a controlling force — the fish push through it, but you can see them drifting slightly when nothing else is competing for their attention.

Reflection and Ideas for Future Work

The biggest surprise was how little code produces this result. The entire behavioral system, 60 fish with four distinct modes, comes down to three vector additions per frame per fish, each about five lines long. What Reynolds figured out in the 1980s still feels almost unreasonably powerful.

The Ghanaian flag palette was a natural choice for me. I had already used it in my midterm project, and it works well here because each color carries cultural weight while also reading clearly as a data signal. You understand the system’s state at a glance just from the color distribution across the canvas.

Ideas for future work include adding a real food source, a glowing anchor point the shoal seeks when the predator is far away, which would complete the full foraging cycle. Two competing shoals in different colors racing for the same food would also be compelling. Letting individual fish leave a faint pheromone trail that gradually fades would make the paths the school carves through the water visible over time. And sound would add another dimension: low ambient ocean audio that pitches up slightly when the shoal is stressed and fleeing.

References and inspiration:

  • The Nature of Code, Chapter 5 (Autonomous Agents) by Daniel Shiffman, which provided the steering force formula the entire system is built on
  • Craig Reynolds, Steering Behaviors for Autonomous Characters, the original separation, alignment, and cohesion framework
  • Braitenberg Vehicles, and the idea that complex behavior can emerge from extremely simple rules
  • Personal memory of fishing communities in the Central Region of Ghana

Midterm Project – Adinkra Particle System

Midterm Project Overview

This project is a generative art system built in p5.js, centered on three Adinkra symbols from Ghana. Adinkra are visual symbols created by the Akan people. Each one encodes a philosophical proverb, a value, or a worldview that has been passed down through cloth, pottery, and architecture for centuries. Growing up Ghanaian, these symbols have always been part of my visual landscape. For this project, I wanted to bring them into a computational one.

The core idea is this: the symbols are invisible. There are no outlines drawn on screen. Instead, each symbol exists as a mathematical force field — a set of curves that attract particles toward them. Particles are born on the symbol’s edges, drift away through Perlin noise turbulence, and are pulled back by physics-based forces. What you see is not a drawing of the symbol. It is the symbol’s behavior, made visible through collective motion.

The system has four modes, switchable by pressing 1, 2, 3, or 4. Each mode corresponds to a different symbol or combination of symbols, with a distinct color palette drawn from the Ghanaian national flag: red, gold, and green on a black field.

Initially, the system had four modes built on particle-maze navigation — goal attraction, wall repulsion, and turbulence fields. The final version takes a completely different direction: instead of walls shaping particle paths from the outside, the symbol’s own geometry becomes the invisible attractor. The maze is gone. The symbol is the maze.

The Three Symbols

Choosing which Adinkra symbols to use was not a technical decision — it was a personal one. I needed symbols I could relate to and explain with honesty, not just describe. These three are the ones I keep returning to.

Mode 1 — Sankofa

“Se wo were fi na wosankofa a yenkyi” — It is not wrong to go back and retrieve what you forgot.

Sankofa exists in two visual forms. The one used in this project is the abstract heart form — the version stamped on cloth, carved into gold weights, and worn on ceremonial fabric across Ghana. The symbol is a heart body — two lobes that sweep down and meet at a pointed base — but what distinguishes it from a plain heart are the spirals. At the very top, where the two lobe lines meet in a V, each side continues past that meeting point and curls inward into the heart’s own interior. The left line curls down-right in a clockwise spiral, the right line curls down-left counter-clockwise. These inner spirals nestle inside the heart. At the bottom of the heart, flanking the pointed tip, two smaller spirals curl outward — away from the body, like feet planted on the ground. The whole symbol is bilaterally symmetric and deliberate in every curve.

As a Ghanaian studying abroad, this symbol means something specific to me. The further I move from home — geographically, culturally, academically — the more I feel the pull of that backward glance. Sankofa is not about being stuck in the past. It is about knowing what to carry with you.

In the system, Sankofa is rendered in red and gold — blood and heritage. Particles spawn across the full outline: both heart lobes, the two inner V-extension spirals, and the two outward bottom spirals. Perlin noise pushes particles away from the outline. A physics force pulls every particle back toward its travelling target point on the symbol. That constant tension between leaving and returning enacts the proverb directly in the particle physics.

Mode 2 — Gye Nyame

“Gye Nyame” — Except God. A declaration of the supremacy and omnipotence of God.

Gye Nyame is the most widely used Adinkra symbol in Ghana. You see it on walls, on fabric, on the backs of tro-tros, carved into doorframes, and on the Ghanaian 200 cedi (currency) note. It is not affiliated with any single religion — it expresses a universal acknowledgment that there is a force greater than human understanding. In Akan culture, Nyame is the origin and sustainer of all things.

The structure of Gye Nyame is unlike any other symbol. Running down the center is a chain of four alternating C-scroll knobs — bulging left, then right slightly lower, then left again, then right again — like the knuckles of a clenched fist stacked vertically. These give the symbol its distinctive textured spine. From the top of this spine, one large arm sweeps out to the upper-left in a wide arc, and its tip hooks back downward. From the bottom of the spine, a matching arm sweeps out to the lower-right and its tip hooks back upward. These two diagonal arms are not mirror images of each other across a horizontal axis — they are a 180-degree rotation of each other, which is why the symbol is described as chiral: it looks different from its own reflection. That diagonal asymmetry is the most identifiable thing about Gye Nyame.

In the system, particles spawn across all features — the four alternating spine knobs and both fishhook arms. Each particle travels along the outline continuously, with a sinusoidal oscillation displacing its target perpendicularly so the arms appear to breathe. The palette is gold and green — divine and natural, sun and land.

Mode 3 — Adinkrahene

“Chief of Adinkra” — greatness, charisma, and leadership.

Adinkrahene — the chief of all Adinkra symbols — is structurally the simplest: three concentric circles. Its power is architectural. It is said to have inspired the design of many other Adinkra symbols, which is why it sits at the head of the entire system. Simplicity as authority.

In the system, each of the three rings carries a different flag color: the inner ring is red, the middle ring is gold, and the outer ring is green. This mirrors the horizontal bands of the Ghanaian flag radiating outward from a center, the same way leadership radiates outward from a source. About 18% of particles are radiators — born at the center and travelling outward through all three rings before fading. They represent authority emanating from a single point.

Mode 4 — Composite

The fourth mode draws all three symbols at the same time using separate particle sub-systems. I wanted to experiment around it and see the outcome. The three force fields overlap and interact. Where Sankofa’s heart body overlaps with Adinkrahene’s inner ring, red particles from both systems cluster into unplanned concentrations. The symbols coexist the way traditions coexist, distinct but not isolated.

Implementation Details

The system is a single p5.js sketch organized into four layers: a mode system that handles keyboard input and configuration, a geometry layer that defines the mathematical outlines of each symbol, a physics layer that computes forces, and three particle classes — one per symbol — each managing its own movement, behavior, and rendering.

From the Progress Version to the Final Version

The progress version was a functional system built on maze navigation — particles moved through walls using goal attraction, wall repulsion, and Perlin noise turbulence. The technical foundation was solid. What it lacked was a conceptual anchor: the modes were mechanically distinct but did not say anything together.

The pivot to Adinkra symbols changed the project completely. Instead of walls shaping particle paths from the outside, the symbol’s own geometry became the invisible attractor. The maze walls were removed. The physics stayed. The symbols became the maze.

Particle System

All four modes are built on a particle system. Each mode maintains a pool of 900 to 1,100 particles (2,400 in composite mode). Rather than destroying and recreating particles, the system calls reset() on a particle when it dies, recycling it with a new spawn position, velocity, color, and lifespan. This keeps memory usage flat and the frame rate stable throughout the session.
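The recycling pattern, sketched against the Sankofa mode (spawn velocity and lifespan values are illustrative):

// When a particle's life runs out, reuse it instead of reallocating
update() {
  // ...motion code...
  this.life--;
  if (this.life <= 0) this.reset();
}

reset() {
  let spawn = random(sankofaLUT);   // reborn somewhere on the outline
  this.pos = spawn.copy();
  this.prev = this.pos.copy();
  this.vel = p5.Vector.random2D().mult(random(0.2, 0.8));
  this.maxLife = random(120, 300);
  this.life = this.maxLife;
  this.noiseOff = random(1000);
}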

Every particle stores its previous position alongside its current one. Each frame, it draws a line segment from prev to pos before updating prev. This is what creates the motion trail. The trail length is controlled by fadeAlpha — the transparency of the dark wash applied over the entire canvas each frame. A lower value means longer, slower-fading trails.

// In draw() — dark wash creates motion trails
noStroke();
fill(0, 0, 7, fadeAlpha);
rect(0, 0, width, height);
// Inside any particle's show() method
show() {
  let a = (this.life / this.maxLife) * this.alp;
  stroke(this.hue, this.sat, this.bri, a);
  strokeWeight(max(this.r * (this.life / this.maxLife), 0.4));
  line(this.prev.x, this.prev.y, this.pos.x, this.pos.y);
  this.prev = this.pos.copy();
}

Forces and Newton’s Second Law

Particle motion is governed by F = ma. Each particle has a mass property. When a force is applied, it is divided by the particle’s mass before being added to acceleration. Heavier particles respond more slowly to the same force, which gives the system organic weight variation across the particle pool.

applyForce(f) {
  // F = ma  →  a = F / m
  this.acc.add(p5.Vector.div(f, this.mass));
}

In all three symbol modes, two forces act on every particle simultaneously. The first is a force toward a travelling target point on the symbol’s outline — this keeps the particle anchored to the geometry. The second is Perlin noise drift — this gives the particle organic, independent energy so it does not look mechanical. The balance between these two forces is what determines how tightly the symbol reads versus how alive the system feels.

update(sm) {
  // Advance t along the outline
  this.travelT += this.travelSpd * sm;
  if (this.travelT > 1) this.travelT -= 1;
  if (this.travelT < 0) this.travelT += 1;

  // Force 1: pull toward travelling target on outline
  let idx = floor(this.travelT * sankofaLUT.length) % sankofaLUT.length;
  let tgt = sankofaLUT[idx].copy();
  let toTarget = p5.Vector.sub(tgt, this.pos);
  let d = toTarget.mag();
  toTarget.normalize();
  toTarget.mult(constrain(d * 0.043, 0, 2.2));
  this.applyForce(toTarget);

  // Force 2: Perlin noise drift
  let na = noise(this.pos.x * 0.006, this.pos.y * 0.006,
                 frameCount * 0.003 + this.noiseOff) * TWO_PI * 2;
  let dr = p5.Vector.fromAngle(na);
  dr.setMag(0.11 * sm);
  this.applyForce(dr);

  this.vel.add(this.acc);
  this.vel.limit(3.2 * sm);
  this.pos.add(this.vel);
  this.acc.mult(0);
  this.life--;
}

Travelling Along the Outline (Orbital Motion)

The most important motion decision in the final version was the introduction of travelT — a normalized parameter (0 to 1) that advances along the precomputed outline look-up table every frame, at a random speed and random direction (some particles travel clockwise, some counter-clockwise). This is directly equivalent to how Adinkrahene’s ring particles advance their theta angle around the circle every frame.

Before this change, Sankofa and Gye Nyame particles only moved by being attracted toward a static nearest point on the outline. They jittered in place rather than flowing. Adding travelT gave them continuous directional motion along the symbol — the same quality that made Adinkrahene feel fluid.

// travelT advances along the LUT each frame — equivalent to
// theta advancing around Adinkrahene's ring.
// Random speed + random direction gives each particle
// independent orbital motion along the symbol outline.
this.travelT += this.travelSpd * sm;
if (this.travelT > 1) this.travelT -= 1;
if (this.travelT < 0) this.travelT += 1;

let idx = floor(this.travelT * sankofaLUT.length) % sankofaLUT.length;
let tgt = sankofaLUT[idx].copy();

Oscillation

Each particle has its own independent oscillation parameters: oscAmp (amplitude), oscFreq (frequency), and oscPhase (starting phase offset). Every frame, the particle’s target point is displaced sinusoidally perpendicular to the outline — so the symbol appears to breathe in and out rather than holding a rigid fixed shape. Because every particle has a different phase, the breathing is organic and asynchronous across the full outline.

// Compute perpendicular direction to the outline at target point
let toTgt = p5.Vector.sub(tgt, this.pos);
let perp  = createVector(-toTgt.y, toTgt.x);
if (perp.mag() > 0.01) perp.normalize();

// Displace target sinusoidally — symbol breathes in and out
let osc = this.oscAmp * sin(frameCount * this.oscFreq * sm + this.oscPhase);
tgt.add(p5.Vector.mult(perp, osc));

Perlin Noise

Perlin noise is used across all four modes to add organic drift to particle motion. Unlike random(), which produces sharp, uncorrelated values, noise() produces smooth continuous fields that evolve over time. The noise is sampled in three dimensions: x and y from the particle’s position, and a time dimension from frameCount multiplied by a small constant. The third dimension makes the field evolve slowly so the drift changes character over time rather than holding a fixed direction.

Each particle has a unique noiseOff value assigned at spawn. This offsets its position in the noise field so no two particles ever follow the same trajectory, even if they start from the same point. Without this, all particles drift in the same direction at the same time, which looks mechanical rather than alive.

// 3D noise: x/y position + time + unique per-particle offset.
// noiseOff ensures no two particles share the same noise trajectory.
let na = noise(
  this.pos.x * 0.006,
  this.pos.y * 0.006,
  frameCount * 0.003 + this.noiseOff
) * TWO_PI * 2;

let dr = p5.Vector.fromAngle(na);
dr.setMag(0.11 * sm);
this.applyForce(dr);

Warmup and Speed Ramp

A warmup system was added to solve a practical problem: the particles move quickly at full speed, making it difficult to capture clean screenshots for the three required export images. When a mode starts (or resets), modeFrame is set to zero. Each draw call, speedMult is computed by mapping modeFrame from the range 0 to 600 (about ten seconds at 60fps) to the range 0.18 to 1.0. This multiplier is applied to every dynamic value in all three particle classes — travel speed, noise magnitude, oscillation frequency, and the velocity cap. The system starts at 18% of full energy and smoothly accelerates to full speed over ten seconds.

During warmup, the HUD shows a pulsing “BUILDING — press S to save now” message so the optimal screenshot window is always clearly signposted. Pressing R resets the warmup ramp at any time.

// In draw() — ramps from WARMUP_MIN (0.18) to 1.0
// over WARMUP_FRAMES (600) frames, then holds at full speed
modeFrame++;
let speedMult = map(modeFrame, 0, WARMUP_FRAMES, WARMUP_MIN, 1.0);
speedMult = constrain(speedMult, WARMUP_MIN, 1.0);
// Inside every particle's update(sm) — sm scales all motion
this.vel.limit(3.2 * sm);

Geometry: Precomputed Look-Up Tables

Each symbol’s outline is defined as a series of cubic Bézier curves, sampled once at startup into a flat array of p5.Vector points called a look-up table (LUT). Sankofa has five segments (two heart lobes, two inner V-spirals, two bottom outward spirals) sampled into 800 points. Gye Nyame has seven segments (four knob scrolls, two arm segments each built from three chained cubics) also sampled into 800 points.

Rather than recomputing Bézier geometry inside the draw loop every frame, particles simply index into these arrays. This is what makes the system performant enough to run 1,100 particles per mode at 60fps. The LUTs are rebuilt whenever the canvas size changes (on R or mode switch) so the geometry always scales correctly to the canvas dimensions.

// Evaluate a cubic Bézier at t, push result into arr (canvas coords)
function sampleCubic(arr, ax, ay, bx, by, cx_, cy_, dx, dy, n) {
  for (let i = 0; i <= n; i++) {
    let t  = i / n;
    let m  = 1 - t;
    let x  = m*m*m*ax + 3*m*m*t*bx + 3*m*t*t*cx_ + t*t*t*dx;
    let y  = m*m*m*ay + 3*m*m*t*by + 3*m*t*t*cy_ + t*t*t*dy;
    arr.push(createVector(cx() + x, cy() + y));
  }
}
// Example: building the Sankofa LUT at startup
// Each call samples one Bézier segment into sankofaLUT.
// All five segments (heart lobes, inner spirals, bottom spirals)
// are sampled once and never recomputed during the draw loop.
function buildSankofaLUT() {
  sankofaLUT = [];
  let k = K();             // scale factor = min(w,h) * 0.0031
  let nH = floor(LUT_SIZE * 0.30);  // points per lobe segment
  let nS = floor(LUT_SIZE * 0.08);  // points per spiral arc

  // Left heart lobe: bottom point → V at top center
  sampleCubic(sankofaLUT,
    0, 95*k,  -40*k, 70*k,  -82*k, 28*k,  -82*k, -20*k,  nH);

  // Left inner spiral: from V, curls down-right
  sampleCubic(sankofaLUT,
    0, -28*k,  12*k, -42*k,  36*k, -40*k,  38*k, -22*k,  nS);

  // ... (right lobe, right spiral, bottom spirals follow same pattern)
}
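
Consuming the table each frame is what the travelT motion system (discussed in the reflection below) does. A minimal sketch, assuming each particle stores a floating-point travelT index and a dir of +1 or -1:

// Inside a particle's update(sm): walk the LUT instead of evaluating curves.
this.travelT += this.dir * 0.8 * sm;             // advance along the outline
let i = ((floor(this.travelT) % sankofaLUT.length)
         + sankofaLUT.length) % sankofaLUT.length;  // wrap, handles negatives
let target = sankofaLUT[i];                      // O(1) lookup, no Bézier math
let seek = p5.Vector.sub(target, this.pos);
seek.setMag(0.25 * sm);                          // gentle pull toward the outline
this.applyForce(seek);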

The screenshots below were captured at successive stages of development. They show how the system evolved from a basic noise-driven particle experiment with no symbol geometry, to a single-symbol prototype, to the full multi-mode system with all three Adinkra symbols, their individual color palettes, and the orbital travelT motion system in place. Each stage informed the decisions that shaped the final version.

 

The Three Outputs

The three exported images below were captured during the warmup phase of each mode — when the particles are moving slowly enough for the symbol to read clearly, but with enough energy that the trails and motion feel alive rather than static. Each image was saved using the S key at the moment the composition felt most balanced.

Output 1 — Sankofa. Red and gold particles tracing the abstract heart body with the V-extension inner spirals nestled at the top and the two outward bottom spirals flanking the pointed tip.

Output 2 — Gye Nyame. Gold and green particles tracing the alternating C-knob spine and the two diagonal fishhook arms — upper-left hooking down, lower-right hooking up.

Output 3 — Adinkrahene. Three concentric rings in red (inner), gold (middle), and green (outer), mirroring the Ghanaian flag’s stripes. A radiating streak is visible crossing all three rings outward from the center.

 

Output 4 — Composite Image.

Sketch

Video Documentation

The video below demonstrates all four modes of the system in sequence. Modes are switched live using the keyboard.

 

Reflection

What Changed From the Progress Version

The progress version worked mechanically but had no conceptual anchor. Four modes of particle behavior — functional, but with nothing to say.

Rebuilding around Adinkra symbols gave the project a reason to exist. These symbols are not decorations. They are compressed philosophy from my own culture. Making them the invisible architecture of a particle system felt like engaging with that tradition rather than just referencing it.

What Worked

The invisible symbol approach is more legible than expected. After twenty to thirty seconds, the shape reads clearly from particle density and trail patterns alone — no drawn outline needed.

The flag color assignment has real logic behind it. Adinkrahene’s rings being red, gold, and green — mirroring the flag’s stripes radiating outward — is not arbitrary, which makes it easy to write about honestly.

The travelT motion system was the most important technical decision. Before it, particles jittered statically near the outline. After it, they flow continuously along the symbol in both directions. That change made the whole system feel alive.

The warmup ramp solved the screenshot problem cleanly. The first ten seconds of each mode are naturally the best window — no extra configuration, just press S.

What Was Hard

Getting the symbol geometry right took the longest. Both Sankofa and Gye Nyame are complex shapes that resist clean Bézier approximation. Several versions were discarded. The hardest part was not the math — it was building enough visual understanding of each symbol to know when the approximation was close enough.

Gye Nyame required understanding that its two diagonal arms are a 180-degree rotation of each other, not a mirror reflection. That asymmetry — the chirality — had to be correct in the coordinates before the symbol read as itself.

The closest-point lookup was a performance problem. Running Bézier math per particle per frame at 1,100 particles tanks the frame rate. The precomputed LUT — sampling the outline once at startup, doing a flat array search every frame — fixed it.
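
A minimal sketch of that flat-array search, assuming a lut array of p5.Vector points; the scan is linear but cheap because it compares squared distances instead of evaluating curves:

// Return the index of the LUT point nearest to pos.
function closestLUTIndex(lut, pos) {
  let best = 0;
  let bestD = Infinity;
  for (let i = 0; i < lut.length; i++) {
    let dx = lut[i].x - pos.x;
    let dy = lut[i].y - pos.y;
    let d = dx * dx + dy * dy;      // squared distance, sqrt not needed
    if (d < bestD) { bestD = d; best = i; }
  }
  return best;
}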

Plans for Future Improvement

Mouse interaction — clicking creates a temporary repulsion force, particles push away then return. Sankofa’s meaning becomes physically interactive.

Audio reactivity — microphone amplitude mapped to speedMult so the symbols respond to sound. The global speed multiplier is already in place; connecting an audio input would be a small change (a sketch follows at the end of this list).

More symbols — Dwennimmen (strength and humility) and Funtunfunefu (democracy) are both geometrically interesting and personally meaningful. New LUT geometry, same motion system.

Mode transitions — a dissolve instead of a hard cut to black. Old particles fade out while new ones spawn in, suggesting the symbols share the same world.
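
Of these, audio reactivity is closest to working code. A minimal sketch, assuming the p5.sound library is loaded and that its getLevel() reading feeds the existing multiplier; the 0.3 loudness ceiling is a guess to be tuned by ear:

let mic;

function setup() {
  createCanvas(800, 800);
  mic = new p5.AudioIn();
  mic.start();  // browsers require a user gesture before the mic opens
}

function draw() {
  let level = mic.getLevel();                           // amplitude, 0 to 1
  let speedMult = map(level, 0, 0.3, 0.18, 1.0, true);  // clamped: louder = faster
  // ... pass speedMult into every particle's update(sm) as before
}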

References

Adinkra — Cultural Sources

Rattray, R. S. (1927). Religion and Art in Ashanti. Oxford: Clarendon Press.

Willis, W. B. (1998). The Adinkra Dictionary. The Pyramid Complex.

Adinkra Symbols of West Africa. adinkrasymbols.org

Eglash, R., Bennett, A., Lachney, M., & Bulley, E. Adinkra Spirals. csdt.org/culture/adinkra/spirals.html — geometric analysis of logarithmic spirals in Adinkra symbols.

Technical Resources

Shiffman, D. (2024). The Nature of Code, 2nd Edition. natureofcode.com

p5.js Reference Documentation. p5js.org/reference

The Coding Train — Daniel Shiffman. Introduction Videos I.2–I.4. thecodingtrain.com

Visual Inspirations

Ghanaian Kente cloth — the red, gold, green, and black palette comes directly from Kente patterns and the national flag.

AI Disclosure

AI tools (Claude, Anthropic) were used for debugging geometry and performance issues, identifying the LUT optimization, reviewing force application logic, and assisting with drafting this documentation. All creative decisions and the final code are my own.

Buernortey – Assignment 7

Video of Inspiration

Why I Chose This Visual

At teamLab, visitors could pick up a pencil drawing of a butterfly, flower, or lizard, color it in, and slide it under a scanner. Seconds later, their drawing appeared on the floor, glowing, animated, and moving freely through all the other visitors’ creations.

I chose a butterfly pencil drawing and colored it yellow. Watching that specific butterfly appear on the floor and drift between everyone else’s drawings was unlike anything else in the installation. Every other room at teamLab was something you walked through. This one was something you contributed to. The floor felt like a collective painting that no single person made, a shared canvas where hundreds of people’s choices all coexisted at once. That feeling is what I wanted to recreate in code.

The Sketch

The sketch shows a yellow butterfly entering from the left edge of a glowing, color-shifting floor, drifting organically through a crowd of colored creatures: flowers, fish, lizards, and swirling light forms, all wandering autonomously in every direction.

My creative twist: In the real installation, the floor was one continuous shared projection and you had no control over where your drawing went. In my version, I gave each creature a fully hand-coded personality — fish have tails, dorsal fins, and an eye with a pupil; lizards have four legs, a wagging tail, and a snout; flowers rotate their petals slowly as they drift; swirls pulse with orbiting circles. Each type is drawn entirely with p5’s shape functions — no images. The background also constantly shifts between deep blue, purple, magenta, and teal gradients, with large soft blobs of colored light drifting across the floor to simulate the ambient projected pools of color that filled the room at teamLab.

Code I’m Proud Of

The two pieces of code I’m most proud of are the wander steering system and the bezier butterfly wings.

Every creature, including the butterfly, uses wander steering. Instead of moving in straight lines or bouncing off walls, each creature accumulates tiny random velocity nudges every frame. This produces natural, unpredictable paths that feel alive rather than mechanical:

// Wander: nudge direction slightly each frame
this.vx += random(-0.04, 0.04);
this.vy += random(-0.03, 0.03);

// Soft speed cap: damp, rather than clamp, to keep drift gentle
if (abs(this.vx) > this.spd)       this.vx *= 0.97;
if (abs(this.vy) > this.spd * 0.5) this.vy *= 0.97;

// Soft vertical boundaries — no hard bouncing
if (this.y < height * 0.32) this.vy += 0.05;
if (this.y > height * 0.97) this.vy -= 0.05;

The butterfly wings use bezierVertex(): two chained cubic segments per wing half, mirrored on both sides, with a sin() oscillation scaling the wing width to simulate flapping:

// Upper wing — bezier shape
fill(yw);
beginShape();
vertex(0, -s * 0.10);
bezierVertex(s*0.22, -s*0.85, s*0.95, -s*0.72, s*0.82, -s*0.08);
bezierVertex(s*0.48,  s*0.12, s*0.10,  s*0.04, 0,      -s*0.10);
endShape(CLOSE);

// Flap: scales wing width using sin() — makes wings open and close
scale(side * (1 + flap * 0.28), 1);

 

Milestones and Challenges

Drawing every creature in pure code: The first decision was to use no images at all. Every flower petal, fish tail, lizard leg, and butterfly wing is drawn with p5’s shape functions. This took the most time but felt true to the spirit of the installation: simple outlines brought to life by color and motion.

Getting the butterfly wings right: The bezier control points for the wings required a lot of manual tuning. The upper and lower wings have different shapes and different amber tones, and the mirroring had to be handled carefully using scale(-1, 1) inside a push()/pop() block so the two sides stayed symmetrical.
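
The mirroring pattern itself is compact. A minimal sketch, assuming a drawWingHalf() helper that wraps the bezierVertex() calls shown earlier:

// Draw both halves from one set of control points: side = +1 is the
// right half, side = -1 mirrors it left; flap is the sin() oscillation.
for (let side of [1, -1]) {
  push();
  scale(side * (1 + flap * 0.28), 1);  // mirror and flap on the x-axis
  drawWingHalf();                      // assumed helper: one bezier wing half
  pop();
}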

The shifting background: The original dark background made everything look dim. The solution was redrawing a full-height gradient every frame, blending between three RGB color stops that slowly transition through a series of blue, purple, and magenta palettes. Five large drifting light blobs were added on top to simulate the ambient projected pools of color from the real installation.
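
A minimal sketch of that gradient, assuming three color stops c1, c2, c3 that are themselves lerped between palettes each frame:

// Full-height vertical gradient: the top half blends c1 to c2,
// the bottom half blends c2 to c3, one scanline at a time.
function drawGradient(c1, c2, c3) {
  for (let y = 0; y < height; y++) {
    let t = y / height;
    let c = (t < 0.5)
      ? lerpColor(c1, c2, t * 2)
      : lerpColor(c2, c3, (t - 0.5) * 2);
    stroke(c);
    line(0, y, width, y);
  }
}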

Challenge: perspective on a 2D canvas. The real teamLab floor had true projection-mapped depth — creatures far away appeared smaller and more faded. In 2D p5.js this had to be faked with a vanishing-point grid: lines converge on a horizon point, horizontal lines are spaced using a power curve, and a warm glow rises from the bottom of the frame. It reads as a floor but is not true 3D.
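
A minimal sketch of that fake, assuming the horizon sits at height * 0.32; the power-curve exponent is a guess to be tuned:

// Horizontal lines packed toward the horizon by a power curve,
// plus lines converging on a single vanishing point.
function drawFloorGrid() {
  let horizonY = height * 0.32;
  stroke(255, 30);
  for (let i = 0; i <= 12; i++) {
    let t = pow(i / 12, 2.2);  // lines bunch together near the horizon
    let y = lerp(horizonY, height, t);
    line(0, y, width, y);
  }
  for (let i = 0; i <= 10; i++) {
    let x = lerp(-width * 0.5, width * 1.5, i / 10);
    line(width / 2, horizonY, x, height);
  }
}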

Challenge: When all the creatures were first added at similar speeds, the result looked like a screensaver. The fix was differentiating speed ranges per type: swirls drift slowly, fish move quicker, and the yellow butterfly moves faster and more directionally than everything else. That hierarchy gives the butterfly a sense of purpose and navigation rather than just floating.
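
A minimal sketch of that hierarchy; the ranges are assumptions, but the fix amounts to a per-type lookup at spawn:

// Per-type speed ranges (pixels per frame). The butterfly's range
// sits well above the rest so it reads as purposeful, not drifting.
const SPEED = {
  swirl:     [0.2, 0.5],
  flower:    [0.3, 0.7],
  lizard:    [0.5, 1.0],
  fish:      [0.6, 1.2],
  butterfly: [1.4, 2.0],
};
// At spawn:
this.spd = random(SPEED[this.type][0], SPEED[this.type][1]);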

Reflection and Ideas for Future Work

The most surprising part of building this was how much of the experience at teamLab came from pacing rather than visuals. The gentleness (nothing crashing, nothing disappearing abruptly, everything drifting) was harder to code than any of the shapes. Getting the wander steering to feel calm required many small adjustments to speed caps and boundary forces.

What is still missing most is convincing depth. The real floor had a spatial quality where distance was clearly readable. My version is flat, and that flatness makes it feel more like a simulation than an environment.

Ideas for future versions:

  • Use p5’s WEBGL mode so creatures scale smaller as they move toward the horizon, matching real perspective depth
  • Add a coloring step and let the user pick a color for their butterfly before it enters the floor
  • Implement Boids flocking so similar creatures occasionally cluster and drift together, which happened naturally at teamLab
  • Add ambient sound, low electronic tones and soft wing-flutter audio, to complete the immersion

Buernortey – Midterm Progress

Midterm Project Overview

This project expands on my Assignment 3 project, where particles navigated a maze using goal attraction, wall repulsion, and turbulence. The midterm version adds multiple modes to explore different particle behaviors: refined maze navigation, free-flow turbulence, oscillating attractors, and dual attractors. The aim is to create diverse visual outputs and experiment with particle interactions, motion patterns, and color dynamics.

Implementation Details

The system now has a mode-based structure, allowing easy switching between behaviors using key presses (1–4). Each mode has its own settings for particle count, trail transparency, force strengths, and colors. Particles have variable sizes and colors, with trails rendered dynamically. Goals can be static, oscillating, or dual, depending on the mode.
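
A minimal sketch of that structure; the mode names and settings values are illustrative, not the actual numbers:

// One settings object per mode; draw() reads everything from MODES[currentMode].
const MODES = [
  { name: 'maze',        count: 900,  trailAlpha: 18, turb: 0.03 },
  { name: 'turbulence',  count: 1200, trailAlpha: 10, turb: 0.09 },
  { name: 'oscillating', count: 800,  trailAlpha: 14, turb: 0.04 },
  { name: 'dual',        count: 1000, trailAlpha: 14, turb: 0.05 },
];
let currentMode = 0;

function keyPressed() {
  let n = int(key);            // keys '1' through '4' select a mode
  if (n >= 1 && n <= 4) {
    currentMode = n - 1;
    resetParticles();          // assumed helper: respawn for the new mode
  }
}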

Currently, walls are only implemented in Mode 1, the refined maze navigation mode. This is intentional for the progress version because Modes 2–4 are focused on exploring other behaviors, such as turbulence fields and moving attractors, without the influence of walls. Walls will be added to all modes in the final version to enhance particle interactions and visual complexity.

I kept the previous code because the core particle and force mechanics are solid; the current version builds on that foundation while adding more modes, dynamic goals, color variations, and adjustable parameters.

Key code highlights:

  • Mode system for switching between particle behaviors.

  • Particle class with forces: goal attraction, wall repulsion, and turbulence (see the sketch after this list).

  • Dynamic goal movement in oscillating and dual-attractor modes.

  • Adjustable parameters for particle appearance, motion, and trail transparency.

  • Walls implemented in Mode 1, with plans to expand to all modes in the final version.
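
A minimal sketch of the force combination named in the second highlight above; the magnitudes and the wallRepulsion() helper are assumptions:

// Per-frame particle update: accumulate the three forces, integrate, clear.
update() {
  let toGoal = p5.Vector.sub(goal, this.pos).setMag(0.08);  // goal attraction
  let away   = this.wallRepulsion();                        // assumed helper
  let turb   = p5.Vector.fromAngle(
    noise(this.pos.x * 0.01, this.pos.y * 0.01) * TWO_PI * 2
  ).setMag(0.05);                                           // noise turbulence
  this.acc.add(toGoal).add(away).add(turb);
  this.vel.add(this.acc).limit(this.maxSpeed);
  this.pos.add(this.vel);
  this.acc.mult(0);
}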

Progress

Base Code (Assignment 3):


Current state:

Modes are switched with the number keys 1–4.

Reflection

The system is modular and flexible, making it easy to tweak parameters and add new behaviors. Next steps include creating more visually distinct modes, experimenting with more complex attractors or obstacles, and improving color and trail effects to produce final high-resolution outputs suitable for A3 prints.

References