Assignment 8

Concept

For this assignment, I wanted to explore Autonomous Agents. I took inspiration from Craig Reynolds’ Steering Behaviors, specifically the idea that a vehicle can “perceive” its environment and make its own decisions.

My goal was to create a Living City. I designed a system of commuters (ordinary vehicles) that dutifully follow a circular traffic path, and a Police car (the interceptor) controlled by the mouse. The commuters get out of the police car’s way as it approaches. The project explores the tension between order (Path Following) and chaos (Fleeing from danger).

Sketch

Process and Milestones

I started by building a multi-segment path. The biggest challenge here was the logic required to make the vehicles drive in a continuous loop. I used the Modulo operator (%) in my segment loop so that when a vehicle reaches the final point of the rectangle, its “next target” automatically resets to the first point.
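The wrap-around can be sketched in plain JavaScript; the `points` array and function name below are illustrative stand-ins, not the sketch's actual variables:

```javascript
// Illustrative sketch of the loop-closing logic: `points` stands in for the
// rectangle's corner list, and % wraps the "next target" index back to 0.
const points = ["topLeft", "topRight", "bottomRight", "bottomLeft"];

function nextTargetIndex(current) {
  return (current + 1) % points.length;
}

console.log(nextTargetIndex(2)); // 3
console.log(nextTargetIndex(3)); // 0 — back to the first corner
```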

At first, my vehicles were “shaking” as they tried to stay on the path. I realized they were reacting to their current position, which is always slightly off the line. I implemented Future Perception: the vehicle now calculates a “predict” vector 25 pixels ahead of its current position. By steering based on where it will be, the movement became much smoother and more life-like.

The most interesting part of the process was coding the transition between behaviors. I wrote a conditional check: if the distance to the Interceptor (mouse) is less than 120 pixels, the Commuter completely abandons its followPath logic and switches to flee. I also increased their maxSpeed during this state to simulate “panic.” Once the danger passes, they naturally drift back toward the road and resume their commute.
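The switch reduces to a small pure function. In this hedged sketch, `PANIC_RADIUS` matches the 120-pixel threshold described above, but `NORMAL_SPEED` and `PANIC_SPEED` are invented stand-ins for whatever the sketch actually uses:

```javascript
// Hedged sketch of the panic switch; only the 120 px radius comes from the
// write-up, the speed constants are illustrative.
const PANIC_RADIUS = 120;
const NORMAL_SPEED = 3;
const PANIC_SPEED = 6;

function chooseBehavior(distToInterceptor) {
  if (distToInterceptor < PANIC_RADIUS) {
    // abandon followPath and run, faster than usual
    return { behavior: "flee", maxSpeed: PANIC_SPEED };
  }
  // danger has passed: drift back toward the road at normal speed
  return { behavior: "followPath", maxSpeed: NORMAL_SPEED };
}

console.log(chooseBehavior(80).behavior);  // "flee"
console.log(chooseBehavior(300).behavior); // "followPath"
```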

Code I’m Proud Of

I am particularly proud of the logic that allows the vehicle to choose the correct path segment. It doesn’t just look at one line; it scans every segment of the “city” to find the closest one, then projects a target slightly ahead of its “normal point” to ensure it keeps moving forward.

// Predictive Path Following logic
followPath(path) {
  // look into the future
  let predict = this.vel.copy().setMag(25);
  let futurePos = p5.Vector.add(this.pos, predict);

  let target = null;
  let worldRecord = 1000000;

  // scan all road segments for the closest point
  for (let i = 0; i < path.points.length; i++) {
    let a = path.points[i];
    let b = path.points[(i + 1) % path.points.length];
    let normalPoint = getNormalPoint(futurePos, a, b);

    // ... (boundary checks) ...

    let distance = p5.Vector.dist(futurePos, normalPoint);
    if (distance < worldRecord) {
      worldRecord = distance;
      // look 15 pixels ahead on the segment to stay in motion
      let dir = p5.Vector.sub(b, a).setMag(15);
      target = p5.Vector.add(normalPoint, dir);
    }
  }

  // steer only if we've drifted outside the lane
  if (worldRecord > path.radius) {
    this.seek(target);
  }
}

Reflection

This project shifted my perspective on coding movement. In previous assignments, we moved objects by changing their position; here, we’re moving them by changing their desire. It feels much more like biological programming than math.

I also noticed that, instead of commuters giving way to a police car, the result looks more like race cars on a track fleeing from the police as it approaches. I’ve left the final interpretation up to the reader’s imagination…

Future Ideas

  • I want to add a separation force so that commuters don’t overlap with each other, creating more realistic traffic jams.
  • Allowing the user to click and drag the path points in real-time, watching the agents struggle to adapt to the new road layout.
  • Integrating the p5.sound library to make the interceptor’s siren get louder as it gets closer to the vehicles (Doppler effect).

Midterm Project

Project Overview

This project explores a single question: can a particle system feel less like simulation and more like atmosphere?

I designed a generative system that behaves like a living cloud of cosmic dust. Instead of producing one fixed composition, the sketch continuously evolves and can be steered into different “cosmic events” through three operating modes:

  1. NURSERY — diffuse, drifting birth-fields of matter
  2. SINGULARITY — gravitational collapse toward a center
  3. SUPERNOVA — violent outward blast and shock-like motion

The visual goal was to create images that feel photographic and painterly at the same time: soft glows, suspended particles, and dynamic trails that imply depth.

Core Concept

The concept is inspired by astronomical imagery, but translated into a procedural language:

  • Birth (diffusion)
  • Attraction (collapse)
  • Explosion (release)

These are treated not only as physical states but as aesthetic states. The same particle population is reinterpreted through changing force fields and trail accumulation over time.

I was particularly interested in the tension between control (known forces, reproducible logic), and emergence (unexpected compositions and visual accidents).

This is what makes the output truly generative: the system is authored, but each frame remains open-ended.

What Changed from the Starting Sketch

The starting version established the core particle loop and the three state controls.

However, it was still a prototype:

  • Only a single rendering style (point strokes)
  • Low-resolution/off-target export buffer
  • Background tint drift issues over time
  • No dedicated print workflow
  • No color interaction controls

The sketch went through a series of transformations and visual changes as I experimented with different styles and debugged:

Here, I played with the visual style of the dust and also tried to make the particles move uniformly across the sketch in Nursery mode using grid coordinates, allowing only a few particles per cell. As you can see, this led to some unexpected behavior: the particles repeatedly jumped between grid cells and appeared to cling to the cell edges, exposing the grid pattern. I scrapped this idea altogether.

This is where I started to get the particles’ look right. I added halos around the particles for visual appeal. There was still, however, the issue of the trails permanently discoloring the background canvas, and so I looked into fixing that. The end result after some tweaks is what you see in the final sketch.

I did experiment a bit further, tweaking parameters to see if I could stumble upon something interesting. Here is an example of a variant I made this way:

Even though this looked striking, it wasn’t true to the vision of dust I set out with, so I settled with the final look.

The final midterm version is substantially expanded from the initial sketch:

A) Rendering pipeline upgrade
– Introduced layered rendering via `trailLayer` (screen) and `trailLayerHR` (high-res memory).
– Trails are preserved and faded over time independently of the base background.
– This allows longer atmospheric strokes without permanently destroying background consistency, as was happening before in the first iteration.

B) Visual quality upgrade
– Multi-pass glow rendering per particle (halo + mid + core).
– More controlled velocity damping and motion smoothing.
– Larger and more visible dust behavior for stronger material presence.

C) Interaction + composition upgrade
– Added a Color Shift slider (`0–360`) to rotate the palette across HSB space.
– Default keeps the original blue/purple range; slider supports warm/yellow/red variants for alternate print moods.

D) Output and print-readiness upgrade
– Export target standardized at 4800 × 3600.
– Save trigger implemented on S key.
– Timestamped filenames generated for iteration tracking.
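A timestamped-filename helper along these lines would cover the iteration-tracking step; this is a plain-JavaScript illustration of the idea, not necessarily the sketch's exact format:

```javascript
// Illustrative timestamped-filename generator; the real sketch may format
// the stamp differently, but the goal is the same: unique names per export.
function timestampedName(prefix) {
  const d = new Date();
  const pad = (n) => String(n).padStart(2, "0");
  const stamp =
    `${d.getFullYear()}${pad(d.getMonth() + 1)}${pad(d.getDate())}` +
    `_${pad(d.getHours())}${pad(d.getMinutes())}${pad(d.getSeconds())}`;
  return `${prefix}_${stamp}.png`; // e.g. "nebula_20240101_120000.png"
}
```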

Technical Implementation

4.1 Particle engine

The sketch uses a classic particle architecture:

  • `Dust` class with:
      • `pos`, `vel`, `acc`
      • `lifespan`
      • per-particle hue and size attributes
      • `applyForce()` for composable behavior
  • state-based force fields applied every frame

This combines core programming techniques from class:

1. Particle systems (object lifecycle + continual spawn/death)
2. Vector-based force simulation (`p5.Vector`, gravity/repulsion/drag)

It also uses:

– oscillation (`sin`) for shimmer dynamics,
– state machine logic for mode switching,
– off-screen rendering buffers for high-resolution output.

4.2 State logic

The state variable controls force behavior:

– NURSERY: noise-like drifting flow
– SINGULARITY: attraction toward cursor
– SUPERNOVA: repulsion blast from cursor zone

This keeps one core system while producing multiple visual outcomes from the same underlying model.
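The per-frame dispatch can be illustrated without p5 at all. In this toy version, the force gains are arbitrary and `Math.sin`/`Math.cos` stand in for the noise field:

```javascript
// Toy version of the state-based force dispatch. The gains (0.01, 0.05, 0.1)
// are illustrative, and the NURSERY branch fakes noise with trig functions.
function stateForce(state, particle, attractor) {
  const dx = attractor.x - particle.x;
  const dy = attractor.y - particle.y;
  switch (state) {
    case "SINGULARITY": // pull toward the attractor (the cursor)
      return { x: dx * 0.01, y: dy * 0.01 };
    case "SUPERNOVA":   // push away from the attractor
      return { x: -dx * 0.05, y: -dy * 0.05 };
    default:            // NURSERY: gentle drifting flow
      return {
        x: Math.sin(particle.y * 0.01) * 0.1,
        y: Math.cos(particle.x * 0.01) * 0.1,
      };
  }
}

const p = { x: 0, y: 0 };
console.log(stateForce("SINGULARITY", p, { x: 100, y: 0 }).x > 0); // true
console.log(stateForce("SUPERNOVA", p, { x: 100, y: 0 }).x < 0);   // true
```

Because the same particle population is pushed through one function, switching `state` instantly reinterprets the whole system without resetting anyone's momentum.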

4.3 Trail architecture and background consistency

A key challenge was preserving a stable background while retaining long-lived trails.

Solution:

– render base color to main canvas each frame,
– draw particles to transparent trail layers,
– fade trails by reducing alpha over time,
– composite trail layer over base.

This decouples “world background” from “particle residue,” preventing muddy drift and preserving consistent color identity.
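The effect of the fade step can be modeled with a single number standing in for the trail buffer's accumulated residue; under a multiplicative decay (0.9 per frame here, an invented value), residue converges to a fixed ceiling instead of building up forever:

```javascript
// Toy model of the decoupling: the base is repainted every frame (so it can
// never drift), while trail residue decays geometrically toward a ceiling.
function stepFrame(world) {
  world.base = [10, 10, 30];        // base color reset: identical every frame
  world.trail = world.trail * 0.9;  // fade existing residue
  world.trail += 1;                 // new particle marks land on the buffer
  return world;
}

let world = { base: null, trail: 0 };
for (let i = 0; i < 200; i++) world = stepFrame(world);
// Residue settles at 1 / (1 - 0.9) = 10 instead of growing without bound.
console.log(world.trail.toFixed(2)); // "10.00"
```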

Final Outputs

A major shift in mindset was treating the sketch not just as a visual toy, but as a capture instrument for composition.

I began composing moments intentionally:

– waiting for dense flow regions,
– steering with cursor in singularity/supernova moments,
– selecting color shift and temporal build-up before capture.

This made each final export less like a screenshot and more like a harvested frame from a living system.

I particularly like the third one, which reminds me of the film Arrival (2016), where extraterrestrials use circular logograms like the one depicted in the third output to communicate visually. Try to see if you can achieve this in the sketch!

Video Documentation

Reflection

This project taught me that strong generative outcomes come from balancing three layers:

  1. Physics logic (how particles move)
  2. Render logic (how particles appear)
  3. Capture logic (how moments are preserved).

The starting sketch had the first layer; the final midterm became successful only after all three were integrated.

What worked best:

  • Robust mode switching with meaningful visual differences
  • Stable background + persistent trails
  • High-resolution pipeline aligned with print requirements.

What I would improve next:

  • Richer mode-specific rendering signatures (even stronger distinction per mode)
  • Static star-dust depth field layer

References / Inspirations

  • p5.js Reference — https://p5js.org/reference/
  • p5.js `createGraphics()` documentation
  • Astrophotography color references (nebula palettes)
  • Class lecture notes on particle systems, vectors, and oscillation

AI Disclosure

AI assistance was used during development for debugging rendering behavior, refining the export strategy, and structuring documentation. All concept direction, aesthetic decisions, interaction design, and the final selection of outputs were authored and curated by me.

Assignment 7

Inspiration and Concept

I chose to recreate the butterfly installation because of the profound way it handles the cycle of life. In the exhibition, the butterflies seem to move slowly, almost suspended in the moment, but before you know it, they are gone. It is a reminder of how fleeting time is.

Interestingly, the butterflies in the exhibit are affected by human interaction. When approached, they seemed to diffuse, and when the space they were projected onto was touched, they would fall, as if we’d killed them. I incorporated this interaction into my sketch as well.

My twist on the original visual is that the end of one life becomes the catalyst for another. When a butterfly “dies” (is clicked), instead of just disappearing, it falls to the earth and seeds a new, different, but equally beautiful life: a flower. To me, this represents the idea that contentment comes from accepting the changing nature of things. Even when a specific chapter ends, it provides the nutrients for something new to bloom.

Sketch

Butterflies emerge from the forest floor and drift upward.

Touch: Click a butterfly to end its flight. Watch it fall and reincarnate as a swaying flower (you might need to scroll to see the ground in this blogpost).

Interact: Move your mouse near the butterflies or flowers to see them glow brighter. The butterflies will gently shy away from your cursor.

Milestones & Challenges

The first goal was to get the butterflies moving from the bottom to the top. I used Perlin Noise to give them a natural, fluttery motion. However, I immediately hit a snag: the butterflies started accelerating aggressively toward the left or right edges of the canvas instead of staying on an upward path.

The Fix: I implemented a velocity limit and a constant “lift” force. This kept their speed under control while ensuring their primary direction remained vertical.
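The fix boils down to two vector operations per frame. This standalone sketch re-implements p5's `limit()` so it runs on its own; the lift and speed values are illustrative, not the sketch's actual constants:

```javascript
// Constant upward lift plus a speed cap, mirroring p5's vel.limit() idiom.
function applyLiftAndLimit(vel, lift = 0.05, maxSpeed = 2) {
  vel.y -= lift;                          // constant lift (p5's y axis points down)
  const speed = Math.hypot(vel.x, vel.y);
  if (speed > maxSpeed) {                 // cap the speed, like vel.limit(maxSpeed)
    vel.x = (vel.x / speed) * maxSpeed;
    vel.y = (vel.y / speed) * maxSpeed;
  }
  return vel;
}

const v = applyLiftAndLimit({ x: 5, y: 1 });
console.log(Math.hypot(v.x, v.y).toFixed(2)); // "2.00" — speed is capped
```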

Next, I had to handle the transition from flying to falling. This required a State Machine within the Butterfly class. I added a state variable (FLYING or FALLING). When the mouse is clicked near a butterfly, its state flips, gravity is applied, and the “wing flap” oscillation slows down to simulate a loss of vitality.

The final stage was the “twist.” I created a Flower class that triggers at the exact x-coordinate where a butterfly hits the ground. I also added Sensory Logic:

  • Fleeing: Butterflies now calculate a repulsion vector from the mouse.

  • Proximity Glow: Using the dist() function, both butterflies and flowers “sense” the cursor and increase their transparency mapping (alpha) to glow brighter as you get closer.
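The glow mapping is essentially `dist()` fed into `map()`. The standalone version below re-implements both p5 helpers so the snippet runs on its own; the 150-pixel range and the alpha endpoints are illustrative values:

```javascript
// Standalone versions of the dist()/map() pair behind the proximity glow.
// p5 provides both natively; they're re-implemented here for self-containment.
const dist = (x1, y1, x2, y2) => Math.hypot(x2 - x1, y2 - y1);
const map = (v, a, b, c, d) => c + ((v - a) / (b - a)) * (d - c);

function glowAlpha(mouseX, mouseY, x, y, range = 150) {
  const d = Math.min(dist(mouseX, mouseY, x, y), range);
  // closer cursor → higher alpha → brighter glow
  return map(d, 0, range, 255, 60);
}

console.log(glowAlpha(0, 0, 0, 0));   // 255 — cursor right on the object
console.log(glowAlpha(0, 0, 300, 0)); // 60 — cursor far away, dim glow
```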

Highlight

I am particularly proud of how the butterfly “seeds” the flower. To maintain the visual connection, I pass the specific hue of the butterfly to the new Flower object. This ensures that the life cycle feels continuous; the “soul” of the butterfly determines the color of the bloom.

// Inside the main draw loop
// Inside the main draw loop, iterating backwards so splice() is safe
for (let i = butterflies.length - 1; i >= 0; i--) {
  let b = butterflies[i];
  if (b.state === "FALLING" && b.pos.y > height - 25) {
    // We pass the butterfly's unique hue to the new flower
    flowers.push(new Flower(b.pos.x, b.hue));
    butterflies.splice(i, 1); // The butterfly is removed
  }
}

Reflection

This assignment taught me how powerful Additive Color Mixing (layering semi-transparent shapes) is for recreating the feel of a light projection. By using the HSB color space, I was able to match the neon, ethereal palette of the teamLab forest. Some potential improvements:

  • Life Span: I’d like to make the flowers eventually fade away as well, completing the cycle and allowing the ground to remain “clean” for new growth.

  • Collision Detection: It would be interesting if butterflies had to navigate around the stems of the flowers that the user has created.

  • Soundscape: Adding a soft, shimmering sound effect when a butterfly transforms into a flower would deepen the emotional impact of the interaction.

Assignment 5

Concept

While brainstorming for my midterm project, I thought of mimicking nature with my sketches, as I’ve done for a few of my sketches so far. I was intrigued by particle systems, which I felt I had actually unknowingly used in my ocean sketch. I wanted to work with particle systems again, pairing them with another concept we’ve covered in class. I debated between two ideas: a mycelium system or a cosmic dust cloud. I settled on the cosmic dust idea because I felt the prints or images coming out of it could be very painterly.

So the idea is a generative system that simulates the birth, motion, and destruction of cosmic dust. The goal is to create a painterly vacuum: a space that feels filled with fluid-like gas clouds. By combining Particle Systems with Newtonian Forces (Gravity/Drag) and Periodic Oscillation, I want to produce high-resolution prints that capture the shimmer of deep-space photography.

Sketch

Implementation

  • The Particle Engine: I’ve developed a Dust class that handles movement and lifespan. Each particle is aware of its age, allowing the system to constantly recycle itself and stay generative.

  • The Different States: I’ve established three distinct states: NURSERY (Noise-driven movement), SINGULARITY (Point-based gravity), and SUPERNOVA (Radial repulsion).

  • Oscillation: I integrated Sine waves into the strokeWeight of the particles to create a subtle “twinkling” or shimmering effect that mimics starlight.
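The twinkle is a one-line oscillation. This standalone stand-in shows its shape; the `speed` and `depth` parameters are invented for illustration:

```javascript
// Shimmer: stroke weight oscillates around a base size as frames advance.
// `speed` and `depth` are illustrative parameters, not the sketch's own.
function shimmerWeight(baseSize, frameCount, speed = 0.1, depth = 0.3) {
  return baseSize * (1 + depth * Math.sin(frameCount * speed));
}

console.log(shimmerWeight(2, 0)); // 2 — sin(0) is 0, so no deviation yet
```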

The Frightening Part & Risk Reduction 

One of the core requirements for the midterm is having multiple distinct operating modes. I handled this by implementing a State Machine – a logic structure that allows the entire physics engine of the sketch to change based on a single variable (state).

In my draw() loop, the particles check the current state of the “universe” every frame, instead of moving randomly. Using an if/else if structure tied to the keyPressed() function, I can switch between NURSERY, SINGULARITY, and SUPERNOVA instantaneously. This was a challenge at first because I had to ensure that the particles didn’t just break when the forces changed; by using p5.Vector and applyForce(), the transition between modes feels fluid, as the particles’ existing momentum carries over into the new force field.

Additionally, one of the most uncertain parts of this project is ensuring that the complex, high-particle-count visuals can be exported at A3 resolution without crashing the browser. If I simply used saveCanvas(), the print would be blurry and pixelated.

I settled on implementing a p5.Graphics buffer (canvasBuffer). This allows me to develop the logic on a small, fast 800 × 600 canvas while maintaining a hidden, high-resolution print-ready canvas in the background. My current save logic triggers on the ‘S’ key:

function saveHighRes() {
  canvasBuffer.background(240, 80, 5);
  // Future implementation:
  // for (let p of particles) { p.drawToBuffer(canvasBuffer); }
  canvasBuffer.save("nebula_output.png");
}

Instead of saving the current frame, I plan to loop over the particles one final time and re-render them onto this larger 1600 × 1131 canvas. This upscaling is a critical risk-reduction step that gives me peace of mind for the final physical print.

Another frightening part is thinking about how I’m going to make this sketch interesting enough to be a final midterm piece. Right now, it’s a skeleton. However, the basic physics – the drag (friction) that makes particles feel like they are moving through thick gas, and the Oscillation (shimmer) that makes them feel like stars – are already providing a strong foundation. I’m trusting the process: once the math is solid, the beauty usually follows in the polishing phase where I’ll play with color gradients and force magnitudes.

Assignment 4

Concept

For this assignment, I wanted to explore how Simple Harmonic Motion can be used to recreate organic environments. I took inspiration from Memo Akten, specifically his ability to take rigid mathematical oscillations and turn them into fluid, life-like movements.

My goal was to create a painterly ocean. Instead of perfectly smooth curves, I wanted broken lines that feel like the faceted surface of water at night. To tie it all together, I added a breathing moon and a star field, creating a cohesive, glowing atmosphere.

Sketch

Process

I started by creating a Wave class that uses the sine function. At first, the waves were just single lines. The challenge here was getting the “broken” look; I achieved this by using a large xStep in my loop, which created sharp angles between my vertices instead of a smooth curve.

Inspired by a sample sketch from class that used overlapping circles to create a glow, I decided to apply a similar logic to my waves. I implemented a nested loop that draws each wave 30 times per frame. Each layer has a slightly larger strokeWeight and a very low alpha value. This additive layering is what gives the water its ghost-like, ethereal quality.

A major technical hurdle was a gap on the right side of my canvas. Because my xStep didn’t always land perfectly on the edge of the screen (600px), the shape would “break” and draw a straight line back to the start.

I fixed this by manually adding a final vertex at exactly width before closing the shape, ensuring the glow stayed consistent across the whole viewport.
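The fix described above can be shown in a standalone stand-in; the sine frequency and step values here are invented, but the final pushed vertex is the point of the technique:

```javascript
// Coarse sine sampling gives the "broken" faceted look; the extra vertex at
// exactly `width` closes the gap when xStep doesn't divide the canvas width.
function waveVertices(width, xStep, amplitude) {
  const verts = [];
  for (let x = 0; x < width; x += xStep) {
    verts.push({ x, y: Math.sin(x * 0.05) * amplitude });
  }
  // manual final vertex: guarantees the shape reaches the right edge
  verts.push({ x: width, y: Math.sin(width * 0.05) * amplitude });
  return verts;
}

const vs = waveVertices(600, 37, 40);
console.log(vs[vs.length - 1].x); // 600 — always ends exactly at the edge
```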

Finally, I added a star field and a glowing moon. The moon uses the same SHM logic as the waves. Its outer glow pulses using a sin() function, making it look like it’s breathing or shining through a thin layer of haze.

Highlight

I am particularly proud of the Layered Glow Logic. It’s a simple loop, but it completely transforms the sketch from a flat math diagram into a piece of digital art. By jittering the y position slightly with i * 2, the glow “spreads” naturally.

// Layering loop for the glow effect
for (let i = 0; i < 30; i++) {
  stroke(255, 10); 
  strokeWeight(0.5 + i * 0.8); // Thickens the stroke for outer "blur"
  
  let glowCol = color(red(this.col), green(this.col), blue(this.col), 15);
  fill(glowCol);
  
  beginShape();
  // ... vertex calculations ...
  let y = (this.yOffset + i * 2) + sin(currentAngle) * this.amplitude;
  vertex(x, y);
  // ... 
  endShape(CLOSE);
}

Reflection and Future Ideas

This project taught me that nature in code doesn’t have to be random. By using pure trigonometry, I was able to simulate the rhythm of the ocean.

Future Ideas:

  • Perlin Noise Integration: I’d like to use noise to vary the amplitude so that “rogue waves” occasionally swell up.

  • Interactive Tides: Mapping the mouse position to the frequency of the waves, so moving the mouse makes the ocean “choppier.”

  • Twinkling Stars: Using a unique sine wave for every star so they shimmer at different frequencies.

Assignment 3

Concept:

Inspired by the painterly textures of schools of fish avoiding predators like sharks and the concept of “Mutual Attraction” from the Nature of Code, I developed a sketch imitating what you would see if you observed predators and prey in the ocean. My goal was to move away from literal physics simulations and create something that looks – well – cool. The sketch features a shimmering school of prey (teal brushstrokes) and a few large predators (dark purple shadows). The entire system is governed by mutual influence, where every body pulls on every other body, creating a complex, swirling choreography.

In this ecosystem, the movement is defined by a specific hierarchy of forces. The Prey are mutually attracted to one another to create a “schooling” effect, while Predators are naturally drawn toward the prey. To simulate life-like behavior, I flipped the gravitational pull into a strong repulsion whenever a predator gets too close to a prey object, causing the school to scatter in fear. Finally, I added territorial repulsion between the predators themselves; this ensures they don’t clump into a single mass and instead spread out to hunt across the infinite wrapping canvas (more on this later).

Sketch:

Highlight:

I’m particularly proud of how I used Mass to influence the Visual Design. Instead of just drawing a circle, the mass of the object dictates the strokeWeight of the brushstroke, and its velocity dictates the direction and length of the “paint” line:

show() {
  push();
  // Mass determines the thickness of the brush
  strokeWeight(this.mass / 2); 
  
  // Velocity determines the direction and length of the stroke
  let p = this.position;
  let v = this.velocity.copy().mult(4); 
  line(p.x, p.y, p.x + v.x, p.y + v.y);
  pop();
}

Process and Challenges:

I started by creating 100 small “Prey” objects with mutual attraction. This created a beautiful, tight-knit shimmering school.

Then, I introduced “Predators” with higher mass. Initially, they just drifted. I had to implement a Fear rule where the attraction force was multiplied by a negative number if the interaction was between a predator and a prey.
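A minimal sketch of that sign flip, with made-up values for `G`, the distance clamp, and the fear radius/multiplier:

```javascript
// Gravitational-style attraction with a "fear" flip: when a prey body looks
// at a nearby predator, the force sign inverts and strengthens. All constants
// here are illustrative, not the sketch's actual tuning.
function attractionForce(a, b, G = 1) {
  const dx = b.x - a.x, dy = b.y - a.y;
  const d = Math.max(Math.hypot(dx, dy), 5); // clamp to avoid force blow-ups
  let strength = (G * a.mass * b.mass) / (d * d);
  if (a.kind === "prey" && b.kind === "predator" && d < 100) {
    strength *= -3; // fear: attraction becomes strong repulsion
  }
  return { x: (dx / d) * strength, y: (dy / d) * strength };
}

const prey = { x: 0, y: 0, mass: 10, kind: "prey" };
const near = { x: 50, y: 0, mass: 40, kind: "predator" };
const far  = { x: 200, y: 0, mass: 40, kind: "predator" };
console.log(attractionForce(prey, near).x < 0); // true — scatters away in fear
console.log(attractionForce(prey, far).x > 0);  // true — ordinary pull at a distance
```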

I then experimented with how the creatures should leave the canvas. I settled on Screen Wrapping, as it allowed the painterly trails to feel continuous and infinite, rather than being interrupted by a wall.

A major hurdle was that the predators would eventually find each other and merge into one giant dark smudge. Because they were heavy, their mutual attraction was too strong to break. I solved this by adding a rule that specifically makes predators repel each other while still being attracted to the prey. You can see how they clumped together in this recording:

Reflection:

In the reading from The Computational Beauty of Nature, Gary William Flake discusses reductionism, the idea that systems can be understood through their individual parts. By defining just three simple rules (Attract, Scare, Repel), a complex ecosystem emerged on my screen. For future work, I want to explore Oscillation: I would love to make the fish shimmer or vibrate as they move, mimicking the way light reflects off fish scales in deep water. I could also address the clumping problem in the prey, which becomes apparent if you hold them together with the mouse for a while. But that’s a fix for another day.

Sun Bleached Flies – Assignment 2

Concept

Inspired by the swarming, atmospheric soundscapes of Ethel Cain’s Preacher’s Daughter, I wanted to recreate the jittery and unsettling movement of flies in nature.

Rather than just having objects fly randomly around the screen, I wanted to simulate the urge of a fly to either be stationary on the ground or burst into a manic, erratic flight. I focused on manipulating acceleration to distinguish between these two states: the heavy pull of landing and the weightless jitter of flying in the air.

Sketch

Highlight

I’m particularly proud of the state-switching logic within the update() function. It allows the fly to exist in two completely different physical modes (perched vs. flying) and uses a conditional check to ensure that the transition to landing only happens when the fly is actually descending toward the ground:

// Only force landing if they are below the line AND moving downwards
if (this.isFlying && this.pos.y >= height - 20 && this.vel.y > 0) {
  this.isFlying = false;
}

This bit of logic was crucial because it prevents the flies from accidentally tripping over the ground while they are trying to take off.

Reflection

Initially, the flies were ghosting through the floor. Even though I had a ground line, they would keep flying below it because their state was still set to “flying.” When I first tried to fix the ground collision, the flies became permanently stuck. Because they started on the ground, the force landing rule triggered at the exact same time as the take off urge, pinning them to the floor. I solved this by adding a grace rule: the ground check only applies if the fly is moving downwards (vel.y > 0). This finally allowed them to pop off the floor and enter their erratic flying state properly.

This assignment made me think back to the Flake reading about how simple rules create complex behaviors. By just giving the flies a 1% chance to change their mind, the screen starts to look like a living swarm. In the future, I want to replace the black circles with a PNG of a fly and potentially add a buzzing sound (like what you can faintly hear in the song “Family Tree”) that increases in volume or pitch based on the magnitude of the fly’s acceleration.

I’d also like to try making them avoid the mouse, simulating a swatting reaction using repulsive forces.

Assignment 1

Concept

For the first assignment, I decided to make a random walker using the Levy Flight algorithm. Additionally, I tweaked the motion so that the walker has a small chance of making a huge leap instead of always making small ones, just to make it a bit different. It reminds me of how fast-moving objects appear to us (such as how the Flash or Quicksilver seem to move in movies). I also mapped movement to scale, so the distance of a jump determines the thickness of the walker’s line and the size of the line’s head. To make it visually more appealing, I also decided to change the color of the head based on where it is on the sketch, creating a nice-looking gradient.

Sketch:

Highlight

Even though it’s pretty simple, I was proud of how I decided between small and big leaps using a probability threshold. Here is the code:

let r = random(1);
if (r < 0.05) {
  step.mult(random(25, 100)); // Big leap, 5% of the time
} else {
  step.mult(2); // Small leap otherwise
}

Reflection (Milestones, Challenges, Ideas)

An important milestone was successfully implementing the map() function to link two different mediums: spatial distance and visual scale. One challenge was figuring out the right reset interaction, before I settled on simply using a mouse click to leave it up to the viewer.

This sketch could be expanded on by maybe only changing the head’s color when a big leap is made, to signify the walker moving into a new “environment”, like stages of its life, or seasons of a show. It could also be made more visually appealing, but I’m not your guy for that (not yet, at least).

 

Reading Reflection – Week 1

Flake’s argument that nature is “frugal” and reuses the same simple rules for everything feels like a massive leap for us to make as a species. While he uses examples like the self-similarity in snowflakes and ferns to show how simple rules create complexity, I think we’re still too early in our development to claim we’ve figured out nature’s “program.” Now I’m not a hard science expert, but I find it hard to buy into his bold claim that nothing in nature stands above computational processes. We’re constantly discovering new things that break our old rules, and it feels a bit arrogant to assume that just because we’ve invented math and physics that can simulate a duck or an ant colony, we’ve actually decoded the fundamental “code” of the universe.

I also have a hard time with Flake’s “Silicon Laboratory” metaphor, mostly because it feels like it strips away the actual weight of being human. He talks about how groups like ant colonies or gazelle species find “solutions” through multiplicity, and while it’s true that human societies developed faster when we started sharing knowledge, being part of a group isn’t always a perfect “computational” win. In real life, groups can lead to things like mob violence or, on the other side of the coin, the deep grief of losing someone – emotional experiences that a computer simulation could never truly capture. Flake seems biased by his computer science background, seeing the world as a series of number mappings. It makes me wonder, if we reduce everything to simple rules, do we lose the ability to understand things like yearning or heartbreak that don’t follow a logic-based script?