Final Project Draft 2

I have made many changes and decisions in this code compared to my first draft.

  • Environmental Interactions:
    • My first draft focused on a dynamic terrain system with random terrain generation, a controlled terrain system, and effects like fog and lightning. The terrain was influenced by chaos mode and mouse interaction.
    • In the new code, the primary focus is on three distinct environmental effects: fire particles, water waves, and air particles, all interacting with a fractal terrain grid. The user can dynamically add land or water tiles by dragging the mouse and can add fire, water waves, or air particles using keyboard interactions.
  • Terrain Generation:
    • The terrain in my first draft was controlled by specific logic that included fractal terrain and height changes with chaos mode, along with a fog effect on top of it.
    • The new code generates a simpler, grid-based terrain where each tile is randomly set to land or water. The terrain also responds to mouse interaction, adding land or water depending on which key is held (a small sketch of this tile grid follows this list).
  • Particle Systems:
    • In my first draft, particles were part of the chaos system and affected the terrain, fog, and lightning, with an emphasis on visuals tied to chaos mode.
    • In the new code, particles are independent of the terrain and form distinct systems (fire, water, and air) that behave according to their type. They are displayed as independent entities that interact with the environment visually but not physically.
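
A minimal sketch of that tile-grid interaction, assuming hypothetical key bindings ('L' held while dragging paints land, 'W' paints water; the actual bindings in the sketch may differ):

// Each tile starts as random land/water; dragging the mouse while holding a key repaints tiles.
const TILE = 20;
let grid = []; // 1 = land, 0 = water

function setup() {
  createCanvas(400, 400);
  for (let x = 0; x < width / TILE; x++) {
    grid.push([]);
    for (let y = 0; y < height / TILE; y++) {
      grid[x].push(random() < 0.5 ? 1 : 0); // each tile randomly land or water
    }
  }
}

function draw() {
  for (let x = 0; x < grid.length; x++) {
    for (let y = 0; y < grid[x].length; y++) {
      fill(grid[x][y] === 1 ? color(80, 160, 60) : color(40, 90, 200));
      rect(x * TILE, y * TILE, TILE, TILE);
    }
  }
}

function mouseDragged() {
  const x = floor(mouseX / TILE);
  const y = floor(mouseY / TILE);
  if (x < 0 || y < 0 || x >= grid.length || y >= grid[0].length) return;
  if (keyIsDown(76)) grid[x][y] = 1;      // 'L' held: paint land
  else if (keyIsDown(87)) grid[x][y] = 0; // 'W' held: paint water
}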

 

 

Next step:

  • Sounds
  • Physical interaction between the land terrain and the water terrain
  • Something to represent why I am doing this (the concept behind the project)

Draft 2 – Khalifa Alshamsi

Enhanced Interactivity and User Experience

  • Starting Menu:
    • Introduced a welcoming start menu with instructions, creating a clear entry point for the simulation.
    • Added a clickable “Start” button to initiate the simulation, improving usability.

 

  • Planet Type Selection:
    • Implemented a dropdown menu for users to choose from three planet types: Earth, Mars, and Venus (a small sketch of this dropdown follows this list).
    • Each planet type features unique visual characteristics, such as:
      • Earth: Oceans and patches of green land.
      • Mars: Red-orange desert-like appearance.
      • Venus: Golden tones.
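
A minimal sketch of the planet-type dropdown, assuming the selection only swaps a base color (the real sketch has fuller visuals for each planet):

let planetSelect;

function setup() {
  createCanvas(400, 400);
  planetSelect = createSelect();
  planetSelect.option('Earth');
  planetSelect.option('Mars');
  planetSelect.option('Venus');
  planetSelect.position(10, 10);
}

function draw() {
  // Pick a base color per planet type
  if (planetSelect.value() === 'Earth') background(30, 90, 180);      // oceans
  else if (planetSelect.value() === 'Mars') background(190, 80, 40);  // red-orange desert
  else background(210, 170, 70);                                      // golden Venus
}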

 

  • Simulation Speed Control:
    • Added an adjustable slider to control the speed of the simulation in real time, ranging from 0.5x to 3x normal speed (see the sketch after this list).
    • Enabled dynamic speed adjustments without disrupting the simulation.
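
A minimal sketch of the speed-control idea, assuming an orbiting dot and a hypothetical simTime accumulator that the slider scales each frame:

let speedSlider;
let simTime = 0;

function setup() {
  createCanvas(400, 400);
  // Slider from 0.5x to 3x speed, default 1x, in steps of 0.1
  speedSlider = createSlider(0.5, 3, 1, 0.1);
  speedSlider.position(10, 10);
}

function draw() {
  background(0);
  // Advance simulation time by the slider value each frame, so changing the
  // slider speeds up or slows down motion without resetting the simulation.
  simTime += speedSlider.value();
  const x = width / 2 + 150 * cos(simTime * 0.02);
  const y = height / 2 + 150 * sin(simTime * 0.02);
  circle(x, y, 20); // an "orbiting planet" stand-in
}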

 

Future Directions

  1. Planet-Planet Interactions: Implement gravitational forces between planets to simulate more complex orbital dynamics (a rough sketch of the force calculation follows this list).
  2. Trail Visualization: Experiment with visual trails to trace orbits.
  3. Sound Integration: Add a soundscape to complement the visual elements, such as tones triggered by planetary movements.
  4. Customizable Sun: Allow users to adjust the sun’s mass or position, creating different gravitational effects.
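
A rough sketch of the pairwise gravity in item 1, assuming simple planet objects with position, velocity, and mass, and an arbitrary tuning constant G:

const G = 50; // arbitrary tuning constant, not a physical value
let planets = [];

function setup() {
  createCanvas(400, 400);
  planets = [
    { pos: createVector(150, 200), vel: createVector(0, 1.5), mass: 10 },
    { pos: createVector(250, 200), vel: createVector(0, -1.5), mass: 10 },
  ];
}

function draw() {
  background(0);
  for (let a of planets) {
    for (let b of planets) {
      if (a === b) continue;
      // F = G * m1 * m2 / r^2, directed from a toward b
      let dir = p5.Vector.sub(b.pos, a.pos);
      let r = constrain(dir.mag(), 10, 200); // avoid huge forces at tiny distances
      let force = dir.normalize().mult((G * a.mass * b.mass) / (r * r));
      a.vel.add(force.div(a.mass)); // acceleration = F / m
    }
  }
  for (let p of planets) {
    p.pos.add(p.vel);
    circle(p.pos.x, p.pos.y, p.mass);
  }
}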

 

Final Project Draft 2-Bringing a Flower to Life

Concept:

The idea of this project is to use art and technology to make a flower come alive. It’s about creating a flower that moves, reacts to sound, and interacts with people through mouse movements. Instead of being just a simple object, the flower feels lifelike and connected to its surroundings.

This is done using an Arduino Nano, sensors, and p5.js. The flower sways like it’s in the wind, reacts to sounds, and changes based on where you move your mouse. The project shows how we can use technology to make everyday objects interactive and bring them to life in creative ways.

Setting up the Flower:

Setting up the flower was simple but really exciting. I started by 3D printing the parts. The petals are white with a hint of orange inside, which gives them a soft, natural look. The stem is bright green, making it feel like a real flower, and the base is black to keep everything stable and balanced.

Inside the base, I will add all the electronics. I will use an Arduino Nano, a microphone sensor, and an RGB LED. The microphone will listen to sounds, like clapping or blowing, and the flower will react by swaying more when it detects them. I will carefully hide all the wires and parts inside the base, so from the outside, it will still look clean and simple—just like a real flower.

Arduino Code:
The Arduino code will handle the hardware interactions, making the flower responsive to its environment. It will be uploaded to an Arduino Nano and will collect data from the microphone sensor. This sensor will detect sounds like clapping or blowing and send the data to the p5.js program. The Arduino will also control the RGB LED at the base of the flower, allowing it to change colors in response to user interactions.

The microphone readings will be processed and sent over a serial connection to the computer. This data will allow the p5.js code to adjust the flower’s sway based on the intensity of the sounds. The Arduino code will ensure smooth communication between the hardware and the software, acting as the bridge that connects physical inputs to the digital flower.
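
A rough sketch of the receiving (p5.js) end of this serial link. The serial plumbing itself is left out; handleSerialLine() is a hypothetical callback that would be given one line of text (e.g. "512") whenever the Arduino sends a microphone reading:

let micLevel = 0;   // smoothed microphone intensity (0–1023)
let swayAmount = 5; // sway amplitude driven by sound

function setup() {
  createCanvas(300, 400);
}

// Hypothetical callback: receives one line of serial text from the Arduino.
function handleSerialLine(line) {
  const raw = int(line.trim());
  if (isNaN(raw)) return;               // ignore malformed lines
  micLevel = lerp(micLevel, raw, 0.2);  // smooth out jittery readings
  swayAmount = map(micLevel, 0, 1023, 2, 40); // louder sound -> bigger sway
}

function draw() {
  background(250);
  // Sway a simplified "stem" left/right with Perlin noise, scaled by the sound level
  const sway = (noise(frameCount * 0.02) - 0.5) * 2 * swayAmount;
  stroke(60, 140, 60);
  strokeWeight(6);
  line(width / 2, height, width / 2 + sway, height / 2);
}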

p5.js Code:
The p5.js code will bring the visual and interactive elements of the flower to life. It will process the data received from the Arduino and use it to control the flower’s movements and reactions. For example, when louder sounds are detected, the code will make the flower sway more dramatically, as if pushed by strong wind.

The petals will respond to mouse movements, with their size changing when the mouse moves up or down, and the RGB LED’s color changing as the mouse moves left or right. Perlin noise will be used to create smooth and natural-looking swaying motions for the flower and its leaves. The background will display a gradient that blends soft colors, like a sunset, to set a peaceful scene.

The p5.js code will integrate all these elements, creating a system where the flower not only looks alive but also reacts and interacts in real time. Together with the Arduino code, it will make the flower feel connected to its environment and user inputs.
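
A small sketch of the planned mouse interactions, using placeholder geometry: petal size follows mouseY, a stand-in "LED" color follows mouseX, and the background is a simple vertical gradient:

function setup() {
  createCanvas(300, 400);
}

function draw() {
  // Sunset-like vertical gradient
  for (let y = 0; y < height; y++) {
    stroke(lerpColor(color(255, 180, 120), color(120, 80, 160), y / height));
    line(0, y, width, y);
  }
  noStroke();
  // Petal size grows as the mouse moves up the canvas
  const petalSize = map(mouseY, 0, height, 80, 20);
  // "LED" hue shifts as the mouse moves left/right
  fill(map(mouseX, 0, width, 0, 255), 100, 200);
  for (let i = 0; i < 6; i++) {
    push();
    translate(width / 2, height / 2);
    rotate((TWO_PI / 6) * i);
    ellipse(0, -petalSize / 2, petalSize / 3, petalSize); // one petal
    pop();
  }
}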

 

Final Draft Progress 2 – Developing Concept

In the time since my previous draft, I made two more draft programs and solidified my concept. I am taking inspiration from Nam June Paik’s piece TV Garden. Using the algorithms from cellular automata and the game of life, the program will develop a garden filled with several different types of flowers. The user will be able to tweak some of the parameters for the game of life to change how the flowers grow and die on the grid. I also plan to add some user interaction by changing the Hydra visuals, but this will be somewhat limited due to how Hydra visuals are made. The second sketch focused on using the webcam aspect of Hydra and creating visuals intended to look like an old RGB TV overlaid on the camera; there is a slider there to modify the glitchiness of the webcam visual. The third draft focuses on developing the concept, creating 3D flowers and a basic grid for the cellular automata aspect. Here are the two sketches:

The flowers are a complex shape created using beginShape(), plotting each vertex via sine and cosine curves mapped to polar coordinates. Vertices are added four at a time to create a rectangular face, and this is repeated throughout the algorithm until the full flower is created. I also had to map these coordinates to a UV map so that textures, which are Hydra visuals, can be overlaid onto the flower. There are also two other types of flowers created with essentially the same algorithm, although a little simpler, since they do not have faces created, only the vertices. These vertex points then have either a repeating 2D letter or symbol laid onto them, or a simple block, creating flowers made out of text and blocks. These serve to add more variety to the garden and also lessen the reliance on the Hydra-visual flowers, which are extremely resource-intensive and slow the program down significantly. The code for the Hydra flowers can be seen here:

class Flower {
  constructor(x, y, z) {
    this.origin = createVector(x, y, z+15);
    this.v = []; //holds all the vertices
    this.rows = 15; //this is essentially the "resolution" of the flower, its polygon count
    this.cols = 30;
    this.minX = 1000000;
    this.maxX = -1000000;
    this.minY = 1000000;
    this.maxY = -1000000;
    this.choice = random(1); //used to choose one of the hydra canvases randomly
  }
  generate() {
    for (let theta = 0; theta < this.rows; theta++) { //create the coordinates for each vertex
      this.v.push([]);
      for (let phi = 0; phi < this.cols; phi++) { //r essentially maps x and y to polar coordinates
        let r =
          ((60 * pow(abs(sin(((5 / 2) * phi * 360) / this.cols)), 1) + 200) *
            theta) /
          this.rows;
        let x = r * cos((phi * 360) / this.cols);
        let y = r * sin((phi * 360) / this.cols);
        let z =
          this.vShape(180, r / 100, 0.8, 0.2, 1.5) -
          200 +
          this.bumpiness(2.5, r / 100, 12, (phi * 360) / this.cols);

        let pos = createVector(x, y, z);
        this.v[theta].push(pos);

        this.minX = min(this.minX, x); //this will be used to find the min-max range for the sin and cos waves, which will be needed to map the vertices to a UV map
        this.maxX = max(this.maxX, x);
        this.minY = min(this.minY, y);
        this.maxY = max(this.maxY, y);
      }
    }
  }
  show(pg, pg2) {
    push();
    if (this.choice <= 0.5) {
      texture(pg);
    } else {
      texture(pg2);
    }
    translate(this.origin);
    scale(0.1);
    for (let theta = 0; theta < this.v.length; theta++) { //grabs a vertex and three adjacent ones, which will combine to create a rectangle face
      for (let phi = 0; phi < this.v[theta].length; phi++) {
        if (theta < this.v.length - 1 && phi < this.v[theta].length - 1) {
          beginShape();
          vertex(
            this.v[theta][phi].x,
            this.v[theta][phi].y,
            this.v[theta][phi].z,
            map(this.v[theta][phi].x, this.minX, this.maxX, 0, 1), //uv mapping for texture
            map(this.v[theta][phi].y, this.minY, this.maxY, 0, 1)
          );
          vertex(
            this.v[theta + 1][phi].x,
            this.v[theta + 1][phi].y,
            this.v[theta + 1][phi].z,
            map(this.v[theta + 1][phi].x, this.minX, this.maxX, 0, 1),
            map(this.v[theta + 1][phi].y, this.minY, this.maxY, 0, 1)
          );
          vertex(
            this.v[theta + 1][phi + 1].x,
            this.v[theta + 1][phi + 1].y,
            this.v[theta + 1][phi + 1].z,
            map(this.v[theta + 1][phi + 1].x, this.minX, this.maxX, 0, 1),
            map(this.v[theta + 1][phi + 1].y, this.minY, this.maxY, 0, 1)
          );
          vertex(
            this.v[theta][phi + 1].x,
            this.v[theta][phi + 1].y,
            this.v[theta][phi + 1].z,
            map(this.v[theta][phi + 1].x, this.minX, this.maxX, 0, 1),
            map(this.v[theta][phi + 1].y, this.minY, this.maxY, 0, 1)
          );
          endShape(CLOSE);
        } else if (
          theta < this.v.length - 1 &&
          phi == this.v[theta].length - 1
        ) {
          beginShape(); //the first and last values are disconnected, because an array overflow would happen, so we have to manually create a second shape that closes the gap
          vertex(
            this.v[theta][phi].x,
            this.v[theta][phi].y,
            this.v[theta][phi].z,
            map(this.v[theta][phi].x, this.minX, this.maxX, 0, 1),
            map(this.v[theta][phi].y, this.minY, this.maxY, 0, 1)
          );
          vertex(
            this.v[theta][0].x,
            this.v[theta][0].y,
            this.v[theta][0].z,
            map(this.v[theta][0].x, this.minX, this.maxX, 0, 1),
            map(this.v[theta][0].y, this.minY, this.maxY, 0, 1)
          );
          vertex(
            this.v[theta + 1][0].x,
            this.v[theta + 1][0].y,
            this.v[theta + 1][0].z,
            map(this.v[theta + 1][0].x, this.minX, this.maxX, 0, 1),
            map(this.v[theta + 1][0].y, this.minY, this.maxY, 0, 1)
          );
          vertex(
            this.v[theta + 1][phi].x,
            this.v[theta + 1][phi].y,
            this.v[theta + 1][phi].z,
            map(this.v[theta + 1][phi].x, this.minX, this.maxX, 0, 1),
            map(this.v[theta + 1][phi].y, this.minY, this.maxY, 0, 1)
          );
          endShape(CLOSE);
        }
      }
    }
    pop();
  }

  vShape(A, r, a, b, c) { //creates the downward curve on the z-axis towards the center; c is the exponent passed in from generate()
    return A * pow(Math.E, -b * pow(abs(r), c)) * pow(abs(r), a);
  }

  bumpiness(A, r, f, angle) { //creates "bumps" on the flower petals, so they are not completely flat
    return 1 + A * pow(r, 2) * sin(f * angle);
  }
}

It is a very long algorithm that has to run every draw() frame, so it slows the program down a lot, and the effect becomes more apparent the more of these flowers there are. I greatly lowered the polygon count of the flowers, but there are still framerate problems, albeit far fewer than before. Ideally, the count would be lowered even more, but too much detail is lost if it is, and the shape stops resembling a flower.

The game of life aspect I used is very similar to our class example, just modified to show the flowers on top of the cells that are alive. Because stepping it every frame would overwhelm the program, the next generation of cells is advanced manually, with the press of a button by the user. Because of how resource-intensive the flowers are, the grid is very small, only 5×5, which often creates uninteresting patterns.
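
A condensed sketch of that setup, assuming standard Game of Life rules on a 5×5 grid, a button that advances one generation, and a placeholder circle standing in for a flower on each live cell:

const N = 5;
let cells;

function setup() {
  createCanvas(300, 300);
  cells = Array.from({ length: N }, () =>
    Array.from({ length: N }, () => (random() < 0.4 ? 1 : 0))
  );
  createButton('Next generation').mousePressed(step);
}

function step() {
  const next = cells.map((row) => row.slice());
  for (let x = 0; x < N; x++) {
    for (let y = 0; y < N; y++) {
      let n = 0; // count live neighbours (edge cells simply have fewer neighbours)
      for (let dx = -1; dx <= 1; dx++) {
        for (let dy = -1; dy <= 1; dy++) {
          if (dx === 0 && dy === 0) continue;
          const nx = x + dx, ny = y + dy;
          if (nx >= 0 && ny >= 0 && nx < N && ny < N) n += cells[nx][ny];
        }
      }
      // Standard B3/S23 rules
      next[x][y] = cells[x][y] === 1 ? (n === 2 || n === 3 ? 1 : 0) : (n === 3 ? 1 : 0);
    }
  }
  cells = next;
}

function draw() {
  background(230);
  const w = width / N;
  for (let x = 0; x < N; x++) {
    for (let y = 0; y < N; y++) {
      noFill();
      rect(x * w, y * w, w, w);
      if (cells[x][y] === 1) {
        fill(220, 100, 180); // a live cell gets a "flower" (placeholder circle)
        circle(x * w + w / 2, y * w + w / 2, w * 0.6);
      }
    }
  }
}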

My next steps will be to see if I can optimize the flowers even more, to reclaim more framerate for the program, which will hopefully allow me to use a larger grid. I am going to experiment with p5.Framebuffers instead of p5.Graphics objects, which are supposed to have better performance when working with WEBGL. After that, I will add the user interaction, which will be the modifying of the game of life rules, and finally brush up the visuals. If I have the time, I also want to implement more Hydra visuals that the Hydra instances can cycle through, so there is more variety than the current two. I am not quite sure how to implement this yet, so I will have to experiment somewhat. I don’t want too many Hydra canvases running at once, which would slow the program even more, so I might instead implement some sort of state machine that cycles through different visuals.

I also wanted to add some grass or other greenery so the flat plane is not as visible, but I think I won’t be able to, due to performance. The camera is also offset right now, and I need to move it so it is properly looking at the garden. I’ve been having trouble with this so far, but hopefully it won’t be too difficult to fix. One last thing I am considering is adding TVs of some sort, to further show this program’s influence from TV Garden. This is a low priority for now, though.
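A minimal sketch of the p5.Framebuffer experiment I want to try, assuming a WEBGL canvas and a recent p5.js version with createFramebuffer(); the spinning box is just a placeholder for a Hydra-textured flower:

let fb;

function setup() {
  createCanvas(400, 400, WEBGL);
  fb = createFramebuffer(); // stays on the GPU, unlike a p5.Graphics copy
}

function draw() {
  // Render a placeholder object into the framebuffer
  fb.begin();
  clear();
  push();
  normalMaterial();
  rotateY(frameCount * 0.01);
  box(80);
  pop();
  fb.end();

  // Use the framebuffer directly as a texture on other geometry
  background(0);
  texture(fb);
  plane(300, 300);
}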

Final Project Draft #2

After looking more into cellular automata, I realized that it is very restrictive and my project won’t work with it, so I tried using fractals instead.

This was the result:

 

At first, I wanted one fractal that expands depending on where the mouse is, but somehow that did not work and it came out like this. I like it because it gives off a “nature” vibe and makes me think of many things we find in nature.

Final project draft 2

For my second draft, I improved my sketch further by controlling the flow field with the mouse position rather than Perlin noise. I also added a boat image, just for visualization purposes.
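
A condensed sketch of this idea, assuming the field simply points every particle toward the mouse; the translucent background rectangle is one common way to get the fading-trail effect I am still working on:

let particles = [];

function setup() {
  createCanvas(400, 400);
  for (let i = 0; i < 200; i++) {
    particles.push(createVector(random(width), random(height)));
  }
  background(20, 40, 80);
}

function draw() {
  // Translucent overlay instead of a full clear, so old positions fade slowly
  noStroke();
  fill(20, 40, 80, 15);
  rect(0, 0, width, height);

  stroke(200, 230, 255);
  strokeWeight(2);
  for (let p of particles) {
    // Field direction at this point: toward the mouse
    const dir = createVector(mouseX - p.x, mouseY - p.y).setMag(2);
    point(p.x, p.y);
    p.add(dir);
    // Respawn particles that reach the mouse or leave the canvas
    if (dist(p.x, p.y, mouseX, mouseY) < 5 || p.x < 0 || p.x > width || p.y < 0 || p.y > height) {
      p.set(random(width), random(height));
    }
  }
}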

Sketch:

Challenges:

    • Getting the flow field to leave a trail without the boat repeating itself ✅
    • Adding a fading effect for the flow field ❎

Things to add:

  • a boat drawn with p5.js (Bézier curves, decorated with fractals)
  • a more realistic water background with ripples
  • fix the edge issue

Final Project Draft 2

The changes I have made are mostly visual, and I also changed the hard-coded word generation into real-time generation using the Datamuse API. I used CSS and vanilla JS to improve the styling of the game’s page. I would still like to include more mechanics and to add more visually aesthetic assets, such as a proper jar instead of a rectangle.
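
A minimal sketch of fetching words from the Datamuse API in a p5.js sketch, assuming the ml ("means like") parameter; the parameters the game actually uses may differ:

let words = [];

function setup() {
  createCanvas(400, 200);
  // Datamuse returns a JSON array of {word, score} objects
  fetch('https://api.datamuse.com/words?ml=memory&max=10')
    .then((res) => res.json())
    .then((data) => {
      words = data.map((entry) => entry.word);
    });
}

function draw() {
  background(250);
  fill(0);
  textSize(16);
  for (let i = 0; i < words.length; i++) {
    text(words[i], 20, 30 + i * 18);
  }
}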

Neon Doodle World – Final Project Draft 2

Progress Made
In this version, I made several improvements to enhance the sketch’s usability and interactivity. Here’s what has been implemented so far:

  • Color Palette: Users can now select specific colors from a palette instead of cycling through them randomly.
  • Toolbar Buttons: The interface now includes clear, save, and toggle drawing buttons, making interactions more intuitive.
  • Pause and Resume Drawing: The toggle drawing button lets users stop and resume drawing as needed.
  • Improved Introduction Page: I added an intro page that welcomes users and explains the sketch’s functionality before they start.

These updates make the project more polished and user-friendly compared to the first draft. The toolbar and the color palette, in particular, provide users with better control over their drawing experience.
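
A small sketch of the toolbar idea: clickable color swatches plus a toggle button that pauses and resumes drawing. The colors, sizes, and layout here are placeholders, not the project’s actual values:

const palette = ['#ff2079', '#04d9ff', '#39ff14', '#ffe700'];
let currentColor;
let drawingEnabled = true;

function setup() {
  createCanvas(500, 400);
  background(10);
  currentColor = palette[0];
  // One button per palette colour
  palette.forEach((c, i) => {
    const b = createButton('');
    b.position(10 + i * 30, height + 10);
    b.size(24, 24);
    b.style('background-color', c);
    b.mousePressed(() => (currentColor = c));
  });
  const toggle = createButton('Toggle drawing');
  toggle.position(10 + palette.length * 30 + 10, height + 10);
  toggle.mousePressed(() => (drawingEnabled = !drawingEnabled));
}

function draw() {
  if (drawingEnabled && mouseIsPressed && mouseY < height) {
    stroke(currentColor);
    strokeWeight(4);
    line(pmouseX, pmouseY, mouseX, mouseY);
  }
}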

Embedded Sketch

Next Steps for the Final Version
For the final version, I plan to focus on:

  • Undo Feature: Adding a way for users to undo their last drawing action for better control.
  • Save with Webcam Overlay: Enhancing the save functionality to include both the webcam feed and the doodle layer in the output (see the sketch after this list).
  • Neon-Themed UI: Updating the interface with a neon design for a more visually engaging experience.
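
A minimal sketch of that save idea, assuming a hypothetical doodleLayer graphics buffer: the webcam frame and the doodle layer are both drawn onto one canvas, so saveCanvas() captures them together:

let video, doodleLayer;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  doodleLayer = createGraphics(640, 480);
}

function draw() {
  image(video, 0, 0);       // webcam feed underneath
  image(doodleLayer, 0, 0); // doodles on top
  if (mouseIsPressed) {
    doodleLayer.stroke(0, 255, 200);
    doodleLayer.strokeWeight(4);
    doodleLayer.line(pmouseX, pmouseY, mouseX, mouseY);
  }
}

function keyPressed() {
  if (key === 's') saveCanvas('doodle-with-webcam', 'png'); // saves both layers
}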

These improvements will further refine the project, making it more interactive and visually appealing.

Final Draft 2 by Dachi (Update)

I removed the unspiraling effect as I thought it took away from the interaction.

I added some noise and a contrast adjustment to mimic the horror aspect of the manga a bit more.

I added a sort of second phase warp, which has an acceleration component and kicks in after 200 frames, to mimic the spiral behavior from the anime.
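
A toy sketch of just that timing/acceleration logic (not the real pixel warp): after frame 200 an angular velocity starts accelerating, pulling a curve into a tightening spiral:

let angVel = 0;
let angle = 0;

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(15);
  if (frameCount > 200) {
    angVel += 0.0005; // acceleration component kicks in after 200 frames
  }
  angle += angVel;
  stroke(230);
  noFill();
  translate(width / 2, height / 2);
  beginShape();
  for (let i = 0; i < 300; i++) {
    const a = angle + i * 0.2;
    const r = 150 - i * 0.45; // radius shrinks along the curve -> spiral
    vertex(r * cos(a), r * sin(a));
  }
  endShape();
}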

Plus some other changes to the canvas and the general codebase.

 

Sketch Updated: p5.js Web Editor | Uzumaki v2

Final Project Draft 2

Updates

I added a start button to the sketch to better organize it. I also added a tree using fractals, and I intend to have the birds interact with the tree.
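
A minimal sketch of the kind of recursive fractal tree I mean; the lengths, angles, and cutoff are arbitrary placeholder values:

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(235);
  stroke(80, 50, 20);
  translate(width / 2, height);
  branch(100);
  noLoop(); // static tree; remove to animate
}

function branch(len) {
  strokeWeight(map(len, 4, 100, 1, 6));
  line(0, 0, 0, -len);
  translate(0, -len);
  if (len > 4) {
    push();
    rotate(PI / 6);  // right branch
    branch(len * 0.67);
    pop();
    push();
    rotate(-PI / 6); // left branch
    branch(len * 0.67);
    pop();
  }
}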

Embedded Sketch