Code and Design toolbox for p5js

Below are different examples to get you started with understanding how different elements work. They are intended to be bits and pieces for you to experiment with and combine.  

Make sure to also look at the p5.js website, especially the reference and the examples.

More code examples can also be found on the examples page.

Set colors

GUI example/base sketch

This little sketch has been set up with best practices for configuring p5.js for apps and small installations. It includes:


A bit about colors

Programmers have a tendency to pick really ugly colors. Don't be that programmer. Use Google's color picker to pick colors for your experiments. The color system is based on Red, Green and Blue, so copy the RGB values from the picker.

When you start combining multiple colors, you can use Adobe's color trends to learn from the best when it comes to color combinations. I am using the one above this tutorial.

Coding with colors

Colors are made of Red, Green and Blue values from 0 to 255. By mixing those three channels you can get more than 16 million different colors.

If you want the background to be blue then write:

background(0,0,255);

If you want to fill an ellipse with a semitransparent red color then add another parameter for fill like this:

fill(255,0,0,150);

ellipse(40,40,50,50);

If you want to set the stroke and not have a fill, do the following:

stroke(255,0,150);

noFill();

ellipse(40,40,50,50);

Drawing app

Basic drawing example. By drawing an ellipse at the mouse position every frame we can draw on the canvas:

 ellipse(mouseX, mouseY, 20, 20);

The reason this works is that we are not clearing the canvas each frame with background(0);

However, this makes for a very dotted drawing app. To get a continuous line we need to draw a line between the last position of the mouse (pmouseX/Y) and the current position (mouseX/Y) like so:

line(pmouseX, pmouseY, mouseX, mouseY);

More info about line, pmouseX and mouseX

This app is a good starting point for experimenting with things you can do: change the color of the line, combine it with the color change example below, combine it with the pose app further down, and add randomness to make more interesting things.
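To see the pieces together, here is a minimal sketch of the drawing app (the canvas size and stroke weight are assumptions; the original example may differ):

let drawingReady = false;

function setup() {
  createCanvas(480, 480);
  background(0); // only clear once, in setup, so the drawing stays on screen
}

function draw() {
  stroke(255);
  strokeWeight(4);
  // draw from the previous mouse position to the current one for a continuous line
  line(pmouseX, pmouseY, mouseX, mouseY);
}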

Change the color on mouseover

Use dist() to calculate the distance between two points. Combine this with an if statement and you can change the color when the distance from the mouse to the circle's center is less than the circle's radius.

  if (dist(mouseX, mouseY, 240, 240) < 150 / 2) {
    fill(255, 0, 0, 150);
  } else {
    fill(209, 157, 44, 70);
  }

https://p5js.org/reference/#/p5/if-else 

https://p5js.org/reference/#/p5/dist 
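Put together, a minimal sketch for a circle at (240, 240) with a diameter of 150 that changes color on mouseover could look like this (the canvas size is an assumption):

function setup() {
  createCanvas(480, 480);
}

function draw() {
  background(220);
  // change the fill when the mouse is closer to the center than the radius
  if (dist(mouseX, mouseY, 240, 240) < 150 / 2) {
    fill(255, 0, 0, 150);
  } else {
    fill(209, 157, 44, 70);
  }
  ellipse(240, 240, 150, 150);
}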

Use the Google font library

Another programmer mistake is not being mindful of the fonts you are using. It is common to just use the default font or a random font. A good place to start is to look at Google Fonts and see if you can find one that matches your concept. This example uses two fonts loaded directly from Google's font library. To use this, do the following:

Picking the right font for your project is a science in itself; read more here.

https://p5js.org/reference/#/p5/textFont 

Be aware that this sketch has a method called loadGoogleFont in helper.js, and index.html links to webfonts.js. Both of these need to be in your project for it to work. See the example for details.
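As a rough sketch of how the pieces might fit together - this assumes the loadGoogleFont helper takes the font name as a string, so check helper.js in the example for the actual signature and where it should be called:

function setup() {
  createCanvas(480, 480);
  loadGoogleFont('Roboto'); // assumed signature - see helper.js for the real one
}

function draw() {
  background(255);
  textFont('Roboto'); // once the webfont is loaded, textFont() can use it by name
  textSize(48);
  text('Hello fonts', 40, 100);
}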

Use a keyboard

You can use the function keyReleased() to register a key press:

function keyReleased() {
  if (key == 'a') {
    print("An A was pressed");
  }
}

Notice that the function "keyReleased()" is already present in the sketches. This is because it is used to full-screen when you press "f". So you only need to add the if statement within the outer curly brackets.

https://p5js.org/reference/#/p5/keyReleased

You can also make an if statement in your draw() function. Then it will do something for as long as the key is held down (e.g. moving a ball forward).

function draw() {
  if (keyIsPressed === true && key === 'a') {
    print("An A is pressed down");
  }
}

Notice that the function "draw()" is already present in the sketches. So you only need to add the if statement within the outer curly brackets to the existing draw function.

https://p5js.org/reference/#/p5/keyIsPressed

Play a sound sample

Playing sound samples is a quick way to make interactive experiences. To do so you need to upload your sound sample (see the note on uploading below). Then you need to load the sample in preload():

song = loadSound('punch.mp3');

To play the song call:

 song.play();

https://p5js.org/reference/#/p5.SoundFile

To play a sound you need to upload it to your sketch. Press the arrow and press upload. Avoid special characters and spaces in the filename.
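Putting the pieces together, a minimal sketch that loads the sample and plays it on a mouse click might look like this (assuming you have uploaded a file called punch.mp3 and the p5.sound library is included):

let song;

function preload() {
  song = loadSound('punch.mp3');
}

function setup() {
  createCanvas(480, 480);
}

function mousePressed() {
  song.play(); // browsers require a user interaction before audio can play
}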

Find sound effects here:

https://sound-effects.bbcrewind.co.uk/search?

Images

A bit about using images

Visually you have three strategies for making illustrations. You can use the code-based drawing tools (ellipse etc.), you can use a photo, or you can use a vector illustration. They are aesthetically very different, and mixing them usually results in a very messy expression. So choose wisely and be mindful of how the overall expression comes together.

If you want to design your own vector illustrations you can use Google Drawings or Inkscape. If you want to manipulate photos, GIMP is a free editor. The illustration above has been made with Google Drawings.

First, upload the image. Then load the image in preload():

img = loadImage('star.svg');

Draw the image at a position:

 image(img, 40, 40);

https://p5js.org/reference/#/p5/image
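A minimal sketch combining the two calls (assuming you have uploaded a file called star.svg):

let img;

function preload() {
  img = loadImage('star.svg');
}

function setup() {
  createCanvas(480, 480);
}

function draw() {
  background(255);
  image(img, 40, 40); // draws the image with its top-left corner at (40, 40)
}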

Online image resources.

Make sure to use free, creative commons or public domain images for your projects so you do not run into a copyright problem if you want to publish your project at a later stage. 

Get frequencies of sound

This example records sound from the microphone and calculates its frequency spectrum. This can be used for many things, e.g. to visualize noise levels or to react when certain frequencies get above a certain level.

The call spectrum[0] will get the value of the first frequency band, and so forth.
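The original example embeds the full code, but as a rough sketch of the idea using p5.FFT and p5.AudioIn from the p5.sound library, it could look like this:

let mic, fft;

function setup() {
  createCanvas(480, 480);
  mic = new p5.AudioIn();
  mic.start();
  fft = new p5.FFT();
  fft.setInput(mic); // analyse the microphone instead of the main output
}

function draw() {
  background(0);
  let spectrum = fft.analyze(); // array of values 0-255, low to high frequencies
  stroke(255);
  for (let i = 0; i < spectrum.length; i++) {
    let x = map(i, 0, spectrum.length, 0, width);
    let y = map(spectrum[i], 0, 255, height, 0);
    line(x, height, x, y);
  }
}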

Get audio volume

This example records sound from the microphone and calculates its amplitude (volume). This can be used for many things, e.g. to visualize noise levels.

amplitude.getLevel()

This returns a value between 0 and 1.
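As a rough sketch of the idea, using p5.Amplitude from the p5.sound library with the microphone as input:

let mic, amplitude;

function setup() {
  createCanvas(480, 480);
  mic = new p5.AudioIn();
  mic.start();
  amplitude = new p5.Amplitude();
  amplitude.setInput(mic);
}

function draw() {
  background(0);
  let level = amplitude.getLevel(); // between 0 and 1
  let size = map(level, 0, 1, 10, width); // louder sound = bigger circle
  fill(255);
  ellipse(width / 2, height / 2, size, size);
}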

Bouncing balls example

This is a pretty advanced example, but it illustrates how you can simulate physics. Try to play with the parameters at the top to see how it changes the behaviour:

let numBalls = 20;
let spring = 0.05;
let gravity = 0.03;
let friction = -0.9;

The reason the balls have light trails is that the background is semi-transparent. Based on code from Keith Peters. Multiple-object collision.
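The trail trick itself is small: instead of clearing with an opaque background, draw a semi-transparent background every frame so old frames fade out slowly. A minimal sketch of just that idea (with an ellipse standing in for the moving balls):

function setup() {
  createCanvas(480, 480);
}

function draw() {
  // a low alpha value means previous frames fade out slowly, leaving light trails
  background(0, 25);
  noStroke();
  fill(255);
  ellipse(mouseX, mouseY, 30, 30); // stands in for the moving balls
}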

Pong example

This is a simple pong example. It moves a ball around on the canvas. If the ball is within the area of the square, the ball's direction is reversed on the axis that collided.
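The core of the collision logic is just a couple of if statements. A minimal sketch of the same idea, with the ball bouncing off the edges of the canvas (sizes and speeds are assumptions):

let x = 100;
let y = 100;
let dx = 3; // horizontal speed per frame
let dy = 2; // vertical speed per frame

function setup() {
  createCanvas(480, 480);
}

function draw() {
  background(220);
  x += dx;
  y += dy;
  // reverse direction on the axis that hit an edge
  if (x < 0 || x > width) dx = -dx;
  if (y < 0 || y > height) dy = -dy;
  ellipse(x, y, 20, 20);
}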

Advanced libraries

Use a gamepad / joystick

A gamepad is a quite versatile input device that can be used as-is, and it can also be hacked into other form factors with the right tools. It gives you a lot of analogue inputs to work with, and some Arduino boards can also simulate a joystick.

When a gamepad is present you can get the different parameters with e.g.:

gamePads[0].state["LEFT_STICK_X"]

Other names for buttons and sticks: 

FACE_1, FACE_2, FACE_3, FACE_4, LEFT_TOP_SHOULDER, RIGHT_TOP_SHOULDER, LEFT_BOTTOM_SHOULDER, RIGHT_BOTTOM_SHOULDER, SELECT_BACK, START_FORWARD, LEFT_STICK, RIGHT_STICK, DPAD_UP, DPAD_DOWN, DPAD_LEFT, DPAD_RIGHT, HOME, LEFT_STICK_X, LEFT_STICK_Y, RIGHT_STICK_X, RIGHT_STICK_Y
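As a hedged sketch of how you might use this - assuming the gamepad library from the example is loaded and has populated the gamePads array - a stick value (typically between -1 and 1) can be mapped to a position on the canvas:

function draw() {
  background(220);
  if (gamePads.length > 0) {
    // stick values usually run from -1 to 1; map them to canvas coordinates
    let x = map(gamePads[0].state["LEFT_STICK_X"], -1, 1, 0, width);
    let y = map(gamePads[0].state["LEFT_STICK_Y"], -1, 1, 0, height);
    ellipse(x, y, 30, 30);
  }
}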


Pose tracking for bodily interaction

Uses the camera to make a skeleton of a person and track different parts of that person. This is similar to Kinect tracking, but it only uses the webcam. To get one of the points it is tracking, use:

getLimpPosition(0, 0);

The first 0 is the person id and the second is the point on that person's skeleton.
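A hypothetical snippet of how it might be used - this assumes getLimpPosition(personId, pointId) returns an object with x and y properties, so check the actual example to confirm the return format and which id corresponds to which body part:

function draw() {
  background(220);
  // 0, 0: first tracked person, first point on the skeleton (assumed {x, y} object)
  let point = getLimpPosition(0, 0);
  if (point) {
    ellipse(point.x, point.y, 20, 20);
  }
}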



The drawing example uses a Graphics element to draw on; read more here.

Face and mood detection

Artificial intelligence can be used to recognize faces and assess their mood. This can e.g. be used to make an Instagram filter.

The diagram to the right gives you the different points. For example, you can find the nose by using point number 41. You can get the individual positions and draw an ellipse with this code:

 ellipse(positions[42][0], positions[42][1], 20, 20);

You can get how angry a person is with this code:

predictedEmotions[0].value




From: https://github.com/stc/face-tracking-p5js

Based on this library: https://github.com/auduno/clmtrackr

Based on this paper: https://dl.acm.org/doi/10.1007/s11263-010-0380-4



Read more about face mesh here: https://google.github.io/mediapipe/solutions/face_mesh.html

Object detection library

Object detection libraries have been taught to recognize objects through a huge library of images of different objects - the cocossd model has the following syntax:

for (let i = 0; i < detections.length; i++) {
  let object = detections[i];
  rect(object.x, object.y, object.width, object.height);
  text(object.label, object.x + 10, object.y + 24);
}

The for loop runs through the objects detected in the detections array. For each object, it draws a rectangle around the object and adds a text label.

Be aware that the library takes a while to load.
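A rough outline of the surrounding setup with ml5's cocossd detector (assuming the ml5 library is included in index.html; the original example may structure this differently):

let video;
let detector;
let detections = [];

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  // load the cocossd model, then start detecting on the video feed
  detector = ml5.objectDetector('cocossd', () => detector.detect(video, gotDetections));
}

function gotDetections(error, results) {
  if (error) {
    return console.error(error);
  }
  detections = results;
  detector.detect(video, gotDetections); // ask for the next detection
}

function draw() {
  image(video, 0, 0);
  for (let i = 0; i < detections.length; i++) {
    let object = detections[i];
    noFill();
    stroke(0, 255, 0);
    rect(object.x, object.y, object.width, object.height);
    noStroke();
    fill(0, 255, 0);
    text(object.label, object.x + 10, object.y + 24);
  }
}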

Speech to text

Speech to text will listen to the audio input and convert it into text. It is not precise, but surprisingly good. You can then try to detect words and use them to do things. Right now it is responding to "kage" (Danish for "cake"). The small text is the interim result and the large text is the final result.

Text to speech

This example plays a string of text out loud. This way you can make interactions that are based on spoken language instead of visuals. To have the voice say "Husk at spise fisk" (Danish for "Remember to eat fish") you would need to write:

playText("Husk at spise fisk");

Text to speech & speech to text

This is a combined sketch which allows you to listen to  voices and return a string based on keyword matches.


Teach the machine to detect  different poses

In this example you record different kinds of poses, and through machine learning it will recognize which pose you are in.

Weather

Get the current weather information. The URL returns a JSON file with different information:


{"coord":{"lon":12.0803,"lat":55.6415},"weather":[{"id":800,"main":"Clear","description":"clear sky","icon":"01d"}],"base":"stations","main":{"temp":4.45,"feels_like":-2.73,"temp_min":3.89,"temp_max":5,"pressure":1026,"humidity":41},"visibility":10000,"wind":{"speed":6.17,"deg":310},"clouds":{"all":0},"dt":1614951920,"sys":{"type":1,"id":1588,"country":"DK","sunrise":1614923558,"sunset":1614963240},"timezone":3600,"id":2614481,"name":"Roskilde","cod":200}

Go to https://openweathermap.org/ for more information.
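As a sketch of how you could load it in p5, using loadJSON() (the URL and API key below are placeholders - you need your own key from openweathermap.org):

let weather;

function setup() {
  createCanvas(480, 480);
  // placeholder URL: replace the city and YOUR_API_KEY with your own values
  let url = 'https://api.openweathermap.org/data/2.5/weather?q=Roskilde&units=metric&appid=YOUR_API_KEY';
  loadJSON(url, gotWeather);
}

function gotWeather(data) {
  weather = data;
}

function draw() {
  background(255);
  if (weather) {
    textSize(32);
    text(weather.name + ': ' + weather.main.temp + ' °C', 20, 60);
  }
}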


Style transfer model

Use style transfer to generate different styles from Runway ML. You need to create your own publicly hosted style transfer URL.

https://app.runwayml.com/model-collection/stylization

Draw on an image instead of the canvas

When things get more advanced you often want to draw on an image instead of directly on the canvas. This allows you to send the image to machine learning, save the image, and refresh elements on the screen without affecting the drawing you are making.
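A minimal sketch of the idea using createGraphics(): draw into an off-screen graphics buffer and then display it on the canvas each frame:

let pg;

function setup() {
  createCanvas(480, 480);
  pg = createGraphics(480, 480); // off-screen image to draw on
}

function draw() {
  background(220); // the canvas can be refreshed every frame...
  pg.noStroke();
  pg.fill(255, 0, 0);
  pg.ellipse(mouseX, mouseY, 20, 20); // ...while the drawing accumulates in pg
  image(pg, 0, 0); // display the buffer on the canvas
}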

Hand tracking

This example grabs an image from the webcam feed and uses machine learning to detect a hand. The points can then be used to make interactive installations where you are not touching the interface, and for gesture-based experiments.

https://editor.p5js.org/hobyedk/sketches/KTMbYaRDe



Text to points example

This little example converts text to points - so you can animate and make effects with text.
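The core call is textToPoints() on a font loaded with loadFont(). A minimal sketch (assuming you have uploaded a font file, here called myFont.ttf as a placeholder):

let font;
let points = [];

function preload() {
  font = loadFont('myFont.ttf'); // placeholder filename - upload your own .ttf/.otf
}

function setup() {
  createCanvas(480, 480);
  // convert the text to an array of {x, y} points along the letter outlines
  points = font.textToPoints('hello', 40, 240, 120, { sampleFactor: 0.2 });
}

function draw() {
  background(0);
  noStroke();
  fill(255);
  for (let p of points) {
    ellipse(p.x, p.y, 4, 4);
  }
}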

Capture an image from the webcam

This example shows you how to capture an image from the webcam. This is a good starting point for a photo booth, stop motion, machine learning etc.
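A minimal sketch using createCapture(): show the live feed and grab a still image when a key is pressed (using the "c" key here is an assumption; the original example may do it differently):

let capture;
let snapshot;

function setup() {
  createCanvas(640, 480);
  capture = createCapture(VIDEO);
  capture.size(640, 480);
  capture.hide(); // hide the default HTML video element; we draw it ourselves
}

function draw() {
  if (snapshot) {
    image(snapshot, 0, 0); // show the captured still
  } else {
    image(capture, 0, 0); // show the live feed
  }
}

function keyReleased() {
  if (key == 'c') {
    snapshot = capture.get(); // copy the current frame into an image
  }
}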

Make a noisy circle

This example uses trigonometry to draw a circle with a noisy, wobbly outline.

https://processing.org/tutorials/trig
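A minimal sketch of the idea: use cos() and sin() to walk around the circle and noise() to wobble the radius (the sizes and noise scaling are assumptions):

function setup() {
  createCanvas(480, 480);
}

function draw() {
  background(255);
  noFill();
  stroke(0);
  beginShape();
  for (let a = 0; a < TWO_PI; a += 0.05) {
    // the noise value varies smoothly around the circle and over time
    let wobble = map(noise(cos(a) + 1, sin(a) + 1, frameCount * 0.01), 0, 1, -30, 30);
    let r = 150 + wobble;
    vertex(width / 2 + r * cos(a), height / 2 + r * sin(a));
  }
  endShape(CLOSE);
}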

Projection mapping

https://github.com/jdeboi/p5.mapper

Pick a color from a webcam

This little example picks a color from the webcam feed.
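The core of the idea is get() on the capture: read the pixel color under the mouse and use it as a fill. A rough sketch (assuming a 640 x 480 capture):

let capture;

function setup() {
  createCanvas(640, 480);
  capture = createCapture(VIDEO);
  capture.size(640, 480);
  capture.hide();
}

function draw() {
  image(capture, 0, 0);
  // read the color of the webcam pixel under the mouse
  let c = capture.get(mouseX, mouseY);
  fill(c);
  noStroke();
  ellipse(mouseX, mouseY, 60, 60); // show the picked color
}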

Free music, video, samples, images and clipart to use

Audio

YouTube has a free audio library that you can use for sound effects and background music etc.
A list of free music sites
A site with loads of free audio clips
Free CC clips

https://www.streambeats.com/

Video

Vimeo has a large collection of videos published under Creative Commons. In the link above we use the keyword "robot". Be aware that there are different CC license types, so make sure you use the right one in the search criteria. Read more here
A collection of free clips to use

https://www.dendigitalejournalist.dk/gratis-ressourcer/

Images

https://unsplash.com/

Clipart

Creative commons licenses

Make sure to understand the license you are using - see below for CC definitions.

CC-BY-SA: ShareAlike. Anything that is licensed this way should be made available under the same license.

CC-BY-NC: NonCommercial. You can't use this file in anything that's intended for commercial gain without written permission from the artist.

CC-BY-ND: Attribution-NoDerivs. Any file released under this license should not be remixed in any way, including using it as a soundtrack to a video.

https://creativecommons.org/

BACKED UP CODE FOR THE DIFFERENT EXAMPLES ABOVE

#### INPROGRESS ####

Oscillation patterns

PIE CHART

Detect gestures and objects


https://editor.p5js.org/hobyedk/sketches/9kimDQnVr



https://github.com/ml5js/ml5-library/tree/main/examples/p5js/StyleTransfer/StyleTransfer_Video

Style transfer

https://editor.p5js.org/hobye/sketches/S_Xor8D2k

https://yining1023.github.io/styleTransfer_spell/


https://blog.paperspace.com/creating-your-own-style-transfer-mirror/


https://www.amygoodchild.com/blog/curved-line-jellyfish?fbclid=IwAR2R21gdQYCaaPKlQCeiKA7EEsgl6-17-jdR14VIeed_8ybIUZXZzUITUUM

- [ ] 29/06/2021 mediapipe - handtracking

- [ ] 29/06/2021 https://mediapipe.dev/