Gesture Controlled Visuals

Creative Coding with p5.js & ml5
Client
All
Role
Designer / Coder
Year
2024

Hand Gesture Controlled Abstract Visuals

VIEW LIVE LINK

The integration of intuitive interfaces in digital media is a burgeoning field, enhancing user interaction through innovative methods. This project proposes the use of ml5.js, a user-friendly machine learning library built on TensorFlow.js, to create a real-time hand gesture recognition system. This system will control abstract visual elements, providing an immersive and interactive experience for users. The application will enable users to manipulate visuals on a screen through simple hand movements, bridging the gap between traditional input devices and gesture-based interaction.

Resources

Mr Bean regression

Handpose_Webcam


Final Project Progress Update Log

Log 1

  • Reference visuals found and modified
    • “noPersp01” by KomaTebe Link
    • “tw_moebius” by KomaTebe Link
  • Learned HandPose source code on ml5's website
  • Researched examples of Hand Pose Controls
    • Mr Bean regression
    • Handpose_Webcam
      • Event-Driven Programming: The use of the .on("predict", callback) method demonstrates an event-driven programming style. This method sets up an event listener that updates the predictions array every time new hand pose data is available.
      • Manipulation of Global Variables: The predictions variable is used globally to store the hand pose data from the handpose model.
      • Interfacing with Hardware: The example illustrates how to interface with hardware components (like a webcam) using createCapture(VIDEO) and process this input through a machine learning model. This is important for applications that require direct interaction with hardware sensors.
      • Graphics and Visualization: Drawing keypoints on the canvas using the ellipse function teaches basic techniques for visualizing data points.
      • User Interface Manipulation: The example hides the raw video feed and displays only the canvas (video.hide()), so users see just the rendered keypoints.
      • Asynchronous Loading and Initialization: The modelReady callback function, which logs "Model ready!", signals that model loading has completed.
    • Rock, Paper, Scissors with Handpose and KNN
      • Interactive Machine Learning: Users train the k-Nearest Neighbors (kNN) classifier by providing hand gestures as input, which the model uses to learn and classify gestures like rock, paper, and scissors.
      • Dynamic Data Collection: The example demonstrates how to collect data in real-time from the user. Users add training data by performing gestures in front of the webcam and pressing corresponding buttons.
      • Real-Time Prediction: The system classifies hand gestures on the fly and provides instant feedback by displaying the corresponding emoji.
      • User Feedback and Instructions: The application provides instructions if no hand is detected in the camera's view, guiding users for better interaction.
      • Visual Feedback: It visually represents the keypoints detected on the user's hand, helping users understand how the model interprets their gestures.
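The event-driven pattern noted in the Handpose_Webcam example can be sketched without a browser by stubbing ml5's emitter. The FakeHandpose class and the sample landmark values below are stand-ins for illustration only; in the real sketch the model comes from ml5.handpose(createCapture(VIDEO), modelReady) and fires "predict" events on its own.

```javascript
// Minimal sketch of the event-driven pattern: an object exposing
// .on("predict", callback), with a global predictions array updated
// whenever new hand pose data arrives.
class FakeHandpose {
  constructor() { this.listeners = {}; }
  on(event, cb) {
    if (!this.listeners[event]) this.listeners[event] = [];
    this.listeners[event].push(cb);
  }
  emit(event, data) {
    (this.listeners[event] || []).forEach(cb => cb(data));
  }
}

let predictions = []; // global, overwritten on every "predict" event

const handpose = new FakeHandpose();
handpose.on("predict", results => { predictions = results; });

// Pull drawable keypoints out of the latest predictions, mirroring the
// loop that draws an ellipse at each landmark in the original example.
function keypointsToDraw(preds) {
  return preds.flatMap(p => p.landmarks.map(([x, y]) => ({ x, y })));
}

// Simulate one frame of model output (two landmarks of one hand).
handpose.emit("predict", [{ landmarks: [[120, 80, 0], [130, 95, -2]] }]);
console.log(keypointsToDraw(predictions));
// → [ { x: 120, y: 80 }, { x: 130, y: 95 } ]
```

In the browser version, draw() would iterate over the same global predictions each frame, which is why the callback only stores the data rather than drawing directly.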

Log 2

Attempt 1: MySketch v1
  • What I did
    • Added the default visual adapted from “tw_moebius” by KomaTebe Link
    • Added User Interface & style.css adapted from Rock, Paper, Scissors with Handpose and KNN
    • Added a Zoom in & out function - IT WORKED! - I tweaked it a couple of times but didn't log the changes
  • My Next Steps
    • Add in the other visual
    • Figure out a way to make the transition smooth between the two
    • Tweak the two to make the background color / lighting consistent
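The log doesn't show how the zoom gesture is implemented, but one plausible version maps the thumb-tip to index-tip distance onto a scale factor. The function names, distance bounds, and zoom range below are all assumptions; only the landmark indices (4 = thumb tip, 8 = index tip) follow handpose's 21-point layout.

```javascript
// Hypothetical pinch-to-zoom: distance between thumb tip (landmark 4)
// and index tip (landmark 8) drives the zoom factor.
function pinchDistance(landmarks) {
  const [x1, y1] = landmarks[4];
  const [x2, y2] = landmarks[8];
  return Math.hypot(x2 - x1, y2 - y1);
}

// Linearly map a pixel-distance range onto a zoom range, clamped to [0, 1]
// before scaling, so extreme hand positions don't blow up the visual.
function zoomFromPinch(landmarks, minD = 20, maxD = 200, minZ = 0.5, maxZ = 3) {
  const d = pinchDistance(landmarks);
  const t = Math.min(1, Math.max(0, (d - minD) / (maxD - minD)));
  return minZ + t * (maxZ - minZ);
}

// A wide-open pinch zooms all the way in.
const open = Array.from({ length: 21 }, () => [0, 0, 0]);
open[8] = [200, 0, 0]; // index tip 200px from thumb tip
console.log(zoomFromPinch(open)); // → 3
```

In a p5 sketch the return value would feed scale() inside draw(); the actual project may use a different gesture and mapping entirely.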
Log 3

Attempt 2: MySketch v2

  • What I did
    • Added the NEW visual adapted from “noPersp01” by KomaTebe Link
    • Added a FULL SCREEN button - but the fonts are not consistent in this version, so I changed it in the style sheet in my next version
    • Tried to use a gesture to draw the new visual, but failed.
      • I tried to use the finger-pointing-up gesture to draw the new visuals, but it did not work, so I changed it in my next iteration.
  • My Next Steps
    • Try to figure out how to draw the new visuals
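The log says the pointing-up gesture didn't work out, but for reference, a detector for it might look like the sketch below: the index tip sits well above its knuckle while the other fingertips don't. Everything here is a guess at an approach, not the project's code; the margin and helper names are made up, and only the landmark indices follow handpose's 21-point layout (tip/knuckle pairs 8/5, 12/9, 16/13, 20/17).

```javascript
// Hypothetical "index finger pointing up" check. Canvas y grows downward,
// so "above" means a smaller y value than the knuckle.
function isIndexPointingUp(lm, margin = 30) {
  const up = (tip, base) => lm[base][1] - lm[tip][1] > margin;
  const indexUp = up(8, 5);
  const othersDown = ![[12, 9], [16, 13], [20, 17]].some(([t, b]) => up(t, b));
  return indexUp && othersDown;
}

// Example: only the index tip is raised above its knuckle.
const lm = Array.from({ length: 21 }, () => [0, 100, 0]);
lm[8] = [0, 20, 0];
console.log(isIndexPointingUp(lm)); // → true
```

A check like this is sensitive to hand rotation and distance from the camera, which may be part of why the gesture proved unreliable in practice.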
Log 4
  • I did not save the TOO MANY attempts, but I did the following
    • I switched to the Scissor gesture to draw the new visuals. It works a bit better, but sometimes when I was just trying to perform the zoom in and out function it would trigger the draw-new-visual function
    • I fixed the fullscreen button's UI design so that it matches the other one and is easier on my (and users') eyes
    • I changed the background color and lighting settings to make the two visuals consistent, so that the transition between the two does not look glitchy and unpleasant
    • I also added a window resize function for the fullscreen feature
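A scissors detector of the kind this log describes might check that the index and middle tips are extended while ring and pinky are not - a sketch under the same assumptions as before (handpose's tip/knuckle index pairs are real; the margin and function name are invented):

```javascript
// Hypothetical "scissors" detector: index (8) and middle (12) tips extended
// above their knuckles (5, 9), ring (16) and pinky (20) folded near theirs
// (13, 17). Canvas y grows downward, so extended means a smaller y.
function isScissors(lm, margin = 30) {
  const extended = (tip, base) => lm[base][1] - lm[tip][1] > margin;
  return extended(8, 5) && extended(12, 9) &&
         !extended(16, 13) && !extended(20, 17);
}

// Example: index and middle raised, ring and pinky at knuckle height.
const lm = Array.from({ length: 21 }, () => [0, 100, 0]);
lm[8] = [0, 20, 0];
lm[12] = [0, 20, 0];
console.log(isScissors(lm)); // → true
```

The conflict with zoom noted above is a classic problem with overlapping gesture vocabularies; one common mitigation is to make gestures mutually exclusive per frame (if scissors fires, skip the zoom update that frame). For the window-resize point, p5's windowResized() callback calling resizeCanvas(windowWidth, windowHeight) is the usual pattern.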

Log 5

Attempt No.I CANNOT COUNT: MySketch v3
  • What I did
    • The draw-new-visual function worked this time - I also ran into a bug where I had to keep holding the gesture for the new visual to stay on screen without switching back; I realized the problem was in the logic, so I fixed it.
    • The Zoom in and out function works pretty well this time.
  • My Next Steps
    • I want it to switch back to the default visual with another gesture.
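The "had to keep holding the gesture" bug suggests the mode was being recomputed from the current gesture every frame. The usual fix, and presumably something like what this log describes, is to latch state: a detected gesture flips a persistent mode that stays until another gesture changes it. The gesture names and API below are placeholders, not the project's actual code.

```javascript
// Latching state machine: the mode persists after the gesture ends, instead
// of reverting the moment the hand relaxes.
function makeVisualState(initial = "default") {
  let mode = initial;
  return {
    update(gesture) {
      if (gesture === "drawNew") mode = "new";      // latch the new visual
      else if (gesture === "reset") mode = "default"; // explicit switch back
      return mode;                                    // no gesture: keep mode
    },
    get mode() { return mode; },
  };
}

const state = makeVisualState();
state.update("drawNew");
state.update(null);      // hand relaxed: visual stays latched
console.log(state.mode); // → "new"
```

This also sets up the next step naturally: switching back to the default visual is just a second latched gesture ("reset" here).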

Log 6

My Final Iteration!! Kara_Final

  • What I did
    • It took me a few attempts to figure out what sets of gestures I should use for drawing the new visual and the default visual without the two affecting each other's performance - I finally landed on the palm up and down gestures, using the orientation of the index finger and palm to detect them.
      • I also tried thumb up and down gestures, but they interfered with the zoom in and out function, so I gave up on this.
      • I also tried adding a threshold for how many frames a gesture should be held before the function draws, but it didn't work very well, so I did not move forward with this either.
    • I changed the instructions to make them clearer to the users.
    • Also other small tweaks to make it more fun and pleasant. AND there goes my final project!!
  • What I wish to do after the class ends
    • I feel like there is more potential in hand pose controls than what I am doing right now (of course), and I wish to dig into it more.
    • Keep iterating on this to make it smoother, add more visuals, and try other ways to switch between or play with the visuals and animations.
    • Keep learning ml5 - it's fun.
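The palm up/down classifier the final version settled on could plausibly be built from the vector between the wrist and the middle-finger knuckle - but that is an assumption standing in for "orientation of the index finger and palm" from the log, not the project's exact rule. Only the landmark indices (0 = wrist, 9 = middle-finger knuckle) come from handpose's layout; the margin and function name are invented.

```javascript
// Hedged palm up/down classifier: if the knuckles sit well above the wrist
// (smaller y, since canvas y grows downward), call the palm "up".
function palmDirection(lm, margin = 20) {
  const dy = lm[0][1] - lm[9][1]; // positive when knuckles are above wrist
  if (dy > margin) return "up";
  if (dy < -margin) return "down";
  return "neutral";
}

const lm = Array.from({ length: 21 }, () => [0, 0, 0]);
lm[0] = [0, 200, 0]; // wrist near the bottom
lm[9] = [0, 100, 0]; // knuckles above it
console.log(palmDirection(lm)); // → "up"
```

The "neutral" band is one way to keep the two directions from fighting each other or the zoom gesture, the conflict the earlier thumb-based attempt ran into.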
