Bodies as Instruments // ICM W9

Nov 9, 2024

Introduction

Before we begin, here's John Henry popping off:


For this week's assignment to create an experience around the body, I found myself drawn to dance, especially after spending time with dancers at ITP. There was something captivating about watching them move with such freedom and expression - it made me think about my own relationship with movement and the challenges I've faced feeling comfortable in my own body. I wanted to build something that could help others connect with their physical selves in a gentle, encouraging way.

The technical foundation came together using ml5.js for pose tracking, combined with p5.js and MIDI capabilities. At its heart, though, it's about creating a real-time conversation between body and sound, with each movement generating audio feedback that feels natural and responsive. When you move, the system responds with sounds that acknowledge and support your exploration, creating a kind of safe space to experiment with movement.

I deliberately kept the instructions light - just enough to spark interest without being prescriptive. The whole point is for people to feel free to play and discover, focusing on the sounds they're creating rather than getting caught up in self-judgment about how they look or move. My hope is that people can get lost in this dialogue between movement and sound, finding their own way to express themselves physically in a space that feels supportive and playful.

An artefact visualizing the movement-to-sound connection, layering dancers' body pose points over a photo with a musical stave.

Wrist as Instrument

Approach

Drawing from my observations of dancers, the core concept uses pose detection to transform physical gestures into musical elements - essentially turning the body into an expressive instrument. I chose the pentatonic scale (C4, D4, E4, G4, A4) specifically because these notes naturally harmonize, allowing users to create pleasant sounds regardless of their movement patterns. This was crucial for maintaining a supportive, judgment-free space for exploration.

Concept sketch used to begin experimenting with the idea
Pseudo-code:
INITIALIZE SYSTEM:
    Setup video capture and ML5
    Initialize audio system:
        Create polyphonic synthesizer
        Set up pentatonic scale [C4, D4, E4, G4, A4]
        Add reverb effect (6 s reverb time, decay rate 2)

MAIN INTERACTION LOOP:
    Track body poses
    IF pose detected AND confidence > 30%:
        MAP left wrist position:
            Y-axis note selection
            Track speed trigger threshold
        
        MAP right wrist position:
            Y-axis volume control (0-1)
            Track speed trigger threshold
        
        IF movement speed > threshold:
            Generate corresponding note
            Apply mapped volume
            Add reverb effect

    UPDATE visualization:
        Mirror camera feed
        Draw skeleton overlay
        Show interaction points
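
To ground the pseudo-code above, here is a minimal p5.js sketch of the core wrist-to-sound mapping. It assumes ml5.js (v1, MoveNet bodyPose) and the p5.sound library are loaded via script tags; the thresholds, variable names, and the reverb routing are illustrative guesses rather than the exact prototype code.

// Minimal sketch: left wrist height picks a pentatonic note, right wrist height sets volume.
// Assumes ml5.js v1 (bodyPose / MoveNet keypoint names) and p5.sound are loaded.
let video, bodyPose, poses = [];
let synth, reverb;
const SCALE = ['C4', 'D4', 'E4', 'G4', 'A4'];
let prevLeftY = null;

function preload() {
  bodyPose = ml5.bodyPose();            // MoveNet model by default
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  bodyPose.detectStart(video, results => { poses = results; });

  synth = new p5.PolySynth();
  reverb = new p5.Reverb();
  reverb.process(synth, 6, 2);          // ~6 s reverb, decay rate 2 (assumes the synth can be routed this way)
  userStartAudio();                     // audio starts after the first user gesture
}

function draw() {
  // Mirror the camera so movement feels natural
  push(); translate(width, 0); scale(-1, 1); image(video, 0, 0); pop();
  if (poses.length === 0) return;

  const kp = poses[0].keypoints;
  const leftWrist  = kp.find(k => k.name === 'left_wrist');
  const rightWrist = kp.find(k => k.name === 'right_wrist');
  if (!leftWrist || !rightWrist || leftWrist.confidence < 0.3) return;

  // Left wrist Y -> note index (higher hand = higher note)
  const noteIdx = floor(map(leftWrist.y, height, 0, 0, SCALE.length - 1, true));
  // Right wrist Y -> volume 0..1
  const vol = map(rightWrist.y, height, 0, 0, 1, true);

  // Only trigger a note when the wrist is moving fast enough
  if (prevLeftY !== null && abs(leftWrist.y - prevLeftY) > 15) {
    synth.play(SCALE[noteIdx], vol, 0, 0.5);
  }
  prevLeftY = leftWrist.y;
}

The speed check at the end is what keeps a still body silent: notes only fire while the wrist is actually in motion.
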
First implementation testing:


The current implementation, though functional as a proof-of-concept, revealed key limitations in how we can meaningfully translate human movement into digital expression. These limitations, ranging from technical constraints to questions of accessibility, point towards exciting future directions for the project.

Current Key Limitations:

  1. The system only "sees" basic wrist positions, missing the full poetry of human movement.

  2. It fails to capture the quality or emotion behind movements (gentle vs. strong, fluid vs. sharp).

  3. Mapping height to pitch oversimplifies movement, while a fixed scale and single synth limit creative depth and prevent sustained musical expression.

  4. Limited audio feedback reduces interaction immediacy, minimal visuals obscure the movement-sound connection, and the system lacks adaptive response to movement patterns.


Addressing Constraints

In the previous section, I outlined the key limitations of the initial implementation. These pointed towards clear directions for improvement, so I set out to address them iteratively.

In the final interactive soundscape experience, participants are assigned musical roles in the order they're detected. The first person detected becomes the Beat/Rhythm Creator (Red), the second the Bassline Controller (Blue), and the third the Melody Weaver (Green). When someone leaves the frame, the system monitors their absence: if they return shortly, their role is restored; if they're gone longer, they're treated as a new detection and assigned a fresh role. New participants automatically receive the next available role, and once all roles are filled they become passive observers (White). Roles are independent of position, and each person has a unique ID. The soundscape adapts to the number of participants: a beat solo for one, beat and bass for two, and the full experience with three.

Development Stages:

Stage 1: Base Structure & Role Assignment
- Set up global variables and role management
- Implement colored skeleton visualization
- Basic person tracking and role assignment
- Test adding/removing people

Stage 2: Individual Role Mechanics
- Implement Beat Creator mechanics
- Test and refine
- Add Bassline Controller
- Test interaction between roles
- Add Melody Weaver
- Test all three roles

Stage 3: Musical Integration
- Implement sound generation for each role
- Test sound interactions
- Refine movement-to-sound mappings

Stage 4: Synchronization
- Add movement synchronization detection
- Implement musical responses to sync

Testing Strategy:

  • Test each role individually first

  • Print debug information to console

  • Verify role assignments are working

  • Check movement detection accuracy

  • Test sound generation

  • Test multi-person interactions

I took this systematic, step-by-step approach to building the system so that complexity could grow gradually without introducing errors, making sure each piece of functionality worked before adding the next:

Role Management Tracking

In this first stage of iterative development, I focused on building a foundational system for assigning dynamic roles to participants based on the order in which they are detected. Here's how it works:

Role Assignment:

When someone enters the frame, they are assigned a role based on detection order:

  • The first person becomes the Beat/Rhythm Creator (Red skeleton).

  • The second person is the Bassline Controller (Blue skeleton).

  • The third person takes on the role of the Melody Weaver (Green skeleton).

Handling Entry & Exit:

  • If someone steps out, their role is "reserved" briefly, allowing for a seamless return. If they're gone for longer, their spot becomes available for a new person.

  • When a new person joins, they automatically take the next available role. If all roles are filled, they're assigned a passive "observer" status (White skeleton).

Core Features:

  • Position-Agnostic Roles: Roles aren't tied to screen locations; they're dynamically assigned based on detection order.

  • Unique Identifiers: Each participant receives a unique ID based on when they were detected.

  • Adaptable Soundscape: The system adjusts gracefully to the number of performers:

    • One performer creates a solo beat.

    • Two performers add bass, and three unlock the full experience.

To test this setup, I ran the code to confirm that role assignments work as expected:

  • Observed console logs for correct role assignment.

  • Checked the visual color cues on each skeleton to ensure they matched assigned roles.

  • Simulated people stepping in and out of the frame to confirm that roles reassigned correctly.

This base structure lays the groundwork for adding more interactivity in future stages.

// Setup
Initialize global variables: video, bodyPose, poses array
Define role definitions and role color constants
Initialize performer tracking variables: performers map, lastPerformerID

function setup()
    Create canvas and video capture
    Initialize bodyPose with ml5
    Log setup complete

function modelLoaded()
    Log pose detection model loaded
    Start bodyPose detection with gotPoses callback

function gotPoses(results)
    Update poses array with new results
    Call updatePerformers function with the new poses

function updatePerformers(detectedPoses)
    Create a new set to track seen performers
    For each detected pose:
        Identify the performer associated with the pose
        If no performer is found, assign a new one
        If a performer is found, update their data
    Check for any performers that have left and remove them
    Reassign roles if needed
    Log current performers and their roles

function assignNewPerformer(pose)
    Generate a new performer ID
    Determine the role for the new performer based on the current count
    If a role is available, add the new performer to the map and return the ID
    Else, return null

function determineRole(currentPerformerCount)
    Based on the current number of performers, return the appropriate role

function identifyPerformer(pose)
    Find the best matching performer from the existing ones
    Return the performer ID if a match is found, else return null

function calculatePoseDistance(pose1, pose2)
    Calculate the distance between the center points of the two poses

function getCenterPoint(pose)
    Calculate the center point of the pose based on the shoulder keypoints

function updatePerformerData(performerId, newPose)
    Update the last pose and last seen timestamp for the given performer

function checkForLeftPerformers(seenPerformers)
    For each performer in the map:
        If the performer was not seen in the latest update and has been gone for too long, remove them from the map and reassign roles

function reassignRoles()
    Sort the performers by their ID
    Update the role for each performer based on their new index

function draw()
    Draw the mirrored video feed
    If there are poses:
        For each pose:
            Identify the performer associated with the pose
            Get the role color for the performer
            Draw the pose keypoints and skeleton using the role color
            If a performer is identified, display their role above their head
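
For reference, the role-management pseudo-code above might translate into JavaScript roughly as follows. The matching distance, leave timeout, role names, and keypoint names are assumptions rather than the prototype's exact values; p5's dist() and millis() stand in for the distance and timing utilities.

// Illustrative JavaScript for the role-management logic above.
const ROLES = ['beat', 'bass', 'melody'];
const ROLE_COLORS = { beat: [255, 0, 0], bass: [0, 0, 255], melody: [0, 255, 0], observer: [255, 255, 255] };
const MATCH_DIST = 100;      // px: max distance to treat a pose as the same person
const LEAVE_TIMEOUT = 2000;  // ms: how long a role stays "reserved" after someone leaves

let performers = new Map();  // id -> { role, lastPose, lastSeen }
let lastPerformerID = 0;

function getCenterPoint(pose) {
  const ls = pose.keypoints.find(k => k.name === 'left_shoulder');
  const rs = pose.keypoints.find(k => k.name === 'right_shoulder');
  return { x: (ls.x + rs.x) / 2, y: (ls.y + rs.y) / 2 };
}

function identifyPerformer(pose) {
  let bestId = null, bestDist = MATCH_DIST;
  const c = getCenterPoint(pose);
  for (const [id, p] of performers) {
    const pc = getCenterPoint(p.lastPose);
    const d = dist(c.x, c.y, pc.x, pc.y);
    if (d < bestDist) { bestDist = d; bestId = id; }
  }
  return bestId;                       // null means "new person"
}

function determineRole(count) {
  return ROLES[count] || 'observer';   // fourth person onward becomes a passive observer
}

function assignNewPerformer(pose) {
  const id = ++lastPerformerID;
  performers.set(id, { role: determineRole(performers.size), lastPose: pose, lastSeen: millis() });
  return id;
}

function updatePerformers(detectedPoses) {
  const seen = new Set();
  for (const pose of detectedPoses) {
    let id = identifyPerformer(pose);
    if (id === null) id = assignNewPerformer(pose);
    const p = performers.get(id);
    p.lastPose = pose;
    p.lastSeen = millis();
    seen.add(id);
  }
  // Remove performers who have been gone longer than the reserve window
  for (const [id, p] of performers) {
    if (!seen.has(id) && millis() - p.lastSeen > LEAVE_TIMEOUT) performers.delete(id);
  }
  // Reassign roles by detection order so the earliest remaining person keeps "beat"
  [...performers.entries()]
    .sort((a, b) => a[0] - b[0])
    .forEach(([, p], i) => { p.role = determineRole(i); });
}

ROLE_COLORS is what the draw loop would look up to tint each skeleton according to the performer's assigned role.
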

Beat Tracker

For the Beat/Rhythm Creator (version 2.1), my goal was to create a more expressive and responsive drum/rhythm system that could be controlled through natural body movements. I wanted the Beat Creator to be able to generate a steady beat, add accent rhythms, and create variations - all controlled by the performer's vertical bouncing, arm swings, and hip movements.

The key movements I aimed to detect and map to musical elements were:

  1. Vertical Bouncing/Stepping:

    • This would serve as the core tempo, creating the main beat.

    • The intensity of the vertical movement would affect the volume of the drum sounds.

  2. Arm Swings:

    • Faster arm swings would trigger accent beats, adding more percussive elements.

    • The speed and size of the arm movements would control the intensity of the accents.

  3. Hip Movements:

    • Side-to-side hip sway would create rhythmic variations, adding more complexity to the beat.

    • The magnitude of the hip movements would alter the character of the rhythm patterns.

Pseudo-code for the Beat Creator Implementation:

// Initialize drum synths
setupBeatCreator()

// Update Beat Creator based on performer's movements
updateBeatCreator(pose, performer)
    if vertical movement > threshold and time since last beat > minimum interval:
        playBeat(performer.role)
        update last beat time and current beat
    if arm swing speed > threshold:
        playAccent(performer.role)
    if hip sway > threshold:
        playVariation(performer.role)

// Drum sound generation functions
playBeat(role):
    play kick drum sound

playAccent(role):
    play snare drum sound

playVariation(role):
    play hi-hat sound
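
To make the movement mapping concrete, here is a rough JavaScript version of the three triggers, assuming MoveNet keypoint names and hypothetical playKick/playSnare/playHat helpers that wrap drum samples or synth hits; all thresholds are placeholders to be tuned.

// Illustrative movement detection for the Beat Creator (v2.1).
// Assumes hip and wrist keypoints are present in the pose.
const BEAT_MOVE_THRESHOLD = 20;    // px/frame of vertical hip movement
const ACCENT_SPEED_THRESHOLD = 35; // px/frame of wrist movement
const SWAY_THRESHOLD = 25;         // px/frame of side-to-side hip movement
const MIN_BEAT_INTERVAL = 250;     // ms between main beats

let lastBeatTime = 0;
let prevKeypoints = null;

function updateBeatCreator(pose) {
  const kp = Object.fromEntries(pose.keypoints.map(k => [k.name, k]));
  if (!prevKeypoints) { prevKeypoints = kp; return; }

  // Vertical bounce: average hip Y velocity drives the main beat
  const hipY = (kp.left_hip.y + kp.right_hip.y) / 2;
  const prevHipY = (prevKeypoints.left_hip.y + prevKeypoints.right_hip.y) / 2;
  const bounce = abs(hipY - prevHipY);
  if (bounce > BEAT_MOVE_THRESHOLD && millis() - lastBeatTime > MIN_BEAT_INTERVAL) {
    playKick(map(bounce, BEAT_MOVE_THRESHOLD, 100, 0.5, 1, true)); // bigger bounce = louder kick
    lastBeatTime = millis();
  }

  // Arm swing: fast wrist motion adds snare accents
  const swing = dist(kp.left_wrist.x, kp.left_wrist.y,
                     prevKeypoints.left_wrist.x, prevKeypoints.left_wrist.y);
  if (swing > ACCENT_SPEED_THRESHOLD) playSnare();

  // Hip sway: side-to-side motion adds hi-hat variations
  const sway = abs(kp.left_hip.x - prevKeypoints.left_hip.x);
  if (sway > SWAY_THRESHOLD) playHat();

  prevKeypoints = kp;
}
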

The video above demonstrates the Beat Creator in action, showcasing how the performer's vertical bouncing, arm swings, and hip movements are translated into a responsive and expressive drum rhythm. This initial implementation lays the groundwork for the next stage, where we'll add the Bass Controller and Melody Weaver to create a more complete musical experience.

Beat Tracker + Bass

In the previous version (v2.1), we created a more expressive and responsive drum/rhythm system for the Beat Creator, allowing them to generate a steady beat, add accent rhythms, and create variations through vertical bouncing, arm swings, and hip movements.

We also introduced the Bass Controller role, where the performer could control bass notes and effects using knee movements and torso rotation.

For this version (v2.2), the key focus is on creating a more structured and cohesive musical experience by introducing the concept of predefined rhythmic patterns and basslines. Rather than generating random notes and rhythms, the performers will now control the selection and variation of these pre-composed musical elements, allowing for a more harmonious and musically coherent output.

Pseudo-code for Beat Creator and Bass Controller (v2.2):

// Define predefined beat and bass patterns
DEFINE BEAT_PATTERNS as array of drum patterns
DEFINE BASS_PATTERNS as array of bass patterns

// Set up tempo control variables
DEFINE tempo-related constants (MIN_BPM, MAX_BPM, DEFAULT_BPM, CHANGE_RATE)
INITIALIZE current BPM to DEFAULT_BPM

// Initialize state variables for beat and bass
INITIALIZE current beat pattern index
INITIALIZE current bass pattern index
INITIALIZE last beat time
INITIALIZE last bass time
INITIALIZE current beat position within pattern

// Update Beat Creator based on performer movements
FUNCTION updateBeatCreator(pose, performer):
    IF performer is the Beat Creator:
        CALCULATE vertical movement of performer
        MAP vertical movement to target BPM within MIN_BPM and MAX_BPM range
        SMOOTHLY TRANSITION current BPM towards target BPM
        CALCULATE beat interval based on current BPM
        
        IF time since last beat is greater than beat interval:
            PLAY current beat in the selected beat pattern
            UPDATE last beat time
            INCREMENT current beat position within pattern

// Update Bass Controller based on performer movements
FUNCTION updateBassController(pose, performer):
    IF performer is the Bass Controller:
        CALCULATE average position of performer's knees
        CALCULATE knee movement magnitude
        
        IF knee movement is greater than threshold and time since last bass is greater than beat interval:
            DETERMINE new bass pattern index based on knee position
            IF new index is different from current bass pattern index:
                UPDATE current bass pattern index
                PLAY the new bass pattern
                UPDATE last bass time

// Play beat pattern
FUNCTION playBeatPattern(beatIndex):
    RETRIEVE current beat pattern
    IF pattern has a kick drum sound at the current beat index:
        PLAY kick drum sound
    ELSE IF pattern has a snare drum sound at the current beat index:
        PLAY snare drum sound
    ELSE IF pattern has a hi-hat sound at the current beat index:
        PLAY hi-hat sound

// Play bass pattern
FUNCTION playBassPattern(patternIndex):
    RETRIEVE current bass pattern
    FOR each note in the pattern:
        PLAY the corresponding bass note
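
As one concrete reading of this pseudo-code, the pattern data and sequencing step might look like the sketch below; the pattern contents, BPM range, and the playKick/playSnare/playHat/playBassNote helpers are assumptions rather than the prototype's actual values.

// Illustrative pattern data and step sequencing for v2.2.
const BEAT_PATTERNS = [
  ['kick', 'hat', 'snare', 'hat'],                                   // basic backbeat
  ['kick', 'kick', 'snare', 'hat', 'kick', 'hat', 'snare', 'hat'],   // busier variation
];
const BASS_PATTERNS = [
  ['C2', 'C2', 'G2', 'A2'],
  ['C2', 'E2', 'G2', 'E2'],
];

const MIN_BPM = 70, MAX_BPM = 140, DEFAULT_BPM = 100, BPM_CHANGE_RATE = 0.1;
let currentBPM = DEFAULT_BPM;
let beatPatternIndex = 0, bassPatternIndex = 0;
let beatStep = 0, lastBeatTime = 0;

// Beat Creator: vertical movement nudges the tempo, time drives the steps
function updateBeatCreator(verticalMovement) {
  const targetBPM = map(verticalMovement, 0, 100, MIN_BPM, MAX_BPM, true);
  currentBPM = lerp(currentBPM, targetBPM, BPM_CHANGE_RATE); // smooth tempo transition
  const beatInterval = 60000 / currentBPM;                   // ms per beat

  if (millis() - lastBeatTime > beatInterval) {
    const pattern = BEAT_PATTERNS[beatPatternIndex];
    const hit = pattern[beatStep % pattern.length];
    if (hit === 'kick') playKick();
    else if (hit === 'snare') playSnare();
    else if (hit === 'hat') playHat();
    lastBeatTime = millis();
    beatStep++;
  }
}

// Bass Controller: knee height picks which pre-composed bassline is active
function updateBassController(kneeY) {
  const newIndex = constrain(floor(map(kneeY, 0, height, 0, BASS_PATTERNS.length, true)),
                             0, BASS_PATTERNS.length - 1);
  if (newIndex !== bassPatternIndex) {
    bassPatternIndex = newIndex;
    BASS_PATTERNS[bassPatternIndex].forEach((note, i) => playBassNote(note, i)); // i could schedule notes within the bar
  }
}

Keeping the patterns as short arrays also makes it easy to grow the pattern library later, which is one of the next steps listed at the end of this post.
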

In this version, the main changes are:

  1. Introducing predefined beat and bass patterns that are stored in separate arrays (BEAT_PATTERNS and BASS_PATTERNS).

  2. Implementing a tempo control system that allows the Beat Creator to adjust the overall BPM based on their vertical movement.

  3. Modifying the Beat Creator to play the current beat in the selected pattern, rather than generating random beats.

  4. Allowing the Bass Controller to select different bass patterns based on their knee movement, creating more harmonious basslines.

The video demonstration for this version would show the performers using their movements to control the tempo, select the beat and bass patterns, and create a more structured and cohesive musical experience.

This updated implementation focuses on creating a more musically coherent system, where the performers can collaborate to build a harmonious composition by controlling the selection and variation of pre-composed rhythmic and bass elements.

Beat Tracker + Bass + Synth

In the previous version (v2.2), we focused on creating a more structured and cohesive musical experience by introducing predefined beat and bass patterns. The Beat Creator could control the tempo and select the beat patterns, while the Bass Controller could choose different bass patterns to accompany the rhythms.

For this version (v2.3), we're introducing the Melody Weaver as the third performer. The goal is to create a collaborative, layered musical experience where each performer contributes a distinct element to the overall composition.

The Melody Weaver will use hand positions to control the melody, with the left hand selecting notes from a pentatonic scale and the right hand adjusting the octave range. The distance between the hands will control the volume of the melody.

By having all three performers work together, the system will create a more harmonious and expressive musical output, where the individual contributions blend into a cohesive whole.

Pseudo-code for Beat Creator, Bass Controller, and Melody Weaver (v2.3):

// Pentatonic scale definition
const MELODY_SCALE = ['C4', 'D4', 'E4', 'G4', 'A4'];

// Tempo control
updateTempo(handHeight):
    map hand height to desired BPM within MIN_BPM and MAX_BPM range
    smoothly transition current BPM towards target BPM
    return beat interval in milliseconds

// Beat Creator
updateBeatCreator(pose, performer):
    if performer is the Beat Creator:
        calculate vertical movement of performer
        update tempo based on vertical movement
        get current beat interval
        
        if time since last beat is greater than beat interval:
            play current beat in the selected beat pattern
            update last beat time
            increment current beat position within pattern

// Bass Controller
updateBassController(pose, performer):
    if performer is the Bass Controller:
        calculate average position of performer's knees
        calculate knee movement magnitude
        
        if knee movement is greater than threshold and time since last bass is greater than beat interval:
            determine new bass pattern index based on knee position
            if new index is different from current bass pattern index:
                update current bass pattern index
                play the new bass pattern
                update last bass time

// Melody Weaver
updateMelodyWeaver(pose, performer):
    if performer is the Melody Weaver:
        get left and right wrist positions
        
        // Note selection
        calculate note index based on left wrist X-position
        
        // Octave control
        calculate octave shift based on right wrist Y-position
        
        // Volume control
        calculate distance between hands
        map hand distance to volume
        
        if time since last melody note is greater than minimum interval:
            play melody note with calculated pitch and volume
            update last melody time
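
Here is one possible JavaScript reading of the Melody Weaver mapping using p5.PolySynth; the octave range, minimum interval, and the way wrist keypoints are passed in are illustrative.

// Illustrative Melody Weaver: left wrist X picks the note, right wrist Y picks
// the octave, and the distance between hands sets the volume.
const MELODY_SCALE = ['C', 'D', 'E', 'G', 'A'];   // pentatonic; octave appended below
const MIN_NOTE_INTERVAL = 200;                    // ms between melody notes
let melodySynth, lastMelodyTime = 0;

function setupMelodyWeaver() {
  melodySynth = new p5.PolySynth();
}

function updateMelodyWeaver(leftWrist, rightWrist) {
  // Note selection: left wrist X across the frame -> scale degree
  const degree = constrain(floor(map(leftWrist.x, 0, width, 0, MELODY_SCALE.length, true)),
                           0, MELODY_SCALE.length - 1);

  // Octave control: right wrist height -> octave 3..5
  const octave = constrain(floor(map(rightWrist.y, height, 0, 3, 6, true)), 3, 5);

  // Volume control: hands far apart -> louder
  const handDist = dist(leftWrist.x, leftWrist.y, rightWrist.x, rightWrist.y);
  const vol = map(handDist, 0, width, 0.1, 1, true);

  if (millis() - lastMelodyTime > MIN_NOTE_INTERVAL) {
    melodySynth.play(MELODY_SCALE[degree] + octave, vol, 0, 0.3); // e.g. "E4"
    lastMelodyTime = millis();
  }
}
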

The key aspects of this implementation are:

  • Introducing the Melody Weaver as the third performer, responsible for controlling the melodic elements.

  • Using hand positions to select notes, adjust the octave, and control the volume of the melody.

  • Maintaining the tempo control system for the Beat Creator and the pattern-based bassline for the Bass Controller.

  • Coordinating the timing and interactions between the three performers to create a cohesive musical experience.

The video above demonstrates the complete three-performer system in action. The Beat Creator controls the tempo and rhythm, the Bass Controller selects and modifies the bassline, and the Melody Weaver weaves the melodic elements into the composition. This version builds upon the previous implementations, creating a more comprehensive and collaborative musical experience where each performer has a distinct role and contributes to the overall soundscape.

Body as Instrument V3.0

In the previous version (v2.3), I introduced the complete three-performer system, where the Beat Creator, Bass Controller, and Melody Weaver each contributed distinct musical elements to the overall composition. The performers used their body movements to control the tempo, select and modify rhythmic and bass patterns, and generate a melodic line.

For this version (v3.0), I've made several key improvements to the system based on user feedback and my own observations:

  1. Improved Sound Design and Processing: I've enhanced the sound synthesis and signal processing for a more polished and impactful audio experience. This includes using layered drum sounds, adding more resonance and character to the bass, and applying reverb and compression to create a cohesive, professional-sounding mix.

  2. Streamlined Movement Detection: I've refined the movement detection thresholds and parameters to make the system more responsive and intuitive for the performers. This includes reducing the minimum required movement for triggering sounds, as well as improving the smoothness and accuracy of the movement-to-sound mappings.

  3. Expanded Performer Interactions: I've introduced new ways for the performers to interact with each other and the overall musical output. This includes the ability to synchronize their movements, which triggers special sound effects and variations, as well as the option to "pass" musical elements between performers to create a more collaborative experience.

  4. Visual Feedback and Guidance: To enhance the overall user experience, I've added more visual cues and guidance for the performers. This includes displaying the current note/scale being played, showing the intensity of the performers' movements, and providing visual feedback when synchronization is achieved.

These improvements aim to create a more polished, responsive, and expressive movement-based music system that encourages deeper exploration and collaboration between the performers.
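
As a sketch of point 4 (visual feedback), the overlay could be drawn in p5.js roughly like this; the performer.color, performer.currentNote, and performer.intensity fields are assumed to be maintained elsewhere in the sketch.

// Illustrative visual feedback overlay: role label, current note, movement
// intensity bar, and a flash when performers synchronize.
function drawPerformerFeedback(performer, headX, headY, isSynced) {
  const [r, g, b] = performer.color;       // role color assigned earlier

  // Role label and current note above the performer's head
  noStroke();
  fill(r, g, b);
  textAlign(CENTER);
  textSize(16);
  text(`${performer.role}  ${performer.currentNote || ''}`, headX, headY - 30);

  // Movement intensity bar (0..1) under the label
  noFill();
  stroke(255);
  rect(headX - 40, headY - 20, 80, 6);
  noStroke();
  fill(r, g, b);
  rect(headX - 40, headY - 20, 80 * constrain(performer.intensity, 0, 1), 6);

  // Brief white frame flash when synchronization is detected
  if (isSynced) {
    noFill();
    stroke(255, 200);
    strokeWeight(8);
    rect(4, 4, width - 8, height - 8);
    strokeWeight(1);
  }
}
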

Pseudo-code for Beat Creator, Bass Controller, and Melody Weaver (v3.0):

// Tempo and Synchronization
DEFINE tempo-related constants (MIN_BPM, MAX_BPM, DEFAULT_BPM, CHANGE_RATE)
INITIALIZE current BPM to DEFAULT_BPM
DEFINE SYNC_THRESHOLD for movement synchronization

// Beat Creator
updateBeatCreator(pose, performer):
    IF performer is the Beat Creator:
        CALCULATE vertical movement of performer
        UPDATE tempo based on vertical movement
        GET current beat interval
        
        IF time since last beat is greater than beat interval:
            PLAY current beat in the selected beat pattern
            UPDATE last beat time
            INCREMENT current beat position within pattern
            
        IF performer's movements are synchronized with Bass/Melody:
            APPLY special beat variation
            TRIGGER synchronized sound effect
            
// Bass Controller
updateBassController(pose, performer):
    IF performer is the Bass Controller:
        CALCULATE average position of performer's knees
        CALCULATE knee movement magnitude
        
        IF knee movement is greater than threshold and time since last bass is greater than beat interval:
            DETERMINE new bass pattern index based on knee position
            IF new index is different from current bass pattern index:
                UPDATE current bass pattern index
                PLAY the new bass pattern
                UPDATE last bass time
                
        IF performer's movements are synchronized with Beat/Melody:
            APPLY special bass variation
            TRIGGER synchronized sound effect
            
// Melody Weaver
updateMelodyWeaver(pose, performer):
    IF performer is the Melody Weaver:
        GET left and right wrist positions
        
        // Note selection
        CALCULATE note index based on left wrist X-position
        
        // Octave control
        CALCULATE octave shift based on right wrist Y-position
        
        // Volume control
        CALCULATE distance between hands
        MAP hand distance to volume
        
        IF time since last melody note is greater than minimum interval:
            PLAY melody note with calculated pitch and volume
            UPDATE last melody time
            
        IF performer's movements are synchronized with Beat/Bass:
            APPLY special melody variation
            TRIGGER synchronized sound effect
            
// Synchronization Detection
FUNCTION detectSynchronization(beatPerformer, bassPerformer, melodyPerformer):
    CALCULATE movement similarity between performers
    IF movement similarity is greater than SYNC_THRESHOLD:
        RETURN true
    ELSE:
        RETURN false
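
One way detectSynchronization could work is to compare short histories of each performer's vertical velocity with a cosine-similarity score; the window length and SYNC_THRESHOLD here are placeholders.

// Illustrative synchronization detection: performers are "in sync" when their
// recent vertical movement traces are strongly correlated.
const SYNC_THRESHOLD = 0.8;   // similarity score above which sync triggers
const HISTORY_LEN = 30;       // frames of movement history to compare

// movementHistory: Map of performer id -> array of recent hip-Y velocities
function recordMovement(history, id, velocityY) {
  if (!history.has(id)) history.set(id, []);
  const h = history.get(id);
  h.push(velocityY);
  if (h.length > HISTORY_LEN) h.shift();
}

function movementSimilarity(a, b) {
  const n = min(a.length, b.length);
  if (n < HISTORY_LEN / 2) return 0;          // not enough data yet
  let dot = 0, magA = 0, magB = 0;
  for (let i = 0; i < n; i++) {
    dot += a[i] * b[i];
    magA += a[i] * a[i];
    magB += b[i] * b[i];
  }
  if (magA === 0 || magB === 0) return 0;
  return dot / (sqrt(magA) * sqrt(magB));     // cosine similarity, -1..1
}

function detectSynchronization(history, idA, idB) {
  return movementSimilarity(history.get(idA) || [], history.get(idB) || []) > SYNC_THRESHOLD;
}

A correlation-style score like this responds to matching rhythm rather than matching position, which is closer to the feeling of "moving together."
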

The key changes in this version include:

  1. Enhanced sound synthesis and processing for a more professional-sounding output.

  2. Reduced movement thresholds and improved responsiveness of the movement-to-sound mappings.

  3. New functionality for synchronizing movements between performers, triggering special effects and variations.

  4. Visual feedback and guidance elements to enhance the user experience.

The video above showcases the complete three-performer system in version 3.0, highlighting the improvements in sound quality, responsiveness, and the new synchronization features. The performers can now work together more closely to create a cohesive and expressive musical experience. (link to prototype)

Needs More Work

Through iterative development, I've created a more structured and cohesive musical experience for performers. The initial versions established a foundational system for assigning roles and implementing basic movement-to-sound mappings, but they generated sounds too freely, resulting in less predictable, less coherent output. To address this, later versions introduced a more structured approach, incorporating predefined rhythmic patterns and basslines that performers can control and modify.

By aligning musical elements to a common tempo and allowing performers to influence energy and variation, the system now supports a more harmonious, collaborative experience, and synchronization detection lets performers work closely together, triggering special effects when their movements line up. Even so, the sound design, processing, and movement-to-sound responsiveness still need further refinement for a truly polished, engaging output.

Next Steps:

  1. Expand the pattern library to allow greater musical diversity and evolving compositions.

  2. Implement chord progressions for the Melody Weaver, enabling more harmonically rich structures.

  3. Explore ensemble interactions, facilitating collaborative performances with synchronized, shared control.

  4. Incorporate generative music elements to introduce unpredictability and continuous transformation.

  5. Develop immersive visual elements that respond to performers' movements.

By continuing to refine and expand this movement-based musical system, I can create an even more engaging, expressive, and collaborative platform for performers and audiences to explore the intersection of physical movement and musical expression.

References
  1. Claude by Anthropic
  2. Ellen, for being such a great teacher
  3. p5.js Reference: https://p5js.org/reference
  4. p5.js MIDI Library
  5. ml5.js Documentation

©2019-2025 SURYA NARREDDI.

©2019-2025 SURYA NARREDDI.