Using Brainwaves for creative output // PCOMP Midterm
Oct 20, 2024
Inspiration
Our project emerged from a curiosity about visualizing mental states and emotions. We were inspired by the notion of capturing a "portrait" of someone's mind, using technology to convert intangible feelings into a tangible, visual form. The core concept was to create an artistic representation of a person's internal mental state using EEG sensor data from the Muse 2 headset.
Turning EEG into Art – Refik Anadol’s “Melting Memories”, Brain Wave Drawing Machine - Juanifico
Designing the Experience
In creating this project, the environment became a key factor. The Muse 2 headset requires a quiet and calm space to accurately capture EEG values, so we carefully designed the experience to ensure minimal noise or distraction. We wanted participants to be fully engaged with their thoughts during the process.
Experience Flow
A person's experience follows these steps:
The participant wears the Muse 2 headset.
They read a series of questions focused on their past, present, and future.
As they think about the questions, the headset records their EEG values for three minutes.
At the end of the session, the system automatically saves an image representation of their mental states.
When the participant presses a button, a printout is generated for them to take home.
This process allows participants to engage in deep, reflective thought while the EEG headset captures subtle shifts in their mental states. These variations in brain activity translate into a unique visual sketch.
Technical Flow
From a technical perspective, the process involves:
EEG data collection from the Muse 2 headset
Data processing in p5.js
State determination using ML5, trained on Jason's dataset
Sketch generation based on the determined mental state
PNG download triggered by an Automator script
Signal sent to the Arduino over serial
The Arduino communicates with the thermal printer
Thermal printer produces the final receipt
Through this design process, we learned that while EEG captures brainwave patterns, it was important to focus on mental states—such as calm, focus, or meditation—rather than the raw data. These states represent an abstraction from the waves themselves, providing a more meaningful and interpretable output for the participant.
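As a rough sketch of that abstraction step, the ML5 side reduces to a small classifier. The band names, labels, and dataset shape below are assumptions for illustration, not Jason's actual data:

```javascript
// Minimal ml5.js classifier: raw band powers in, a named mental state out.
const model = ml5.neuralNetwork({ task: 'classification' });

// Stand-ins for Jason's dataset; real samples would be loaded from a file.
const trainingData = [
  { alpha: 0.7, beta: 0.1, theta: 0.5, delta: 0.2, gamma: 0.1, label: 'calm' },
  { alpha: 0.2, beta: 0.8, theta: 0.2, delta: 0.1, gamma: 0.4, label: 'focus' },
];

trainingData.forEach((s) => {
  model.addData([s.alpha, s.beta, s.theta, s.delta, s.gamma], { state: s.label });
});

model.normalizeData();
model.train({ epochs: 50 }, () => {
  // Once trained, a live reading classifies into a state, not raw waves.
  model.classify([0.6, 0.2, 0.4, 0.1, 0.2], (err, results) => {
    if (!err) console.log(results[0].label, results[0].confidence);
  });
});
```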
The Receipt Concept
Dan B O'Sullivan, a faculty member we consulted, introduced us to an alternative concept: instead of simply producing a sketch, what if the printout resembled a receipt—a kind of "accounting" of the participant's emotions? This added a playful element, blending the quantification of feelings with the formal structure of a receipt. The use of a thermal receipt printer for this purpose added humor to the notion of "auditing" someone's feelings.
This printed artwork would act as a keepsake, allowing participants to reflect on their mental state through abstract art. The tangible nature of the receipt provides a unique way for participants to take a piece of their experience with them, encouraging further reflection and introspection beyond the initial session.
By designing the experience in this way, we aimed to create a seamless interaction between the participant's thoughts, the technology capturing those thoughts, and the final artistic representation. This approach allows for a deeply personal and introspective experience, while also producing a tangible artifact that encapsulates a moment in the participant's mental journey.
Coding the Experience
We based our project on Jason Snell's MuseML tool, which simplifies the process of interpreting brainwave data from the Muse 2 headset. However, our journey to successfully implement this tool was not without challenges.
Jason Snell's Initial Code and Sketch
Understanding EEG Data Processing
Initially, we struggled with capturing and interpreting the EEG values. To overcome this, we arranged a meeting with Jason, who explained the concept behind the whole process.
Initial meeting with Jason
Jason walked us through how raw brainwave values (beta, theta, etc.) are mapped and then converted into mental states. This understanding was crucial for us to move forward with the code implementation.
Code Implementation Challenges
After gaining a theoretical understanding, we began modifying the p5.js sketch. However, we encountered difficulties due to the multiple files within the sketch and variable naming conflicts. To address this, we requested a second code walkthrough with Jason.
During this session, Jason reviewed our code and helped us understand the interconnections between different parts of the sketch. To avoid variable conflicts, we adopted a naming convention that incorporated our names into the variables. For example:
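The pattern looked something like this (illustrative, not our exact variables):

```javascript
// Prefixing shared globals with our own names made ownership obvious
// and kept the sketch's multiple files from colliding.
let ankaiSymbolScale = 6;   // drawing parameter owned by the visualization file
let haorenStateLabel = '';  // latest ML5 result owned by the data file
```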
Second meeting with Jason where he helped us write the code and implement changes
Iterative Development with AI Assistance
To further refine our code and understand its nuances, we utilized Cursor, an AI-powered coding tool. This allowed us to provide context to the AI and receive real-time assistance as we iterated on the sketch. Through this process, we were able to resolve issues and finally get the sketch working as intended.
Final Implementation
Here's the pseudo-code for our final implementation:
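(This is a reconstruction from the flow described above; the helper names are illustrative, not our exact code.)

```javascript
// Outline of the main p5.js loop: read EEG, classify, plot, then save.
let stateHistory = [];
let currentState = 'unknown';
let lastSample = 0;
let saved = false;

function draw() {
  // 1. Latest band powers streamed from the Muse 2 (hypothetical helper)
  const bands = getMuseBands();

  // 2. Classify the current mental state with the trained ML5 model
  model.classify(Object.values(bands), (err, results) => {
    if (!err) currentState = results[0].label; // 'calm', 'focus', ...
  });

  // 3. Record one state sample every 6 seconds
  if (millis() - lastSample > 6000) {
    stateHistory.push(currentState);
    lastSample = millis();
  }
  drawOvalVisualization(stateHistory); // lives in the visualization file

  // 4. After the 3-minute session, save the canvas once as a PNG
  if (millis() > 3 * 60 * 1000 && !saved) {
    saveCanvas('mental-state', 'png');
    saved = true;
  }
}
```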
And here's a link to the actual code for the main visualization function: <LINK>
Visualization Evolution
Our visualization approach evolved significantly throughout the development process. We initially considered using a spiral graph with different stroke styles to represent different mental states:
Spiral graph visualization idea iteration
However, we found this approach lacked legibility, especially given the small print size of our thermal printer. Inspired by the work of information designer Giorgia Lupi, we pivoted to using distinct symbols to represent different mental states:
Data visualization precedents from Giorgia Lupi
The oval visualization tracks mental states over the three-minute session: each third of the oval represents one minute, vertical symbol sets mark 6-second intervals, and each symbol indicates a mental state per the legend. The oval shape suits the thermal printer's output and makes the continuous data easy to read.
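For reference, a stripped-down p5.js version of that layout might look like this; the radii, symbols, and state names are illustrative:

```javascript
// 3 minutes sampled at 6-second intervals = 30 symbol slots around an oval.
let states = []; // filled during the session with labels like 'calm' or 'focus'

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(255);
  translate(width / 2, height / 2);
  const slots = 30;
  for (let i = 0; i < min(states.length, slots); i++) {
    // Start at the top of the oval and move clockwise through the session
    const angle = map(i, 0, slots, -HALF_PI, TWO_PI - HALF_PI);
    drawSymbol(states[i], cos(angle) * 150, sin(angle) * 90);
  }
}

function drawSymbol(state, x, y) {
  if (state === 'calm') circle(x, y, 6);
  else if (state === 'focus') square(x - 3, y - 3, 6);
  else line(x - 3, y, x + 3, y); // remaining states, e.g. 'meditation'
}
```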
This symbol-based approach maintained a level of abstraction above language while still being interpretable, allowing users to connect with their own thoughts and feelings about the prompts.
Through this iterative journey, we not only created a functional EEG visualization tool but also gained valuable insights into working with complex datasets, real-time data processing, and creating meaningful visual representations of abstract concepts.
Iterating with the Printer
We began by watching a YouTube tutorial on how to use a thermal printer with Arduino and the Arduino IDE. Our first task was to print simple text, and we wrote code that was supposed to print "Hello World" when the number 1 was pressed.
However, when we pressed 1, the printer printed the digit 1 itself instead of the intended text. Unsure why this was happening, we sought help from David Rios during office hours, and with his guidance we resolved the issue to some extent, even successfully printing a QR code.
Printing errors with our first test
Challenges with Image Printing
The real challenge arose when we attempted to print an image. We were saving the sketches from p5.js as PNG files, but printing a PNG through the Arduino setup proved far more complicated than expected, most likely because the thermal printer expects raw bitmap bytes rather than an encoded PNG, so the image would have to be decoded and streamed over serial in the right format. Despite several attempts and different iterations of the code, we reached a dead end on this front.
Trouble with the printer
Pivoting and Adapting
Given the challenges we faced, we decided to pivot our approach. We focused instead on refining the button functionality within our p5.js sketch, based on information from this GitHub page. We successfully implemented two key features:
A button in the p5.js sketch that, when pressed, saved the generated sketch as a PNG.
A second button that triggered the printer to output something when pressed.
Here are some snippets of the p5.js code for saving the sketch and sending it to the printer:
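Roughly, the two handlers looked like this (reconstructed; this assumes the p5.serialport library and an illustrative port name):

```javascript
// Two buttons: one saves the canvas as a PNG, one signals the Arduino.
let serial;

function setup() {
  createCanvas(400, 400);
  serial = new p5.SerialPort();
  serial.open('/dev/tty.usbmodem14101'); // hypothetical Arduino port

  const saveBtn = createButton('Save sketch');
  saveBtn.mousePressed(() => saveCanvas('mental-state', 'png'));

  const printBtn = createButton('Print');
  printBtn.mousePressed(() => serial.write('1')); // the Arduino listens for '1'
}
```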
Although the thermal printer wasn't able to print the actual PNG image, these buttons automated the process of capturing and printing, maintaining some of the interactivity we initially envisioned.
Final Solution
As a workaround to our printing challenges, we opted to use a larger printer for the final output. This allowed us to print the captured sketches accurately. While this change made the setup more imposing and resulted in A4-sized prints instead of small receipts, it allowed us to move forward with the project given our time constraints.
Since we could not get the Arduino to recognize the image data, we changed the idea and used the Mac's automatic printing feature instead, connecting the computer to the printer over USB. It works like this: when a person presses the button, p5.js downloads the picture and saves it in the "picture" folder, and an Automator action then automatically sends the new file to the printer.
Printing with a large format colour printer
The button presses still automated the process of capturing the sketch in p5.js, saving it as a PNG, and sending it to the larger printer for the final printout. This iteration helped us refine the interaction process and provided a functional, albeit adjusted, version of our original idea.
While our final setup deviated from our initial vision of a small, mysterious box with a thermal printer, it still maintained the core concept of creating tangible artifacts from participants' mental states. The larger prints, while less portable, offered greater detail and clarity in the visualization, potentially enhancing the reflective experience for participants.
This iterative process taught us valuable lessons about hardware integration, the importance of flexibility in design, and how to adapt our vision to technical constraints while still preserving the essence of our concept. It also highlighted the importance of considering alternative solutions when facing technical challenges, as our shift to using the computer's built-in printing capabilities ultimately allowed us to realize our project's core functionality.
The Final Experience
The culmination of our project is best demonstrated through video documentation of the entire process.
Jason's final sketch
This video walkthrough showcases:
The quiet, meditative environment we created
A participant putting on the Muse 2 headset
Interactions with the p5.js sketch mapping EEG values in real-time
The final printing process
Each session lasted 3-5 minutes. Participant feedback often centered around how to interpret the data, highlighting a need for clearer guidance in future iterations.
Below are readings from our testing with Bethany, Julia & Callie:
Three additional users' final sketches
Learning & Takeaways
Ethical Considerations
As we look to expand this project, it's crucial to consider the ethical implications of using brainwave data for interactive experiences. Privacy concerns, data security, and the potential for misuse of such intimate information must be carefully addressed. There's also the question of how prolonged use of such systems might affect a person's self-perception or mental state. As we continue to develop these technologies, we must prioritize user well-being and informed consent.
Looking Ahead
This project marks the beginning of our exploration into how EEG data and mental states can serve as inputs for creative and interactive experiences. One of the key insights we gained was the distinction between brainwave diagnostic data—such as alpha, beta, delta, theta, and gamma waves—and the abstraction of these waves into states of mind that people can control. By mapping these states (like calm or focus) to interactive systems, we realized that thinking itself can become a form of input.
This opens up fascinating possibilities. With machine learning models, we could train neural data patterns to respond to specific thoughts or repeated mental actions. Imagine training your brain to repeatedly think "forward," using that mental action to control movement in a video game. The brain becomes not just a source of diagnostic data, but an active, trainable input mechanism.
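In p5.js terms, collecting that training data could be as simple as labeling live readings while a key is held. This sketch assumes the ml5 API and a hypothetical getMuseBands() helper:

```javascript
// Hold 'f' while thinking "forward"; everything else is labeled "rest".
const model = ml5.neuralNetwork({ task: 'classification' });

function draw() {
  const bands = getMuseBands(); // latest { alpha, beta, theta, delta, gamma }
  model.addData(Object.values(bands), {
    thought: keyIsDown(70) ? 'forward' : 'rest', // 70 is the keycode for 'f'
  });
}

function keyPressed() {
  if (key === 't') model.train({ epochs: 50 }, () => console.log('trained'));
}
```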
One idea we’re eager to pursue is a cell simulation game where one cell’s behavior is dictated by the game, while the other is controlled by the user’s brain activity. When the user focuses, the cell behaves as expected, creating a simple yet powerful connection between the user's mental state and the game.
A cell simulation concept experiment
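A first pass at that concept fits in a few lines of p5.js; here the focus check stands in for a live classification result:

```javascript
// Toy cell simulation: one cell wanders on its own; the second
// only follows it while the (assumed) mental state is 'focus'.
let currentState = 'focus'; // would be updated by the ML5 classifier
let gameCell = { x: 100, y: 200 };
let brainCell = { x: 300, y: 200 };

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(255);
  gameCell.x += random(-2, 2); // scripted cell drifts randomly
  gameCell.y += random(-2, 2);
  if (currentState === 'focus') {
    // Brain-driven cell closes in only while the user stays focused
    brainCell.x = lerp(brainCell.x, gameCell.x, 0.05);
    brainCell.y = lerp(brainCell.y, gameCell.y, 0.05);
  }
  circle(gameCell.x, gameCell.y, 20);
  circle(brainCell.x, brainCell.y, 20);
}
```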
Looking ahead, we see a multitude of applications for EEG data in both artistic and functional realms. Beyond games, this technology could be used to investigate and reflect on emotions, where brainwaves are transformed into physical or digital artifacts. For example, by turning someone’s emotional readings into a tangible object—like a 3D-printed sculpture—we can give people something real to hold, representing their internal thoughts and feelings. This idea of making the intangible tangible is a powerful way to bring people closer to understanding their mental and emotional states.
Overall, this process has been both challenging and deeply rewarding. We’ve learned a great deal about working with EEG data, understanding brainwave states, and integrating these concepts into creative technology. Looking forward, we are excited to continue working with EEG as a tool for introspection, interaction, and expression, exploring more ways to turn thought into action, and brain activity into art.
References
Jason Snell - for giving us his initial code, then helping us edit it (4 times). We really should list him as a contributor to this project.
Nikolai
ChatGPT
Claude
Cursor
An Kai Cheng
Haoren Zhong