Project Overview
Inner Voice is an interactive wearable installation that explores the intersection of technology, emotion, and self-expression. Using biosensors and programmable LEDs, the piece responds to the wearer's physiological signals, creating a dynamic visual representation of their internal state.
The project challenges traditional notions of privacy and emotional display in the digital age, asking: What if our feelings were visible? How would this change the way we interact with others?
"Technology should amplify human experience, not replace it. Inner Voice creates a bridge between our internal and external worlds."
- Circuit Design: ESP32 microcontroller with biosensors
- LED Programming: WS2812B addressable LED strips
- Fabric Integration: Conductive thread and flexible materials
Technical Implementation
Hardware Components
- ESP32 Microcontroller: Dual-core processor for sensor processing and LED control
- WS2812B RGB LEDs: 144 individually addressable LEDs per meter
- Pulse Sensor: Detects heartbeats, from which heart rate variability is derived
- GSR Sensor: Measures galvanic skin response (emotional arousal)
- Flexible PCB: Custom-designed circuit board for wearability
Software Architecture
The system uses a state machine to interpret sensor data and translate it into visual patterns. Machine learning algorithms trained on emotional response datasets help categorize physiological signals into emotional states.
Design Challenges
Creating a wearable electronic textile required solving multiple engineering challenges: waterproofing components, ensuring flexibility and comfort, managing power consumption for 6+ hours of operation, and maintaining signal integrity in a body-worn context.
Exhibition & Reception
Inner Voice was exhibited at the New Media Art Festival in 2020, where it received positive responses from both critics and participants. Viewers were invited to wear the piece and experience the visualization of their own emotional states.
The exhibition included an interactive component where multiple participants could wear synchronized versions, creating a collective emotional landscape visible to all observers.
Research & References
This project builds on research in affective computing, physiological sensing, and e-textiles. Key inspirations include:
- Rosalind Picard's work on affective computing at MIT Media Lab
- Studio XO's wearable technology for performance
- CuteCircuit's interactive fashion designs
- Research on biosignal processing for emotion recognition
For more technical details, please view the open-source repository on GitHub.
- Technical Documentation: Comprehensive build guide and code
- Research Paper: Published findings and methodology