Interactive LED installation development. Work is ongoing across seven axes of design:
- Topologies — sculpture shapes and spatial mapping
- Musical Events — what happens in music that LEDs should respond to
- Audio Features — what we can compute from audio in real time
- LED Behaviors — the visual vocabulary (sparkle, pulse, flow, growth...)
- Temporal Scope — frame-level transients to song-level arcs
- Composition — how effects layer, blend, and transition
- Perceptual Mapping — bridging audio features to visual parameters, e.g. color, brightness
Contributions are welcome along any of these axes.
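As a taste of the Perceptual Mapping axis: LED output is roughly linear in duty cycle, but perceived brightness follows a power law, so a loudness value should be curved before it drives the LEDs. A minimal sketch (the function name, gamma value, and noise floor here are illustrative, not part of the codebase):

```python
def loudness_to_brightness(rms: float, gamma: float = 2.2, floor: float = 0.02) -> float:
    """Map a linear RMS loudness (0..1) to a perceptually scaled brightness (0..1).

    Applies a power curve so quiet passages stay visibly dim instead of
    washing out the low end of the LED's range. Values below `floor`
    are gated to full dark.
    """
    level = max(0.0, min(1.0, rms))
    if level < floor:
        return 0.0
    return level ** gamma

print(round(loudness_to_brightness(0.5), 3))  # → 0.218: mid loudness reads as dim
print(loudness_to_brightness(1.0))            # → 1.0: full loudness stays full bright
```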
Audio Viewer — browser-based audio analysis and visualization testbed. Waveform, mel spectrogram, tap annotations, source separation (Demucs, HPSS), and real-time LED effect preview. Used for research and effect development.
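One real-time audio feature the viewer's effect previews can build on is spectral flux: the positive magnitude change between consecutive STFT frames, which spikes at onsets like drum hits. A self-contained sketch with NumPy (function name and frame parameters are illustrative, not the viewer's actual API):

```python
import numpy as np

def onset_envelope(samples: np.ndarray, n_fft: int = 1024, hop: int = 512) -> np.ndarray:
    """Spectral-flux onset strength: positive spectral magnitude increases
    between consecutive STFT frames, summed over frequency and normalized
    to 0..1. Peaks mark likely note/percussion onsets."""
    window = np.hanning(n_fft)
    n_frames = 1 + (len(samples) - n_fft) // hop
    mags = np.empty((n_frames, n_fft // 2 + 1))
    for i in range(n_frames):
        frame = samples[i * hop : i * hop + n_fft] * window
        mags[i] = np.abs(np.fft.rfft(frame))
    flux = np.maximum(mags[1:] - mags[:-1], 0.0).sum(axis=1)
    peak = flux.max()
    return flux / peak if peak > 0 else flux

# A single click in silence should produce one clear flux peak.
sr = 44100
audio = np.zeros(sr // 2)
audio[sr // 4] = 1.0
env = onset_envelope(audio)
print(env.argmax())  # → 19: the frame where the click enters the window
```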
Festicorn — ESP32-C3 firmware for finalized sculpture installations. Drives WS2812B strips with OKLCH color palettes, gamma-corrected output, OTA updates, and a wireless web UI for live control.
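Gamma-corrected output on a microcontroller is typically done with a precomputed 256-entry lookup table so the per-pixel cost is a single array index. A Python sketch of building such a table (the exponent 2.6 is an assumed example for WS2812B-class LEDs, not necessarily the value Festicorn ships):

```python
def gamma_lut(gamma: float = 2.6) -> bytes:
    """Build a 256-entry table mapping linear 8-bit color levels to
    gamma-corrected 8-bit PWM values for WS2812B-style LEDs."""
    return bytes(round(255 * (i / 255) ** gamma) for i in range(256))

lut = gamma_lut()
# Low and mid input levels are compressed hard, which is what makes
# dim colors render as dim instead of washed out.
print(lut[0], lut[128], lut[255])
```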
```
git clone https://github.com/SethDrew/led.git
cd led
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

Run the viewer:

```
cd audio-reactive/viewer
python explore.py
```

Optionally, install Demucs for 4-stem source separation:

```
pip install demucs
```

MIT