

Droplets: An AI co-producer, living inside your DAW
Presenter: Christophe Poucet — a neurodivergent engineer who spent 17 years at Google across Zürich, SF, and Munich, most recently pulling LLMs into code review. He left last year to build Droplets, an AI co-producer that lives inside your DAW, because he wants to jam with AI, not just prompt it.
What to Expect: Generative music tools can turn a prompt into a finished track in seconds — but the producer is reduced to a spectator. You can't grab a stem, tweak the bassline on the downbeat, or play along. You hit "regenerate" and hope.
Droplets is a different bet: an AI co-producer that lives inside your DAW as a CLAP/VST3 plugin, talks to an LLM over the Model Context Protocol (MCP), and emits MIDI — not finished audio. The producer stays in the chair. The AI becomes a second pair of hands in the room, tossing out ideas while yours are already on the filter knob.
In this hour we'll unpack why real-time human-AI music collaboration is a harder (and more interesting) problem than "type a prompt, get a song," what MCP is and why it's the plumbing that makes this work, and how I built the whole thing solo in Rust over nine months. Then we'll actually jam — live, with Claude in the room, making house music together.
How to join:
This session is public and will be live-streamed: sign up to get the link.
Future Sessions:
If you are not a member, join Munich Music Labs to get full access to our knowledge-sharing sessions:
Subscribe to our Luma calendar to stay up to date with upcoming events!