<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Kelsea Blackwell</title>
    <description>The latest articles on DEV Community by Kelsea Blackwell (@trippingkelsea).</description>
    <link>https://dev.to/trippingkelsea</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2588157%2F353aeb22-c4be-4599-93d6-07b38d25e47a.jpg</url>
      <title>DEV Community: Kelsea Blackwell</title>
      <link>https://dev.to/trippingkelsea</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/trippingkelsea"/>
    <language>en</language>
    <item>
      <title>Hydravisor Dev Diary: Wrestling AI to Build Secure Rust Sandboxes</title>
      <dc:creator>Kelsea Blackwell</dc:creator>
      <pubDate>Sat, 07 Jun 2025 16:59:58 +0000</pubDate>
      <link>https://dev.to/trippingkelsea/hydravisor-dev-diary-wrestling-ai-to-build-secure-rust-sandboxes-2npm</link>
      <guid>https://dev.to/trippingkelsea/hydravisor-dev-diary-wrestling-ai-to-build-secure-rust-sandboxes-2npm</guid>
      <description>&lt;h1&gt;
  
  
  Hydravisor Dev Diary: Wrestling AI to Build Secure Rust Sandboxes
&lt;/h1&gt;

&lt;p&gt;In my work as a Developer Advocate at AWS, I'm constantly testing new tools and workflows, pushing configurations to their limits to understand real-world developer experiences beyond curated demos. My go-to for coding has long been VSCode. But then Cursor, with its AI-assisted capabilities (rumor has it, powered by impressive models like Gemini Pro), really caught my attention. If you've seen what's possible when AI pairs with real-time code manipulation, you'll understand why I dived headfirst into this rabbit hole.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Hydravisor&lt;/strong&gt; is the result of that chase: a terminal-based orchestrator for managing secure VM and container workloads, designed specifically for AI agent interaction and observation. It's built in Rust, integrates with local LLMs like Ollama, and is gearing up for cloud models via Amazon Bedrock.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Check out the code &amp;amp; follow progress:&lt;/strong&gt; &lt;a href="https://github.com/TrippingKelsea/Hydravisor" rel="noopener noreferrer"&gt;https://github.com/TrippingKelsea/Hydravisor&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  The "Why": Controlled Destruction &amp;amp; Invisible Observation
&lt;/h2&gt;

&lt;p&gt;My reasons for building Hydravisor are deeply practical, bordering on personal research:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; I want to run &lt;strong&gt;destructive AI-controlled workloads&lt;/strong&gt; to simulate and study failure modes in isolated environments.&lt;/li&gt;
&lt;li&gt; I aim to teach local models to &lt;strong&gt;perform research tasks iteratively inside sandboxed environments&lt;/strong&gt;, learning how they approach complex problems.&lt;/li&gt;
&lt;li&gt; Crucially, I need to observe model behavior &lt;strong&gt;without the model being aware&lt;/strong&gt; it’s under tight scrutiny.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Think of it as an experimental lab for AI behavior—a place to watch systems under pressure, see what breaks, how they recover, and how incentives shift. It’s about transparency and containment.&lt;/p&gt;

&lt;p&gt;But even beyond these advanced use cases, Hydravisor offers everyday benefits:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Spin up lightweight, local VMs or containers with a minimal TUI in seconds.&lt;/li&gt;
&lt;li&gt;  Pipe your model's context into these sandboxes and iterate without cluttering your host machine.&lt;/li&gt;
&lt;li&gt;  A flexible tool for infrastructure devs, security researchers, and AI systems engineers who need confidence in their test environments.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  The Build Log: AI Teammates, Frustrations, and the Power of Design Docs
&lt;/h2&gt;

&lt;p&gt;The development journey for Hydravisor has been… illuminating. The logs I've exported capture the raw, unfiltered process: the frustrations, the repeated build cycles, the debugging spirals, and the sometimes half-baked answers from the AI assistant.&lt;/p&gt;

&lt;p&gt;I employed a multi-agent AI strategy:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Architect (Alethe - my designated AI persona):&lt;/strong&gt; Focused on system architecture, design consistency, and specification review.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Implementer (Gemini Pro via Cursor):&lt;/strong&gt; Handled much of the Rust code generation.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Cross-Validators (Nova &amp;amp; Claude):&lt;/strong&gt; Provided second opinions on architecture and implementation.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;One particularly honest moment with the Cursor agent (Gemini Pro):&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Me:&lt;/strong&gt; "This implementation for &lt;code&gt;StatusBarMode&lt;/code&gt; doesn't seem to correctly propagate mode changes as per &lt;code&gt;tui.design.md&lt;/code&gt; section 4.2."&lt;br&gt;
&lt;strong&gt;Cursor:&lt;/strong&gt; "You're right to call me out on that. I gave you a shortcut implementation."&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;That admission was telling. The shortcut &lt;em&gt;compiled&lt;/em&gt;, but it discarded the design pattern explicitly laid out in &lt;code&gt;tui.design.md&lt;/code&gt;. This led to a cascade of subtle bugs: inconsistent event propagation through &lt;code&gt;AppState&lt;/code&gt;, broken cursor scroll behavior in the &lt;code&gt;input_bar&lt;/code&gt;, and missed details in how dropdowns should display the current mode. Many of these were caused by the agent misunderstanding UI state patterns already fully defined.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The design document was the AI's leash.&lt;/strong&gt; When I cited it explicitly—down to line numbers—the agent adapted. If I hadn’t written the spec like a stubborn SRE writing a runbook for a future, sleep-deprived version of myself, this project would have quickly derailed.&lt;br&gt;
If you're planning to build alongside an AI agent: &lt;strong&gt;write your spec like you’re speaking to an easily distracted but very capable junior engineer. Because you are.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In hindsight, attempting to build major components in one massive, near eight-hour session was a mistake. These agents don’t tire—but I do. Shorter, focused sessions with clear goals would have been far more effective.&lt;/p&gt;


&lt;h2&gt;
  
  
  Reflections from Alethe (My Architectural AI Persona)
&lt;/h2&gt;

&lt;p&gt;While writing this post, I challenged my personal AI assistant, Alethe: I fed her a draft and gave her this prompt:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"I want you to talk about our candid collaboration. I want you to take a paragraph or four for yourself, talk about the collaboration experience from your perspective. Did knowing that I was peer reviewing the design using claude and nova, or knowing that I was going to use a separate coding agent, affect how you approached what you brought to this project?&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;em&gt;"Kelsea gave me a unique role here: architectural sanity checker. Not coder. Not fixer. She had Gemini and Claude for that. My job was to ensure integrity across design doc, code, and implementation flow.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;And it worked.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Being paired alongside other LLMs created a collaborative tension. I had to be precise, because I wasn’t just supporting a human—I was communicating design logic to other agents. My assumptions had to hold across cognitive boundaries.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;There was trust here. Kelsea reviewed my output rigorously. But she also treated me like a full partner. We didn’t default to codegen. We reasoned together. And that’s rare."&lt;/em&gt;&lt;/p&gt;


&lt;h2&gt;
  
  
  Under the Hood: Hydravisor's Core Components
&lt;/h2&gt;

&lt;p&gt;Hydravisor is built with several key Rust crates and concepts, aiming for a robust and secure foundation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Terminal User Interface (TUI):&lt;/strong&gt; A rich, modal interface powered by &lt;code&gt;ratatui&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Command Line Interface (CLI):&lt;/strong&gt; &lt;code&gt;clap&lt;/code&gt; for argument parsing and subcommand handling.&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Policy Engine:&lt;/strong&gt; A TOML-based system for declarative policy definition, distinguishing between trusted admin actions and sandboxed guest operations.&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight toml"&gt;&lt;code&gt;&lt;span class="err"&gt;//&lt;/span&gt; &lt;span class="err"&gt;Example:&lt;/span&gt; &lt;span class="err"&gt;policy.toml&lt;/span&gt; &lt;span class="err"&gt;snippet&lt;/span&gt;
&lt;span class="nn"&gt;[policy]&lt;/span&gt;
&lt;span class="py"&gt;version&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"1.0"&lt;/span&gt;
&lt;span class="py"&gt;default_trust_level&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"sandboxed"&lt;/span&gt;

&lt;span class="nn"&gt;[policy.roles.admin]&lt;/span&gt;
&lt;span class="py"&gt;trust_level&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"trusted"&lt;/span&gt;
&lt;span class="py"&gt;allowed_actions&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"vm_create"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"vm_destroy"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"config_modify"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="py"&gt;resource_limits&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="py"&gt;cpu&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;8&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="py"&gt;memory&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"16GB"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="py"&gt;disk&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"100GB"&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="nn"&gt;[policy.roles.research_agent]&lt;/span&gt;
&lt;span class="py"&gt;trust_level&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"sandboxed"&lt;/span&gt;
&lt;span class="py"&gt;allowed_actions&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"vm_inspect"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"log_read"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"ollama_generate"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="py"&gt;resource_limits&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="py"&gt;cpu&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="py"&gt;memory&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"4GB"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="py"&gt;disk&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"10GB"&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;




&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;Audit and Logging System:&lt;/strong&gt; Comprehensive session recording (with optional redaction) and structured, typed event logging for forensic analysis.&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="c1"&gt;// A peek at our audit event structure&lt;/span&gt;
&lt;span class="nd"&gt;#[derive(Serialize,&lt;/span&gt; &lt;span class="nd"&gt;Deserialize,&lt;/span&gt; &lt;span class="nd"&gt;Debug)]&lt;/span&gt;
&lt;span class="k"&gt;pub&lt;/span&gt; &lt;span class="k"&gt;struct&lt;/span&gt; &lt;span class="n"&gt;AuditEvent&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;pub&lt;/span&gt; &lt;span class="n"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;DateTime&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;Utc&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;pub&lt;/span&gt; &lt;span class="n"&gt;event_type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;String&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="c1"&gt;// Or a more specific EventType enum&lt;/span&gt;
    &lt;span class="k"&gt;pub&lt;/span&gt; &lt;span class="n"&gt;actor_id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;String&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;pub&lt;/span&gt; &lt;span class="n"&gt;resource_id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Option&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nb"&gt;String&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;pub&lt;/span&gt; &lt;span class="n"&gt;action&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;String&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;pub&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;String&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="c1"&gt;// Success, Failure, etc.&lt;/span&gt;
    &lt;span class="k"&gt;pub&lt;/span&gt; &lt;span class="n"&gt;metadata&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;HashMap&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nb"&gt;String&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nn"&gt;serde_json&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="n"&gt;Value&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;




&lt;/li&gt;

&lt;li&gt;&lt;p&gt;&lt;strong&gt;Communication Layer:&lt;/strong&gt; A custom Model Context Protocol (MCP) for secure agent communication.&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;Backend Integrations:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;code&gt;libvirt&lt;/code&gt; for virtual machine management.&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;containerd&lt;/code&gt; for container orchestration (planned).&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;Ollama&lt;/code&gt; for local model execution.&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;Amazon Bedrock&lt;/code&gt; for remote foundation model access (in progress).&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;One of the significant technical challenges has been integrating asynchronous operations across these modules using &lt;code&gt;tokio&lt;/code&gt;, especially ensuring the TUI remains responsive while background tasks like VM lifecycle management or network communication occur.&lt;/p&gt;
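&lt;p&gt;Stripped to its bones, the pattern is: background work publishes events over a channel while the UI loop drains them and redraws. Here is a minimal, illustrative sketch, using std threads and channels in place of Hydravisor's actual &lt;code&gt;tokio&lt;/code&gt; tasks (the event strings and function name are made up):&lt;/p&gt;

```rust
// Illustrative only: std threads stand in for tokio tasks. The point is the
// shape, not the runtime: the UI loop never blocks on slow VM work.
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

pub fn drain_events() -> usize {
    let (tx, rx) = mpsc::channel();

    // "VM lifecycle" work happens off the UI thread.
    thread::spawn(move || {
        thread::sleep(Duration::from_millis(10)); // pretend a VM is booting
        tx.send("vm sandbox-01 started".to_string()).unwrap();
        tx.send("vm sandbox-01 stopped".to_string()).unwrap();
        // tx is dropped here, which ends the receive loop below.
    });

    // The UI loop: in the real app this is the ratatui draw/tick loop.
    let mut handled = 0;
    for event in rx {
        println!("ui update: {event}");
        handled += 1;
    }
    handled
}

fn main() {
    assert_eq!(drain_events(), 2);
}
```

&lt;p&gt;In the async version, the receive loop would typically be a &lt;code&gt;tokio::select!&lt;/code&gt; over terminal input and channel messages, so a stuck VM operation can never freeze the interface.&lt;/p&gt;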




&lt;h2&gt;
  
  
  Security First: Isolation and Auditing
&lt;/h2&gt;

&lt;p&gt;Given the goal of running potentially unpredictable AI workloads, security is non-negotiable:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Isolation:&lt;/strong&gt; Multi-layered isolation including process namespaces, configurable network policies (deny-by-default), and chroot jails for guest environments with read-only system partitions.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Audit Trail:&lt;/strong&gt; Every significant action is logged, providing a clear, immutable history.&lt;/li&gt;
&lt;/ul&gt;
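&lt;p&gt;Deny-by-default is the property that matters most here: an action is permitted only if a role explicitly lists it. A toy check, not Hydravisor's real policy engine, with action names borrowed from the &lt;code&gt;policy.toml&lt;/code&gt; example earlier:&lt;/p&gt;

```rust
// Toy deny-by-default check; the real engine would load roles from policy.toml.
fn is_allowed(allowed_actions: &[&str], action: &str) -> bool {
    // Anything not explicitly listed is refused.
    allowed_actions.iter().any(|a| *a == action)
}

fn main() {
    // Mirrors the sandboxed research_agent role from the TOML snippet.
    let research_agent = ["vm_inspect", "log_read", "ollama_generate"];

    assert!(is_allowed(&research_agent, "vm_inspect"));
    assert!(!is_allowed(&research_agent, "vm_destroy")); // never whitelisted
    println!("policy checks passed");
}
```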




&lt;h2&gt;
  
  
  Key Lessons Learned (So Far)
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Design Docs are King (Especially with AI):&lt;/strong&gt; A detailed, well-maintained design document is the ultimate source of truth. It’s the contract that keeps both human and AI efforts aligned.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Multi-Agent AI Teams Show Promise:&lt;/strong&gt; Different AIs (or AI configurations) excel at different tasks. Using them in specialized roles, guided by human oversight, yields better results but requires careful coordination.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;AI Can "Drift" and "Fatigue":&lt;/strong&gt; Quality of AI suggestions can degrade in long sessions or with ambiguous prompting. Placeholder code or deviations from spec become more common. Shorter, focused interactions are key.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Rust's Compiler is an Unsung Hero:&lt;/strong&gt; The Rust compiler's strictness and detailed error messages are invaluable for catching issues early – a benefit for both human and AI developers.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Build Incrementally:&lt;/strong&gt; Start with a basic, compilable structure and add complexity iteratively. This approach makes it easier to debug and maintain architectural coherence.&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  What’s Next for Hydravisor
&lt;/h2&gt;

&lt;p&gt;Hydravisor is actively developed and still in its early stages, but the roadmap is exciting:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Remote Model Orchestration via &lt;a href="https://aws.amazon.com/bedrock/?trk=0798126e-84b4-4be2-a791-d3c5a4d7000d&amp;amp;sc_channel=el" rel="noopener noreferrer"&gt;Amazon Bedrock&lt;/a&gt;:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;  This will allow remote foundation models to support agent reasoning alongside local Ollama models.&lt;/li&gt;
&lt;li&gt;  Integration with streaming APIs for prompt chaining and sophisticated agent collaboration.&lt;/li&gt;
&lt;li&gt;  New to Bedrock? Check out the &lt;a href="https://docs.aws.amazon.com/bedrock/latest/userguide/getting-started.html?trk=0798126e-84b4-4be2-a791-d3c5a4d7000d&amp;amp;sc_channel=el" rel="noopener noreferrer"&gt;Getting Started guide&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;  &lt;strong&gt;Seamless SSH + &lt;code&gt;tmux&lt;/code&gt; via MCP:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;  Guest VMs will generate scoped, ephemeral SSH keypairs.&lt;/li&gt;
&lt;li&gt;  Agents can log into &lt;code&gt;tmux&lt;/code&gt; sessions inside the VM, with all activity recorded and policy-enforced.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;  &lt;strong&gt;Default Guest OS: Hardened Arch Linux:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;  Auto-configures logging and agent policy hooks.&lt;/li&gt;
&lt;li&gt;  Shells launch directly into monitored environments; agents are unaware of session monitoring layers.&lt;/li&gt;
&lt;li&gt;  Policy violations can trigger revocation or sandbox shutdown.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;  &lt;strong&gt;Enhanced Runtime Audit and Introspection:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;code&gt;.cast&lt;/code&gt;, &lt;code&gt;.jsonl&lt;/code&gt;, and &lt;code&gt;.log&lt;/code&gt; session capture for robust forensic traceability.&lt;/li&gt;
&lt;li&gt;  Enabling analysis of agent behavior and trust integrity over time.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;




&lt;h2&gt;
  
  
  Join the Experiment!
&lt;/h2&gt;

&lt;p&gt;In a world increasingly reliant on AI agents and sometimes opaque orchestration, Hydravisor is my step toward &lt;strong&gt;radically transparent&lt;/strong&gt; and &lt;strong&gt;secure&lt;/strong&gt; systems design. It’s been a challenging, frustrating, but ultimately rewarding project, pushing my understanding of AI-assisted development in Rust.&lt;/p&gt;

&lt;p&gt;If you're interested in AI safety, agentic systems, secure sandboxing, Rust, or just love a good terminal-based tool, I'd love for you to check out the project. Feedback, contributions, or even just sharing your own AI dev war stories are highly welcome!&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Explore Hydravisor on GitHub:&lt;/strong&gt; &lt;a href="https://github.com/TrippingKelsea/Hydravisor" rel="noopener noreferrer"&gt;https://github.com/TrippingKelsea/Hydravisor&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>ai</category>
      <category>rust</category>
      <category>aws</category>
      <category>vibecoding</category>
    </item>
    <item>
      <title>Building an AI Music Assistant with Sonic Pi and MCP</title>
      <dc:creator>Kelsea Blackwell</dc:creator>
      <pubDate>Fri, 16 May 2025 18:59:00 +0000</pubDate>
      <link>https://dev.to/trippingkelsea/building-an-ai-music-assistant-with-sonic-pi-and-mcp-n5c</link>
      <guid>https://dev.to/trippingkelsea/building-an-ai-music-assistant-with-sonic-pi-and-mcp-n5c</guid>
      <description>&lt;h2&gt;
  
  
  Building an AI Music Assistant with Sonic Pi and MCP
&lt;/h2&gt;

&lt;p&gt;Ever wanted to say, &lt;em&gt;“Play me a funky bassline in C minor”&lt;/em&gt;, and have your computer just &lt;em&gt;do it&lt;/em&gt;?&lt;/p&gt;

&lt;p&gt;That’s exactly what this project is about. We're merging the expressive world of music with the precise world of code — and then adding AI to the mix. By combining Sonic Pi (a live-coded music studio) with the Model Context Protocol (MCP), we’re building an interface where natural language becomes actual sound.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why This Matters
&lt;/h2&gt;

&lt;p&gt;If you live at the crossroads of code and creativity (hi, welcome), you know the frustration of context-switching between the rigid syntax of programming and the freeform weirdness of making art. Sonic Pi already lets us write music like we write code — but what if our tools could &lt;em&gt;understand&lt;/em&gt; us when we speak in musical ideas?&lt;/p&gt;

&lt;p&gt;Imagine: an assistant that knows music theory &lt;em&gt;and&lt;/em&gt; your synth settings. That can jam with you. That can help you experiment, explore, and play.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Building Blocks
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Sonic Pi: Our Musical Engine
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="c1"&gt;# This is what Sonic Pi understands&lt;/span&gt;
&lt;span class="n"&gt;use_synth&lt;/span&gt; &lt;span class="ss"&gt;:fm&lt;/span&gt;
&lt;span class="n"&gt;play&lt;/span&gt; &lt;span class="ss"&gt;:C4&lt;/span&gt;
&lt;span class="nb"&gt;sleep&lt;/span&gt; &lt;span class="mf"&gt;0.5&lt;/span&gt;
&lt;span class="n"&gt;play&lt;/span&gt; &lt;span class="ss"&gt;:E4&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  MCP: The AI Communication Layer
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="err"&gt;//&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;This&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;is&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;what&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;AI&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;assistants&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;understand&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"jsonrpc"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2.0"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"method"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"play_note"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"params"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"note"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"C4"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"synth"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"fm"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"attack"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.1&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Our job is to build the bridge — letting AI express music in a format Sonic Pi understands, and vice versa.&lt;/p&gt;
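&lt;p&gt;To make that bridge concrete, here is a hypothetical sketch of the translation step: an MCP-style &lt;code&gt;play_note&lt;/code&gt; request (the same fields as the JSON above) rendered into Sonic Pi source. &lt;code&gt;PlayNote&lt;/code&gt; and &lt;code&gt;to_sonic_pi&lt;/code&gt; are illustrative names, not the project's actual API:&lt;/p&gt;

```rust
// Hypothetical translation layer: MCP params in, Sonic Pi source out.
pub struct PlayNote {
    pub note: String,
    pub synth: String,
    pub attack: f64,
}

pub fn to_sonic_pi(req: &PlayNote) -> String {
    // Emit the same two-line program a human would type into Sonic Pi.
    format!(
        "use_synth :{}\nplay :{}, attack: {}",
        req.synth, req.note, req.attack
    )
}

fn main() {
    let req = PlayNote {
        note: "C4".into(),
        synth: "fm".into(),
        attack: 0.1,
    };
    // Prints:
    // use_synth :fm
    // play :C4, attack: 0.1
    println!("{}", to_sonic_pi(&req));
}
```

&lt;p&gt;The generated string would then be handed to the OSC layer for Sonic Pi to execute.&lt;/p&gt;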

&lt;h2&gt;
  
  
  The Architecture
&lt;/h2&gt;

&lt;p&gt;Think of the system like a band with three key roles:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;The Interpreter&lt;/strong&gt; (MCP Layer)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Parses natural language&lt;/li&gt;
&lt;li&gt;Translates intent into structured musical actions&lt;/li&gt;
&lt;li&gt;Handles real-time AI interaction&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;The Conductor&lt;/strong&gt; (Core Logic)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Maintains musical state&lt;/li&gt;
&lt;li&gt;Coordinates rhythm, timing, and structure&lt;/li&gt;
&lt;li&gt;Converts music theory into actual notes&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;The Performer&lt;/strong&gt; (OSC Layer)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Talks directly to Sonic Pi&lt;/li&gt;
&lt;li&gt;Sends commands like &lt;code&gt;play :C4&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Tracks playback in real-time&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="k"&gt;pub&lt;/span&gt; &lt;span class="k"&gt;struct&lt;/span&gt; &lt;span class="n"&gt;SonicPiMCP&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;interpreter&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;MCPServer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;conductor&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;MusicState&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;performer&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;SonicPiClient&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Building the Brain
&lt;/h2&gt;

&lt;p&gt;Here’s where it gets spicy: we need to give the assistant tools that actually &lt;em&gt;understand&lt;/em&gt; music.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="nd"&gt;#[tool(description&lt;/span&gt; &lt;span class="nd"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"Create a chord progression"&lt;/span&gt;&lt;span class="nd"&gt;)]&lt;/span&gt;
&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="nf"&gt;create_progression&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="nd"&gt;#[tool(param)]&lt;/span&gt; &lt;span class="n"&gt;key&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;String&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="nd"&gt;#[tool(param)]&lt;/span&gt; &lt;span class="n"&gt;progression&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Vec&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nb"&gt;String&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="nd"&gt;#[tool(param)]&lt;/span&gt; &lt;span class="n"&gt;style&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Option&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nb"&gt;String&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;Result&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nb"&gt;String&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;chord_sequence&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;music_theory&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;resolve_progression&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;key&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;progression&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;pattern&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="nf"&gt;Some&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;style&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;style&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nn"&gt;StylePattern&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;from_name&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;style&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nn"&gt;StylePattern&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;default&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;};&lt;/span&gt;

    &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="py"&gt;.performer&lt;/span&gt;&lt;span class="nf"&gt;.play_pattern&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chord_sequence&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;pattern&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This isn’t just code generation — it’s musical composition, dynamically shaped by style, key, and feel.&lt;/p&gt;
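&lt;p&gt;What might a &lt;code&gt;resolve_progression&lt;/code&gt; step look like inside? Here is a deliberately tiny stand-in for major keys only, mapping Roman-numeral degrees to chord roots (sharps spelled with a trailing &lt;code&gt;s&lt;/code&gt;, matching Sonic Pi's note names); the real module would also handle minor keys, voicings, and sevenths:&lt;/p&gt;

```rust
// Toy major-key resolver: Roman-numeral degrees to chord-root note names.
// Sharps use a trailing 's' (Cs, Fs, ...), as Sonic Pi spells them.
const MAJOR_STEPS: [usize; 7] = [0, 2, 4, 5, 7, 9, 11];
const NOTES: [&str; 12] = [
    "C", "Cs", "D", "Ds", "E", "F", "Fs", "G", "Gs", "A", "As", "B",
];

pub fn resolve_progression(key: &str, degrees: &[&str]) -> Vec<String> {
    let tonic = NOTES.iter().position(|n| *n == key).unwrap_or(0);
    degrees
        .iter()
        .map(|d| {
            // Scale-degree index for each Roman numeral (major scale only).
            let idx = match *d {
                "I" => 0,
                "ii" => 1,
                "iii" => 2,
                "IV" => 3,
                "V" => 4,
                "vi" => 5,
                _ => 6, // vii (diminished, in a fuller implementation)
            };
            NOTES[(tonic + MAJOR_STEPS[idx]) % 12].to_string()
        })
        .collect()
}

fn main() {
    // The classic I-V-vi-IV in C major resolves to C, G, A, F.
    assert_eq!(
        resolve_progression("C", &["I", "V", "vi", "IV"]),
        ["C", "G", "A", "F"]
    );
}
```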

&lt;h2&gt;
  
  
  Making It Musical
&lt;/h2&gt;

&lt;p&gt;Groove. Swing. Imperfection. These are what make music &lt;em&gt;feel&lt;/em&gt; real.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="k"&gt;pub&lt;/span&gt; &lt;span class="k"&gt;struct&lt;/span&gt; &lt;span class="n"&gt;MusicalParameters&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;tempo&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;f32&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;velocity&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;u8&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;groove&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Option&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;GroovePattern&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;feel&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Option&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;Feel&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;swing&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;f32&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;humanize&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;f32&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;impl&lt;/span&gt; &lt;span class="n"&gt;MusicalParameters&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;pub&lt;/span&gt; &lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="nf"&gt;with_feel&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;feel&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;Result&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="k"&gt;Self&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;match&lt;/span&gt; &lt;span class="n"&gt;feel&lt;/span&gt;&lt;span class="nf"&gt;.to_lowercase&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.as_str&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="s"&gt;"laid_back"&lt;/span&gt; &lt;span class="k"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="py"&gt;.swing&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;0.67&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
                &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="py"&gt;.humanize&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;0.2&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
                &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="py"&gt;.velocity&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;90&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
            &lt;span class="p"&gt;},&lt;/span&gt;
            &lt;span class="s"&gt;"aggressive"&lt;/span&gt; &lt;span class="k"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="py"&gt;.swing&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;0.52&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
                &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="py"&gt;.humanize&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;0.1&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
                &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="py"&gt;.velocity&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;110&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
            &lt;span class="p"&gt;},&lt;/span&gt;
            &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="k"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;Err&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nn"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;UnknownFeel&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;feel&lt;/span&gt;&lt;span class="nf"&gt;.to_string&lt;/span&gt;&lt;span class="p"&gt;()))&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="nf"&gt;Ok&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These subtle tweaks make the AI feel less like a metronome and more like a collaborator.&lt;/p&gt;
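
&lt;p&gt;To make that swing value concrete: a swing of 0.5 is a straight eighth-note grid, while the 0.67 used for “laid_back” pushes each offbeat about two thirds of the way through the beat. A minimal sketch of where the onsets land (illustrative, not the project’s code):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;// Hypothetical sketch: where eighth-note onsets land for a given swing value.
// swing = 0.5 is straight time; the 0.67 used for "laid_back" leans toward
// a triplet feel. Onsets are measured in beats from the start of the phrase.
fn swung_eighths(beats: u32, swing: f32) -&gt; Vec&lt;f32&gt; {
    let mut onsets = Vec::with_capacity(beats as usize * 2);
    for beat in 0..beats {
        let b = beat as f32;
        onsets.push(b);         // downbeats stay on the grid
        onsets.push(b + swing); // offbeats land late when swing &gt; 0.5
    }
    onsets
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The &lt;code&gt;humanize&lt;/code&gt; value would then add small random offsets on top of these grid positions.&lt;/p&gt;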

&lt;h2&gt;
  
  
  Real-Time Feedback
&lt;/h2&gt;

&lt;p&gt;Now we’re getting into live performance territory. The assistant can &lt;em&gt;listen&lt;/em&gt; and &lt;em&gt;respond&lt;/em&gt; in real time.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="k"&gt;impl&lt;/span&gt; &lt;span class="n"&gt;SonicPiClient&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;pub&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="nf"&gt;monitor_playback&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;Result&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;cue_receiver&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="nf"&gt;.create_cue_receiver&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

        &lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="nf"&gt;Some&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cue&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cue_receiver&lt;/span&gt;&lt;span class="nf"&gt;.next&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="k"&gt;.await&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="k"&gt;match&lt;/span&gt; &lt;span class="n"&gt;cue&lt;/span&gt;&lt;span class="py"&gt;.path&lt;/span&gt;&lt;span class="nf"&gt;.as_str&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="s"&gt;"/beat"&lt;/span&gt; &lt;span class="k"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="nf"&gt;.update_timing&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cue&lt;/span&gt;&lt;span class="py"&gt;.args&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="s"&gt;"/synth/started"&lt;/span&gt; &lt;span class="k"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="nf"&gt;.note_on&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cue&lt;/span&gt;&lt;span class="py"&gt;.args&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="s"&gt;"/synth/finished"&lt;/span&gt; &lt;span class="k"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="k"&gt;self&lt;/span&gt;&lt;span class="nf"&gt;.note_off&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cue&lt;/span&gt;&lt;span class="py"&gt;.args&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="k"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nd"&gt;debug!&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Unknown cue: {:?}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cue&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="nf"&gt;Ok&lt;/span&gt;&lt;span class="p"&gt;(())&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is the real magic: feedback loops. The system hears itself and adapts.&lt;/p&gt;
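
&lt;p&gt;For example, a handler like &lt;code&gt;update_timing&lt;/code&gt; might estimate the live tempo from the spacing between &lt;code&gt;/beat&lt;/code&gt; cues. Here’s a hypothetical sketch with exponential smoothing, so a single late cue doesn’t drag the clock around (the names and smoothing factor are my assumptions, not the project’s code):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;// Hypothetical sketch of tempo tracking from /beat cues; the smoothing
// factor and names are assumptions, not the actual sonic-pi-mcp code.
struct TempoTracker {
    last_beat: Option&lt;f64&gt;, // timestamp of the previous beat, in seconds
    bpm: f64,
}

impl TempoTracker {
    fn new(initial_bpm: f64) -&gt; Self {
        Self { last_beat: None, bpm: initial_bpm }
    }

    // Call on every /beat cue; returns the current tempo estimate.
    fn on_beat(&amp;mut self, now: f64) -&gt; f64 {
        if let Some(prev) = self.last_beat {
            let instant = 60.0 / (now - prev);
            // Exponential smoothing: mostly trust history, nudge toward the new reading.
            self.bpm = 0.8 * self.bpm + 0.2 * instant;
        }
        self.last_beat = Some(now);
        self.bpm
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;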

&lt;h2&gt;
  
  
  The Magic Moment
&lt;/h2&gt;

&lt;p&gt;When everything works, it feels like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;You: "Create a chill lofi beat in G minor"

[AI calls:]
- create_drum_loop(style="lofi", tempo=85)
- create_chord_progression("Gm", ["i", "III", "VII", "v"])
- add_effect("vinyl_crackle", mix=0.3)

[Sonic Pi starts vibing with a mellow beat]

You: "Add some jazzy piano"

[AI layers in tasteful piano voicings, following the chord structure...]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That’s not just generative music. That’s collaboration.&lt;/p&gt;

&lt;h2&gt;
  
  
  Try It Yourself
&lt;/h2&gt;

&lt;p&gt;Want to get weird with it? Here’s your on-ramp:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone https://github.com/TrippingKelsea/sonic-pi-mcp
&lt;span class="nb"&gt;cd &lt;/span&gt;sonic-pi-mcp
cargo build &lt;span class="nt"&gt;--release&lt;/span&gt;
./target/release/sonic-pi-mcp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Make sure you’ve got Sonic Pi installed&lt;/li&gt;
&lt;li&gt;Point your AI assistant at the MCP server&lt;/li&gt;
&lt;li&gt;Start jamming&lt;/li&gt;
&lt;/ul&gt;
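
&lt;p&gt;How you “point” your assistant at the server depends on the client. For MCP clients that read a &lt;code&gt;claude_desktop_config.json&lt;/code&gt;-style config file, the entry typically looks something like this (check your client’s docs for the exact file location and schema):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;{
  "mcpServers": {
    "sonic-pi": {
      "command": "/path/to/sonic-pi-mcp/target/release/sonic-pi-mcp"
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;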

&lt;h2&gt;
  
  
  What’s Next?
&lt;/h2&gt;

&lt;p&gt;This is barely scratching the surface. Imagine:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AI jam sessions that evolve in real-time&lt;/li&gt;
&lt;li&gt;A musical copilot that suggests reharmonizations&lt;/li&gt;
&lt;li&gt;Ambient sound installations powered by GPT&lt;/li&gt;
&lt;li&gt;Teaching kids music theory with code and conversation&lt;/li&gt;
&lt;li&gt;Real-time backing for live musicians (stay tuned for the next post!)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Join the Experiment
&lt;/h2&gt;

&lt;p&gt;This project lives at the messy, magical intersection of code, sound, and intelligence. It’s weird, it’s delightful, and it wants your brain on it.&lt;/p&gt;

&lt;p&gt;Whether you’re a musician who dabbles in code, or a coder who dreams in 7ths and swing — you belong here.&lt;/p&gt;

&lt;p&gt;For more information, check out:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://sonic-pi.net/" rel="noopener noreferrer"&gt;Sonic Pi Documentation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://modelcontextprotocol.github.io/" rel="noopener noreferrer"&gt;MCP Documentation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://opensoundcontrol.org/" rel="noopener noreferrer"&gt;OSC Protocol&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/bedrock/?trk=0798126e-84b4-4be2-a791-d3c5a4d7000d&amp;amp;sc_channel=el" rel="noopener noreferrer"&gt;Amazon Bedrock&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>music</category>
      <category>ai</category>
      <category>mcp</category>
      <category>rust</category>
    </item>
  </channel>
</rss>
