I built a generative UI Guitar Tutor app

I recently started learning guitar, so I've been looking at online tabs for chord progressions and songs. I've found online guitar tab sites rough to navigate and covered in ads.

Also, I don't always know exactly which chords to search for. I wanted to be able to say, "I'm a beginner. Show me a simple chord progression I should start with," and instead of getting a text explanation, see the actual tab rendered on my screen.

So, I built it.

🔗 Repo: https://github.com/MichaelMilstead/ai-guitar-tutor

🔗 Live site: https://ai-guitar-tutor.tambo.co/

The “AI Guitar Tutor” uses Generative UI to let an LLM control my React components directly.

Here is how I built it.

The Tech Stack

  • Framework: Next.js (App Router)
  • UI: Normal React components
  • AI Orchestration / Generative UI: Tambo
  • Language: TypeScript

Generative UI

Most AI apps today are just chatbots. You type text, you get text back. But for learning an instrument, text output is almost useless to me. I don't want to read about a chord; I want to see the finger placement.

I used a generative UI SDK called Tambo, which lets you "teach" the AI how to use your React components. You give the AI a component and a schema (using Zod), and the AI decides when to render it and what props to pass.

Step 1: The Simple Component

First, I built a standard React component that renders guitar tabs. It takes data like strings, fret numbers, and a title.

GuitarTabs Component

// src/components/guitar-tabs.tsx
interface GuitarTabsProps {
  columns?: { label: string; positions: number[] }[];
  stringLabels?: string[];
  title?: string;
}

export default function GuitarTabs({
  columns = defaultColumns, // defaultColumns is defined elsewhere in the file
  stringLabels = ["E", "B", "G", "D", "A", "E"],
  title = "A basic chord progression",
}: GuitarTabsProps) {
  return (
    <div>
      {/* Rendering logic to show each 'column' of numbers */}
    </div>
  );
}

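The rendering logic is elided above. As a minimal sketch of one way it could work (my own illustration, not the repo's exact code), each string becomes a row and each column contributes one dashed segment:

// Hypothetical rendering sketch: one row per string label,
// one segment per column; -1 renders as "x" for an unplayed string.
function TabGrid({ columns = [], stringLabels = [] }: GuitarTabsProps) {
  return (
    <pre>
      {stringLabels.map((label, s) => (
        <div key={s}>
          {label}|
          {columns
            .map((c) => (c.positions[s] === -1 ? "--x--" : `--${c.positions[s]}--`))
            .join("")}
        </div>
      ))}
    </pre>
  );
}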

Step 2: Making it "Interactable"

This is where the magic happens. I didn't want to write the LLM integration myself, handling streamed responses and mapping them to text and components. I just wanted the AI to "use" my component.

Using Tambo, I wrapped my component to make it Interactable. This registers the component with the AI agent.

// src/components/guitar-tabs.tsx
import { withInteractable } from "@tambo-ai/react";
import { z } from "zod";

// Wrapper so Tambo can interact with it:
export const InteractableGuitarTabs = withInteractable(GuitarTabs, {
  componentName: "guitar-tabs",
  description: "A component for displaying guitar tabs",
  propsSchema: guitarTabsSchema,
});

The guitarTabsSchema tells Tambo "how" to use the component, and looks something like this:

// Description of the props schema so Tambo knows how to use it:
export const guitarTabsSchema = z.object({
  columns: z
    .array(
      z.object({
        label: z
          .string()
          .describe(
            "The label for this column (e.g., chord name like 'C' or 'Am', or position number)"
          ),
        positions: z
          .array(z.number())
          .length(6)
          .describe(
            "An array of exactly 6 fret numbers, one for each string (strings 1-6, where 1 is high string and 6 is low string). Use 0 for open strings, 1-12 for fretted strings, and -1 for unplayed strings."
          ),
      })
    )
    .describe(
      "An array of columns to display horizontally to create guitar tabs."
    ),
  stringLabels: z
    .array(z.string())
    .length(6)
    .describe(
      "An array of exactly 6 string labels, one for each string from high to low (e.g., ['E', 'B', 'G', 'D', 'A', 'E'] for standard tuning). Defaults to standard tuning if not provided."
    ),
  title: z
    .string()
    .describe(
      "The title for the guitar tabs to display to the user. Maybe a song name, or a chord progression name."
    ),
});
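To make the schema concrete, here is the kind of props object the model might generate for a simple C to G change (the values are illustrative, not captured app output):

// Illustrative example of props the LLM might generate for this schema.
// Positions are ordered high E to low E; -1 means "don't play this string".
const exampleProps = {
  title: "A simple C - G progression",
  stringLabels: ["E", "B", "G", "D", "A", "E"],
  columns: [
    { label: "C", positions: [0, 1, 0, 2, 3, -1] },
    { label: "G", positions: [3, 0, 0, 0, 2, 3] },
  ],
};

// Validates against the schema above:
guitarTabsSchema.parse(exampleProps);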

Step 3: The Agent Loop with Tambo

In my main page, I just drop in the interactable component along with Tambo's pre-built components for sending and showing messages (a rough sketch of this setup follows the list below). Tambo handles the rest. When I type "Show me a blues chord progression in E," Tambo:

  1. Sends my message to its API.
  2. Analyzes my intent.
  3. Decides the best way to respond is to use the GuitarTabs component.
  4. Generates the props for a blues progression.
  5. Renders the component with those props.

As a user, I just see the tab component update.
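
Here is a rough sketch of that page wiring. This is my reading of the pattern, not the repo's exact code: TamboProvider comes from @tambo-ai/react, while MessageThread is a stand-in name for whichever pre-built chat components Tambo ships.

// src/app/page.tsx (illustrative sketch, not the repo's exact code)
"use client";
import { TamboProvider } from "@tambo-ai/react";
import { InteractableGuitarTabs } from "@/components/guitar-tabs";
// Stand-in name for Tambo's pre-built message UI:
import { MessageThread } from "@/components/message-thread";

export default function Home() {
  return (
    <TamboProvider apiKey={process.env.NEXT_PUBLIC_TAMBO_API_KEY!}>
      {/* The AI fills in and updates this component's props */}
      <InteractableGuitarTabs />
      {/* Pre-built UI for sending messages and streaming replies */}
      <MessageThread />
    </TamboProvider>
  );
}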

What's Next?

I'm planning to add:

  1. Audio Playback: When I see a chord progression, it would be nice to hear what it should sound like, so I can check whether I'm playing it correctly (see the sketch after this list).
  2. "Duolingo" Mode: Instead of me asking for chords, I want the app to track my progress and suggest new chords based on what I've already learned.

Try it out

The project is open source. Let me know what feature I should add next, or contribute yourself!

🔗 Repo: https://github.com/MichaelMilstead/ai-guitar-tutor
