Abhay
Why Markdoc for LLM Streaming UI

Every AI chatbot I've built hits the same wall.

The LLM writes beautiful markdown — headings, bold, lists, code blocks. Then someone asks for a chart. Or a form. Or a data table with sortable columns.

Suddenly you need a component rendering layer. And every approach has tradeoffs.

That's why I built mdocUI: a streaming-first generative UI library that lets LLMs mix markdown and interactive components in one output stream.

The Problem

JSON blocks in markdown

Some teams embed JSON in fenced code blocks:

Here's your revenue data:
```json:chart
{"type": "bar", "labels": ["Q1", "Q2", "Q3"], "values": [120, 150, 180]}
```

This works until you're streaming. A JSON object that arrives token-by-token is invalid JSON until the closing brace lands. You either buffer the entire block (killing the streaming experience) or parse incomplete JSON (fragile).
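The failure mode is easy to reproduce. This sketch (the chunk boundaries are invented for illustration) streams the chart payload above in three pieces and tries to parse the accumulating buffer after each one:

```typescript
// Hypothetical token chunks for the chart payload, arriving incrementally.
const chunks = [
  '{"type": "ba',
  'r", "labels": ["Q1", "Q2"',
  ', "Q3"], "values": [120, 150, 180]}',
]

let buffer = ''
const parseable: boolean[] = []
for (const chunk of chunks) {
  buffer += chunk
  try {
    JSON.parse(buffer)
    parseable.push(true)
  } catch {
    parseable.push(false)
  }
}

console.log(parseable) // [false, false, true]
```

Only the final, complete buffer parses, which is exactly why JSON-in-fences forces you to buffer the whole block before rendering anything.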

JSX in markdown

Others have the model emit JSX directly:

Here's your data:

```jsx
<Chart type="bar" labels={["Q1", "Q2", "Q3"]} values={[120, 150, 180]} />
```

Models get confused. They mix HTML attributes with JSX props. They forget to close tags. The < character appears everywhere in normal text, making streaming parsing ambiguous.

Custom DSLs

Some teams invent their own syntax — [[chart:bar:Q1=120,Q2=150]] or similar. Now you're training the model on a format it's never seen, burning tokens on format instructions, and maintaining a custom parser.

Why Markdoc Tag Syntax

Markdoc is a documentation framework created by Stripe. It extends markdown with custom tag delimiters. In this article, I’ll show them as [% %] so Dev.to doesn’t try to parse them as Liquid:

Here's your revenue data:

[% chart type="bar" labels=["Q1","Q2","Q3"] values=[120,150,180] /%]

Revenue grew 12% quarter-over-quarter.

[% button action="continue" label="Show by region" /%]

Three properties make this tag syntax ideal for LLM streaming:

  1. Unambiguous delimiter — the opening sequence is something you would never expect in normal prose, standard markdown, or fenced code blocks. A streaming parser can detect it without lookahead or backtracking.

  2. Models already know it — Markdoc is in training data (Stripe docs, Cloudflare docs). Models write it correctly without extensive format instructions.

  3. Prose and components coexist — no mode switching. The LLM writes markdown and drops components wherever they fit. The parser separates them as tokens arrive.

How mdocUI Works

mdocUI borrows only the tag syntax from Markdoc. We built our own streaming parser from scratch.

Architecture:

LLM tokens → Tokenizer → StreamingParser → ComponentRegistry → Renderer

The tokenizer is a character-by-character state machine with three states: IN_PROSE, IN_TAG, IN_STRING. As tokens arrive, it separates prose from component tags.
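To make the state machine concrete, here is a minimal sketch of the idea (not mdocUI's actual code). It uses the same three state names, with `[%`/`%]` standing in for the real Markdoc delimiters as elsewhere in this article, and it runs over a complete string for brevity, where the real tokenizer consumes characters as they stream in:

```typescript
// A minimal three-state tokenizer sketch: prose vs. tag, with a string
// state so delimiters inside quoted prop values are not misread.
type State = 'IN_PROSE' | 'IN_TAG' | 'IN_STRING'
type Token = { kind: 'prose' | 'tag'; text: string }

function tokenize(input: string): Token[] {
  const tokens: Token[] = []
  let state: State = 'IN_PROSE'
  let buf = ''
  let i = 0
  while (i < input.length) {
    const ch = input[i]
    if (state === 'IN_PROSE') {
      if (ch === '[' && input[i + 1] === '%') {
        if (buf) tokens.push({ kind: 'prose', text: buf })
        buf = ''
        state = 'IN_TAG'
        i += 2
        continue
      }
      buf += ch
    } else if (state === 'IN_TAG') {
      if (ch === '"') {
        state = 'IN_STRING'
        buf += ch
      } else if (ch === '%' && input[i + 1] === ']') {
        tokens.push({ kind: 'tag', text: buf.trim() })
        buf = ''
        state = 'IN_PROSE'
        i += 2
        continue
      } else {
        buf += ch
      }
    } else {
      // IN_STRING: a "%]" inside a quoted prop value is just text
      if (ch === '"') state = 'IN_TAG'
      buf += ch
    }
    i++
  }
  // An unfinished tag stays buffered; a real streaming parser would
  // hold it until more characters arrive.
  if (buf && state === 'IN_PROSE') tokens.push({ kind: 'prose', text: buf })
  return tokens
}
```

Because the opening sequence is two fixed characters, the scanner needs only one character of lookahead and never backtracks, which is the property that makes this viable token-by-token.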

The ComponentRegistry validates tag names and props against Zod schemas. Invalid tags get error boundaries, not crashes.
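The shape of that idea, sketched without dependencies (the real library uses Zod schemas; `validateTag` and the inline validators here are stand-ins, not mdocUI's API):

```typescript
// Each registry entry validates raw tag props and either returns them
// or reports an error the renderer can show in an error boundary.
type ValidationResult =
  | { ok: true; props: Record<string, unknown> }
  | { ok: false; error: string }
type Validator = (props: Record<string, unknown>) => ValidationResult

const registry = new Map<string, Validator>([
  ['chart', (props) => {
    if (props.type !== 'bar' && props.type !== 'line')
      return { ok: false, error: `unknown chart type: ${String(props.type)}` }
    if (!Array.isArray(props.values))
      return { ok: false, error: 'values must be an array' }
    return { ok: true, props }
  }],
])

function validateTag(name: string, props: Record<string, unknown>): ValidationResult {
  const validator = registry.get(name)
  if (!validator) return { ok: false, error: `unknown tag: ${name}` }
  return validator(props)
}
```

A hallucinated tag name or a malformed prop resolves to an error value rather than an exception, so one bad component never takes down the rest of the stream.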

The Renderer maps AST nodes to React components. Every component is theme-neutral — currentColor, inherit, no hardcoded colors. Swap any component with your own.

Getting Started

```shell
pnpm add @mdocui/core @mdocui/react
```
```tsx
import { generatePrompt } from '@mdocui/core'
import {
  createDefaultRegistry,
  defaultGroups,
  Renderer,
  useRenderer,
  defaultComponents,
} from '@mdocui/react'

const registry = createDefaultRegistry()

// Auto-generate system prompt from your component registry
const systemPrompt = generatePrompt(registry, {
  preamble: 'You are a helpful assistant.',
  groups: defaultGroups,
})

// In your React component
function Chat() {
  const { nodes, isStreaming, push, done } = useRenderer({ registry })

  return (
    <Renderer
      nodes={nodes}
      components={defaultComponents}
      isStreaming={isStreaming}
      onAction={(event) => console.log(event)}
    />
  )
}
```

24 components are included: chart, table, stat, card, grid, tabs, form, button, callout, accordion, progress, badge, image, code-block, and more.

What's Next

mdocUI is alpha (0.6.x). The API is stabilizing. We're working on:

  • Vue, Svelte, and Solid renderers
  • Vercel AI SDK useChat bridge
  • Browser devtools for AST inspection
  • VS Code extension for Markdoc-style tag syntax highlighting

Try the playground: https://mdocui.vercel.app
GitHub: https://github.com/mdocui/mdocui
Docs: https://mdocui.github.io

Feedback, issues, and PRs welcome.
