What I built
Sloth Space is an AI-native workspace with three modes — slides, docs, and sheets. You describe what you need in natural language, and the AI drafts it in seconds.
Source: github.com/enigmaticsloth/sloth-space
The interesting part: unified architecture
The three modes aren't three separate apps bolted together. They share one state object, one intent router, one context menu system, one save pipeline, and one interaction model (single-click select, double-click edit). The AI sees the entire app as one system, not three.
~30 ES module files, pure vanilla JS, zero build step, zero frameworks.
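The single-state, single-interaction-model idea can be sketched in a few lines. This is an illustrative mock, not the project's actual schema — the field names and `handleClick` helper are assumptions made for the example:

```javascript
// Hypothetical sketch: all three modes read and write one plain object,
// so every feature (save pipeline, AI actions, selection) sees the whole app.
const S = {
  mode: "docs",        // "slides" | "docs" | "sheets"
  projects: {},        // projectId -> { name, fileIds: [] }
  files: {},           // fileId   -> { type, name, content }
  selection: null,     // id of the currently selected element
  editing: null,       // id of the element in edit mode
};

// One interaction model shared by every mode:
// single-click selects, double-click enters edit mode.
function handleClick(id, isDouble) {
  if (isDouble) {
    S.editing = id;
  } else {
    S.selection = id;
    S.editing = null;
  }
}
```

Because there is exactly one state shape, the AI (and the save pipeline) never has to special-case which of the three modes it is operating on.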
The AI controls the UI
This is the part I'm most excited about. The LLM doesn't just generate text — it operates the application interface directly.
27 whitelisted JS functions are exposed to the model. It can switch modes, create projects, organize files, link documents together, and navigate — all from a single chat input.
Try typing: "Create a project called Q2, write a budget report and put it in there"
The system runs this as a multi-step sequence:
- Create project "Q2"
- Switch to doc mode
- Generate budget report
- Auto-link the doc to the project
A Monet-orange overlay shows each step in real time.
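The multi-step sequence above can be pictured as a plan of function calls that the router emits and a small loop executes. The `fn`/`args` schema and the function names here are hypothetical — they only illustrate the sequencing idea:

```javascript
// Hypothetical action plan for the "Q2" request above.
const plan = [
  { fn: "createProject", args: { name: "Q2" } },
  { fn: "switchMode",    args: { mode: "docs" } },
  { fn: "generateDoc",   args: { title: "Budget report" } },
  { fn: "linkToProject", args: { project: "Q2", file: "Budget report" } },
];

// Run each step in order, announcing it first — which is what lets a
// step-by-step overlay render progress as the plan executes.
function runPlan(plan, registry, onStep) {
  for (const step of plan) {
    onStep(step.fn);
    registry[step.fn](step.args);
  }
}
```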
How it works under the hood
- An intent router (Llama 8B on Groq free tier) classifies every user input into one of 11 intents
- The router returns JSON with function names + arguments
- `executeUIActions()` does whitelist check → schema validation → execution
- `resolveActionRefs()` handles fuzzy name-to-ID resolution ("open that file" finds the right file)
- Destructive actions require user confirmation before execution
- Context memory lets the AI resolve "that file" or "open it" from recent actions
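A minimal sketch of the whitelist → schema → execution pipeline described above. The registry entries and the schema format are assumptions for illustration, not the project's real code:

```javascript
// Only functions registered here are callable by the model.
const UI_ACTIONS = {
  switchMode: {
    schema: { mode: ["slides", "docs", "sheets"] }, // allowed values per arg
    run: (args) => ({ ok: true, mode: args.mode }),
  },
};

function executeUIAction(name, args) {
  // 1. Whitelist check: unknown function names are rejected outright.
  const action = UI_ACTIONS[name];
  if (!action) return { ok: false, error: `"${name}" is not whitelisted` };

  // 2. Schema validation: every argument must match its allowed values.
  for (const [key, allowed] of Object.entries(action.schema)) {
    if (!allowed.includes(args[key])) {
      return { ok: false, error: `invalid ${key}: ${args[key]}` };
    }
  }

  // 3. Execution: only reached after both checks pass.
  return action.run(args);
}
```

The point of the registry is that the model never gets `eval`-like power — it can only name functions the app has explicitly chosen to expose.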
Cross-file context injection
Link files to a project, and the AI reads ALL of them when generating new content.
- "Summarize this project" → reads every linked doc and sheet
- "Create slides from the research" → cross-references all project files
- Mention any file by name in chat → AI pulls it into the prompt automatically
No copy-pasting between files. No manual context-setting.
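The injection step can be sketched as: before generating, collect every file linked to the project and inline it into the prompt. The data shapes and `buildPrompt` helper are hypothetical:

```javascript
// Example data — shapes are assumptions for this sketch.
const files = {
  f1: { name: "research.md",  content: "Key finding: ..." },
  f2: { name: "budget.sheet", content: "Q2 total: 12000" },
};
const project = { name: "Q2", fileIds: ["f1", "f2"] };

// Inline every linked file into the prompt so the model sees the whole
// project without any manual copy-pasting.
function buildPrompt(userRequest, project, files) {
  const context = project.fileIds
    .map((id) => `## ${files[id].name}\n${files[id].content}`)
    .join("\n\n");
  return `Project "${project.name}" context:\n${context}\n\nTask: ${userRequest}`;
}
```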
Tech stack
- Pure vanilla JS ES modules — no React, no Vue, no bundler
- All state in one `S` object (no state management library)
- All module exports auto-bound to `window` via an `Object.entries` loop
- CSS split into 14 module files
- Supabase for optional cloud sync + GitHub OAuth
- BYO API key — works with Groq, OpenAI, Grok, Ollama, Claude
- PPTX export, 5 slide themes, 18 spreadsheet formulas
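The auto-binding trick from the stack list above can be sketched like this. It uses `globalThis` so the snippet also runs outside a browser; in the app this would be `window`, and the example exports are made up:

```javascript
// Hypothetical module exports, standing in for a real ES module's exports.
const exportsOfModule = {
  openFile:  (id) => `opened ${id}`,
  closeFile: (id) => `closed ${id}`,
};

// Bind every export onto the global scope so inline HTML handlers (and
// whitelisted AI actions) can call functions by name, with no bundler.
for (const [name, fn] of Object.entries(exportsOfModule)) {
  globalThis[name] = fn;
}
```

The trade-off is a polluted global namespace, but it keeps the zero-build-step setup: the browser loads modules directly and everything is reachable by name.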
Built solo in 4 days
Would love to hear thoughts — especially on the "LLM as UI controller" pattern. Is giving the model direct (whitelisted) control over the interface the right direction, or is there a better way to sandbox it?