If you’ve built anything with OpenAI or other LLMs, chances are your prompts live as strings in your codebase:
```python
from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-5",
    input="Write a short bedtime story about a unicorn."
)
print(response.output_text)
```
This works fine. Everything is in one place, and you don’t need to think twice about it.
But this example is a single one-line prompt. Most production apps use dozens of prompts, sometimes hundreds, and some stretch over hundreds of lines, with parameters and variations.
## The Problem: Prompt Hell

Inlining prompts directly into your code doesn't scale:

- Files become filled with long blocks of text.
- Every tweak means hunting down the right string, editing it, then rebuilding and redeploying.
- Larger prompts with multiple parameters quickly become unreadable.
- Copy-pasting across repos or services introduces errors.
You end up with what many call prompt hell — messy, time-consuming, and error-prone.
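To make the pain concrete, here's a sketch of what an inlined, multi-parameter prompt tends to look like. Everything here (the prompt text, the names `SUMMARIZE_PROMPT` and `build_prompt`) is invented for illustration:

```python
# A support-bot prompt inlined as a Python string. Every wording tweak
# means editing this file and redeploying the app.
SUMMARIZE_PROMPT = """You are a support assistant for {product}.
Summarize the following ticket in at most {max_sentences} sentences.
Tone: {tone}

Ticket:
{ticket_text}
"""

def build_prompt(product, max_sentences, tone, ticket_text):
    # A typo in any placeholder name only fails here, at runtime.
    return SUMMARIZE_PROMPT.format(
        product=product,
        max_sentences=max_sentences,
        tone=tone,
        ticket_text=ticket_text,
    )

prompt = build_prompt("Acme CRM", 3, "friendly", "Login button does nothing.")
```

One template like this is manageable; fifty of them scattered across modules is where the hell begins.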
## A Naive Fix (and Its Limits)

You might try moving prompts into YAML, JSON, or `.env` files. That's better than inlining, but it brings its own pain:

- Manually editing YAML is brittle; one wrong indent breaks everything.
- Validation is poor: missing parameters or wrong types only show up at runtime.
- Collaboration gets awkward when multiple people are editing raw text files.
This is duct tape, not a workflow.
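The runtime-only validation problem is easy to demonstrate. This sketch uses a JSON prompt file instead of YAML to stay dependency-free; the file contents and the `summarize` key are made up:

```python
import json

# Stand-in for a prompts.json file loaded from disk.
raw = '{"summarize": "Summarize this ticket in {max_sentences} sentences:\\n{ticket_text}"}'
prompts = json.loads(raw)

template = prompts["summarize"]

# Nothing checks the template's parameters up front. The missing
# value only surfaces when the prompt is actually rendered:
try:
    template.format(ticket_text="Login button does nothing.")  # forgot max_sentences
except KeyError as exc:
    print(f"Runtime failure: missing parameter {exc}")
```

A schema or type checker would catch this at load time; a bare text file never will.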
## A Better Way: Externalize and Manage Prompts
Prompts deserve the same treatment as code and config: versioned, editable, and testable.
That’s why tools like Dakora exist — lightweight, open source, and built for small teams who don’t want to fight with prompt sprawl.
With Dakora you can:
- Keep prompts in a central vault.
- Edit them in a clean web UI instead of raw YAML.
- Sync changes into your app instantly.
- Back everything with local files under Git for transparency and history.
## Quickstart

Getting started takes just a few commands:

```shell
pip install dakora
dakora init
dakora playground
```
This opens a simple UI playground where you can manage and edit prompts in real time. No redeploys, no fiddly JSON.
## Why It Matters
LLM apps evolve fast. If you’re shipping features, testing variations, or working with a team, you can’t afford to redeploy every time you tweak wording.
Prompt management isn’t just a “nice to have.” It’s the difference between hacking on the weekend and running a reliable product.
## Try It Out
Dakora is open source and ready to use today.
Stop hardcoding prompts. Your team — and your future self — will thank you.