Prompt engineering today is mostly string manipulation. BAML turns it into type-safe function definitions.
## What Is BAML?
BAML (Basically, A Made-Up Language) is a DSL for defining LLM functions with typed inputs and outputs. You define the interface; BAML generates the client code and handles prompt rendering, output parsing, retries, and validation.
```baml
// extract_resume.baml
class Resume {
  name string
  email string
  skills string[]
  experience Experience[]
}

class Experience {
  company string
  role string
  years int
}

function ExtractResume(resume_text: string) -> Resume {
  client GPT4o
  prompt #"
    Extract structured data from this resume:
    {{ resume_text }}

    {{ ctx.output_format }}
  "#
}
```
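The function above references a `GPT4o` client, which is defined separately in a BAML file. Here is a minimal sketch of what that definition could look like; the client and policy names are illustrative, and the model string and environment variable are assumptions:

```baml
// clients.baml — illustrative client definition
retry_policy Exponential {
  max_retries 2
  strategy {
    type exponential_backoff
  }
}

client<llm> GPT4o {
  provider openai
  retry_policy Exponential
  options {
    model "gpt-4o"
    api_key env.OPENAI_API_KEY
  }
}
```

Any function can point at this client by name, and swapping providers is a one-line change in the client block rather than a change to every call site.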
```bash
# Generate client code
npx @boundaryml/baml-cli generate
```
```typescript
// Auto-generated, fully typed client
import { b } from './baml_client'

const resume = await b.ExtractResume(
  "John Doe, john@example.com, 5 years at Google as Staff Engineer, skills: Python, Go, K8s"
)

console.log(resume.name)       // "John Doe" — typed as string
console.log(resume.skills)     // ["Python", "Go", "K8s"] — typed as string[]
console.log(resume.experience) // [{company: "Google", role: "Staff Engineer", years: 5}]
```
## Why BAML
- Type-safe — LLM outputs are validated against your schema
- Resilient parsing — BAML's schema-aligned parser repairs malformed output, and retry policies cover transient failures
- Any LLM — OpenAI, Anthropic, Gemini, Ollama, and more
- IDE support — VS Code extension with a live prompt preview
- Testing — unit test your LLM functions with fixtures
- Streaming — partial typed objects as they generate
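The testing bullet refers to BAML's built-in `test` blocks, which run a function against fixture inputs from the VS Code playground or the CLI. A short sketch against the `ExtractResume` function above; the sample resume text is made up:

```baml
// Runs ExtractResume against a fixture input
test SimpleResume {
  functions [ExtractResume]
  args {
    resume_text #"
      Jane Smith, jane@example.com
      3 years at Acme as Backend Engineer
      Skills: Rust, Postgres
    "#
  }
}
```

Because tests live next to the function definition, changing a prompt and re-running fixtures becomes a tight edit-test loop rather than ad-hoc copy-pasting into a chat window.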
```bash
pip install baml-py           # Python
npm install @boundaryml/baml  # TypeScript
```
Building AI applications? Check out my AI tools or email spinov001@gmail.com.