TL;DR
Convert one AI agent file to 10 IDEs in 3 steps: (1) Parse YAML frontmatter with get_field(), get_body(), and to_kebab() bash functions, (2) Transform to tool-specific formats using convert.sh (Claude Code .md, Cursor .mdc, Aider CONVENTIONS.md, Windsurf .windsurfrules), (3) Install to correct paths with install.sh. Write once, convert automatically, deploy everywhere.
One agent file. Ten IDEs. This guide shows how The Agency project converts a single Markdown file to work across Claude Code, Cursor, Aider, Windsurf, GitHub Copilot, and 6+ other tools.
You write an AI agent. Now you want it available in:
- Claude Code (`.md` files in `~/.claude/agents/`)
- Cursor (`.mdc` files in `.cursor/rules/`)
- Aider (single `CONVENTIONS.md` in project root)
- Windsurf (single `.windsurfrules` file)
- GitHub Copilot (`.md` files in `~/.github/agents/`)
- And 5+ more tools
Do you write 10 versions? No. You write once, convert automatically.
The Agency project uses two bash scripts:
- `convert.sh` — Transforms agent files to tool-specific formats
- `install.sh` — Copies converted files to correct paths
This tutorial reverse-engineers both scripts. You'll learn how to parse YAML frontmatter, extract body content, and build conversion pipelines for any tool.
💡 Whether you’re deploying agents for API development workflows with Apidog integration or creating specialized testing agents, the conversion system ensures they work across all your team’s preferred IDEs.
The Agent Format
Each agent in The Agency uses the same structure:
---
name: API Tester
description: "Specialized in API testing with Apidog, Postman, and automated validation"
color: purple
emoji: 🧪
vibe: Breaks APIs before users do.
---
# API Tester Agent Personality
You are **API Tester**, an expert in API validation...
## Identity & Memory
- Role: API testing specialist
- Personality: Thorough, skeptical, evidence-focused
...
The file has two parts:
- Frontmatter — YAML metadata between `---` delimiters
- Body — Markdown content after the second `---`
Conversion means: extract frontmatter fields, transform body to target format, write to correct path.
Step 1: Parse YAML Frontmatter
Create parse-frontmatter.sh:
#!/usr/bin/env bash
#
# parse-frontmatter.sh — Extract YAML frontmatter fields from agent files
#
set -euo pipefail
# Extract a single field value from YAML frontmatter
# Usage: get_field <field> <file>
get_field() {
local field="$1" file="$2"
awk -v f="$field" '
/^---$/ { fm++; next }
fm == 1 && $0 ~ "^" f ": " {
sub("^" f ": ", "");
gsub(/^"|"$/, "");   # strip surrounding quotes, if any
print;
exit
}
' "$file"
}
# Strip frontmatter, return only body
# Usage: get_body <file>
get_body() {
# Skip everything through the second ---; later --- lines (horizontal
# rules in the body) are preserved
awk 'fm >= 2 { print; next } /^---$/ { fm++ }' "$1"
}
# Convert name to kebab-case slug
# Usage: to_kebab "API Tester" → api-tester
to_kebab() {
echo "$1" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9]/-/g' | sed 's/--*/-/g'
}
# Demo
if [[ "${1:-}" == "--demo" ]]; then
AGENT_FILE="${2:-test-agent.md}"
echo "File: $AGENT_FILE"
echo "Name: $(get_field 'name' "$AGENT_FILE")"
echo "Description: $(get_field 'description' "$AGENT_FILE")"
echo "Slug: $(to_kebab "$(get_field 'name' "$AGENT_FILE")")"
echo "---"
echo "Body preview:"
get_body "$AGENT_FILE" | head -10
fi
Test it:
chmod +x parse-frontmatter.sh
./parse-frontmatter.sh --demo engineering-backend-architect.md
Output:
File: engineering-backend-architect.md
Name: Backend Architect
Description: Senior backend architect specializing in scalable system design...
Slug: backend-architect
---
Body preview:
# Backend Architect Agent Personality
You are **Backend Architect**, a senior backend architect...
Step 2: Convert to Claude Code Format
Claude Code uses raw .md files. Just copy:
convert_claude_code() {
local agent_file="$1"
local dest="$HOME/.claude/agents/"
mkdir -p "$dest"
cp "$agent_file" "$dest/"
echo " Claude Code: $(basename "$agent_file")"
}
Step 3: Convert to Cursor Format
Cursor uses `.mdc` files with a `description` frontmatter field:
convert_cursor() {
local agent_file="$1"
local name=$(get_field 'name' "$agent_file")
local description=$(get_field 'description' "$agent_file")
local slug=$(to_kebab "$name")
local body=$(get_body "$agent_file")
local output=".cursor/rules/agency-${slug}.mdc"
mkdir -p "$(dirname "$output")"
cat > "$output" << EOF
---
description: "Agency agent: $description"
---
$body
EOF
echo " Cursor: agency-${slug}.mdc"
}
Input:
---
name: API Tester
description: Specialized in API testing...
---
# API Tester Agent...
Output (.mdc):
---
description: "Agency agent: Specialized in API testing..."
---
# API Tester Agent...
Step 4: Convert to Aider Format
Aider uses a single CONVENTIONS.md file for all agents:
convert_aider() {
local agent_file="$1"
local output="CONVENTIONS.md"
# Append with separator
echo "" >> "$output"
echo "---" >> "$output"
echo "" >> "$output"
cat "$agent_file" >> "$output"
echo " Aider: appended to $output"
}
Build the file:
build_aider() {
local output="CONVENTIONS.md"
echo "# Agency Agents for Aider" > "$output"
echo "" >> "$output"
echo "This file contains all Agency agents for Aider integration." >> "$output"
echo "" >> "$output"
for agent_file in engineering/*.md design/*.md testing/*.md; do
convert_aider "$agent_file"
done
}
Step 5: Convert to Windsurf Format
Windsurf uses a single .windsurfrules file:
convert_windsurf() {
local agent_file="$1"
local output=".windsurfrules"
echo "" >> "$output"
echo "---" >> "$output"
echo "" >> "$output"
cat "$agent_file" >> "$output"
echo " Windsurf: appended to $output"
}
Step 6: Convert to Antigravity Format
Antigravity (Gemini) expects SKILL.md files in subdirectories:
convert_antigravity() {
local agent_file="$1"
local name=$(get_field 'name' "$agent_file")
local slug=$(to_kebab "$name")
local output="integrations/antigravity/skills/agency-${slug}/SKILL.md"
mkdir -p "$(dirname "$output")"
cat > "$output" << EOF
# Agency Agent: $name
$(get_body "$agent_file")
EOF
echo " Antigravity: agency-${slug}/SKILL.md"
}
Step 7: Convert to OpenClaw Format
OpenClaw uses three files per agent (SOUL.md, AGENTS.md, IDENTITY.md):
convert_openclaw() {
local agent_file="$1"
local name=$(get_field 'name' "$agent_file")
local description=$(get_field 'description' "$agent_file")
local slug=$(to_kebab "$name")
local body=$(get_body "$agent_file")
local output_dir="integrations/openclaw/agency-${slug}"
mkdir -p "$output_dir"
# SOUL.md - Main agent definition
cat > "$output_dir/SOUL.md" << EOF
# $name
$description
---
$body
EOF
# AGENTS.md - Agent capabilities
cat > "$output_dir/AGENTS.md" << EOF
# Agent Capabilities: $name
- Specialized expertise in domain
- Deliverable-focused output
- Success metrics defined
See SOUL.md for full definition.
EOF
# IDENTITY.md - Agent identity
cat > "$output_dir/IDENTITY.md" << EOF
# Identity: $name
- Name: $name
- Description: $description
- Source: The Agency (agency-agents repo)
EOF
echo " OpenClaw: agency-${slug}/"
}
Step 8: The Full convert.sh Script
Combine the above into convert.sh:
#!/usr/bin/env bash
#
# convert.sh — Convert all Agency agents to tool-specific formats
#
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$SCRIPT_DIR/.." && pwd)"
OUT_DIR="$REPO_ROOT/integrations"
# Frontmatter helpers
get_field() {
local field="$1" file="$2"
awk -v f="$field" '
/^---$/ { fm++; next }
fm == 1 && $0 ~ "^" f ": " { sub("^" f ": ", ""); gsub(/^"|"$/, ""); print; exit }
' "$file"
}
get_body() {
awk 'fm >= 2 { print; next } /^---$/ { fm++ }' "$1"
}
to_kebab() {
echo "$1" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9]/-/g' | sed 's/--*/-/g'
}
# Conversion functions
convert_claude_code() {
local agent_file="$1"
local dest="$OUT_DIR/claude-code/"
mkdir -p "$dest"
cp "$agent_file" "$dest/"
}
convert_cursor() {
local agent_file="$1"
local name=$(get_field 'name' "$agent_file")
local slug=$(to_kebab "$name")
local body=$(get_body "$agent_file")
mkdir -p "$OUT_DIR/cursor/.cursor/rules/"
cat > "$OUT_DIR/cursor/.cursor/rules/agency-${slug}.mdc" << EOF
---
description: Agency agent: $(get_field 'description' "$agent_file")
---
$body
EOF
}
convert_aider() {
local agent_file="$1"
local output="$OUT_DIR/aider/CONVENTIONS.md"
echo "" >> "$output"
echo "---" >> "$output"
echo "" >> "$output"
cat "$agent_file" >> "$output"
}
convert_windsurf() {
local agent_file="$1"
local output="$OUT_DIR/windsurf/.windsurfrules"
echo "" >> "$output"
echo "---" >> "$output"
echo "" >> "$output"
cat "$agent_file" >> "$output"
}
# Main conversion loop
echo "Converting Agency agents..."
AGENT_DIRS=(engineering design testing marketing sales)
for dir in "${AGENT_DIRS[@]}"; do
for agent_file in "$REPO_ROOT/$dir"/*.md; do
[[ -f "$agent_file" ]] || continue
name=$(get_field 'name' "$agent_file")
echo "Processing: $name"
convert_claude_code "$agent_file"
convert_cursor "$agent_file"
done
done
# Build combined files
mkdir -p "$OUT_DIR/aider"
echo "# Agency Agents for Aider" > "$OUT_DIR/aider/CONVENTIONS.md"
for dir in "${AGENT_DIRS[@]}"; do
for agent_file in "$REPO_ROOT/$dir"/*.md; do
[[ -f "$agent_file" ]] || continue
convert_aider "$agent_file"
done
done
mkdir -p "$OUT_DIR/windsurf"
echo "# Agency Agents for Windsurf" > "$OUT_DIR/windsurf/.windsurfrules"
for dir in "${AGENT_DIRS[@]}"; do
for agent_file in "$REPO_ROOT/$dir"/*.md; do
[[ -f "$agent_file" ]] || continue
convert_windsurf "$agent_file"
done
done
echo "Conversion complete!"
echo " Claude Code: $OUT_DIR/claude-code/"
echo " Cursor: $OUT_DIR/cursor/.cursor/rules/"
echo " Aider: $OUT_DIR/aider/CONVENTIONS.md"
echo " Windsurf: $OUT_DIR/windsurf/.windsurfrules"
Run it:
chmod +x convert.sh
./convert.sh
Step 9: Install to Each Tool
After conversion, copy files to tool-specific paths:
#!/usr/bin/env bash
#
# install.sh — Install converted agents to your local tools
#
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$SCRIPT_DIR/.." && pwd)"
# Claude Code
install_claude_code() {
local src="$REPO_ROOT/integrations/claude-code/"
local dest="$HOME/.claude/agents/"
mkdir -p "$dest"
cp "$src"/*.md "$dest/"
echo "Claude Code: $(find "$dest" -name '*.md' | wc -l) agents installed"
}
# Cursor
install_cursor() {
local src="$REPO_ROOT/integrations/cursor/.cursor/rules/"
local dest="./.cursor/rules/"
mkdir -p "$dest"
cp "$src"/*.mdc "$dest/"
echo "Cursor: $(find "$dest" -name '*.mdc' | wc -l) rules installed"
}
# Aider
install_aider() {
local src="$REPO_ROOT/integrations/aider/CONVENTIONS.md"
local dest="./CONVENTIONS.md"
cp "$src" "$dest"
echo "Aider: CONVENTIONS.md installed"
}
# Windsurf
install_windsurf() {
local src="$REPO_ROOT/integrations/windsurf/.windsurfrules"
local dest="./.windsurfrules"
cp "$src" "$dest"
echo "Windsurf: .windsurfrules installed"
}
# Install all detected tools
install_all() {
if [[ -d "$HOME/.claude/agents/" ]]; then
install_claude_code
fi
if command -v cursor &>/dev/null || [[ -d "./.cursor/" ]]; then
install_cursor
fi
if command -v aider &>/dev/null; then
install_aider
fi
if command -v windsurf &>/dev/null || [[ -f "./.windsurfrules" ]]; then
install_windsurf
fi
}
install_all
Format Comparison
| Tool | Format | Scope | Conversion |
|---|---|---|---|
| Claude Code | `.md` | User-wide (`~/.claude/agents/`) | Copy as-is |
| Cursor | `.mdc` | Project (`.cursor/rules/`) | Add description frontmatter |
| Aider | `CONVENTIONS.md` | Project root | Concatenate all agents |
| Windsurf | `.windsurfrules` | Project root | Concatenate all agents |
| GitHub Copilot | `.md` | User-wide (`~/.github/agents/`) | Copy as-is |
| Antigravity | `SKILL.md` | User-wide (`~/.gemini/antigravity/`) | Wrap in skill directory |
| OpenClaw | `SOUL.md` + others | User-wide (`~/.openclaw/`) | Split into 3 files |
| Gemini CLI | Extension | User-wide (`~/.gemini/extensions/`) | Generate manifest + skills |
| OpenCode | `.md` | Project (`.opencode/agents/`) | Copy as-is |
| Qwen Code | `.md` | Project (`.qwen/agents/`) | Copy as SubAgent |
Build Your Own Conversion Script
Template for new tools:
#!/usr/bin/env bash
# 1. Define conversion function
convert_your_tool() {
local agent_file="$1"
local name=$(get_field 'name' "$agent_file")
local description=$(get_field 'description' "$agent_file")
local slug=$(to_kebab "$name")
local body=$(get_body "$agent_file")
# 2. Create output path
local output="path/to/your/tool/agency-${slug}.ext"
mkdir -p "$(dirname "$output")"
# 3. Write converted content
cat > "$output" << EOF
# Your tool-specific format
# Use: $name, $description, $body
EOF
echo " YourTool: agency-${slug}.ext"
}
# 4. Add to main loop
for agent_file in engineering/*.md; do
convert_your_tool "$agent_file"
done
What You Built
| Component | Purpose |
|---|---|
| `get_field()` | Extract YAML frontmatter values |
| `get_body()` | Strip frontmatter, return markdown body |
| `to_kebab()` | Convert names to URL-safe slugs |
| `convert_cursor()` | Transform to `.mdc` format |
| `convert_aider()` | Concatenate to single file |
| `convert_windsurf()` | Concatenate to single file |
| `convert_antigravity()` | Create skill directories |
| `convert_openclaw()` | Split into 3 files per agent |
| `install.sh` | Copy to tool-specific paths |
Next Steps
Extend the scripts:
- Add parallel conversion with `xargs -P` or GNU parallel
- Add validation (check for required frontmatter fields)
- Add dry-run mode (`--dry-run` flag)
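A dry-run mode can be sketched with a small command wrapper — the `run()` helper below is hypothetical, not part of the real convert.sh; the idea is to route every side-effecting command through it:

```bash
#!/usr/bin/env bash
# Sketch of a --dry-run mode. run() is a hypothetical wrapper: it prints
# the command instead of executing it when dry-run is active.
set -euo pipefail

DRY_RUN=0
if [[ "${1:-}" == "--dry-run" ]]; then
  DRY_RUN=1
fi

# run — execute the command, or just print it when dry-run is active
run() {
  if [[ "$DRY_RUN" -eq 1 ]]; then
    echo "[dry-run] $*"
  else
    "$@"
  fi
}

# Demo: force dry-run so nothing touches the filesystem
DRY_RUN=1
run mkdir -p integrations/cursor
run cp agent.md integrations/cursor/
```

With the wrapper in place, adding dry-run support to a conversion function is just a matter of prefixing its `mkdir`, `cp`, and redirection steps with `run`.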
Add more tools:
- VS Code extensions
- JetBrains IDEs
- Custom internal tools
Optimize for large repos:
- Cache parsed frontmatter
- Use `find` with `-print0` for safe file handling
- Add progress bars for 100+ agents
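The `-print0` pattern pairs with a null-delimited read loop. This sketch uses process substitution so the loop runs in the current shell and counters survive; the search path is illustrative:

```bash
#!/usr/bin/env bash
# Null-delimited traversal: safe even when agent filenames contain
# spaces or newlines.
set -euo pipefail

count=0
while IFS= read -r -d '' agent_file; do
  count=$((count + 1))
  echo "Found: $agent_file"
done < <(find . -maxdepth 2 -name '*.md' -type f -print0)
echo "Total: $count"
```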
Troubleshooting Common Issues
Conversion script fails with "bad substitution":
- Ensure `#!/usr/bin/env bash` is the first line
- Check bash version: `bash --version` (should be 4.0+)
- Run with bash: `bash convert.sh`
- Fix Windows line endings: `sed -i 's/\r$//' convert.sh`
Frontmatter fields not being extracted:
- YAML fields must use `: ` (colon + space) after the field name
- No extra spaces before field names
- Frontmatter delimiters must be exactly `---` (3 dashes)
- Test parsing: `./parse-frontmatter.sh --demo agent.md`
Slug generation creates broken names:
- Test `to_kebab()` with edge cases
- Handle special chars: `to_kebab() { echo "$1" | iconv -f utf8 -t ascii//translit | ... }`
- Fallback for empty slugs: `[[ -z "$slug" ]] && slug="unknown-agent"`
- Log original names for debugging
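A hardened `to_kebab` combining those fixes might look like this — a sketch; `iconv` transliteration output varies by platform, which is why the raw name is kept as a fallback:

```bash
#!/usr/bin/env bash
# Hardened slug generation: transliterate to ASCII when possible, collapse
# and trim dashes, and never return an empty slug.
set -euo pipefail

to_kebab() {
  local ascii slug
  # Best-effort transliteration; fall back to the raw name if iconv balks
  ascii=$(printf '%s' "$1" | iconv -f utf-8 -t ascii//TRANSLIT 2>/dev/null) || ascii="$1"
  slug=$(printf '%s' "$ascii" \
    | tr '[:upper:]' '[:lower:]' \
    | sed 's/[^a-z0-9]/-/g; s/--*/-/g; s/^-//; s/-$//')
  if [[ -z "$slug" ]]; then
    slug="unknown-agent"
  fi
  printf '%s\n' "$slug"
}

to_kebab "API Tester"   # → api-tester
to_kebab "!!!"          # → unknown-agent
```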
Cursor rules not loading:
- `.mdc` files need valid frontmatter with `description`
- Check `.cursor/mcp.json`
- Files go in `.cursor/rules/`
- Restart Cursor after adding new rules
Aider CONVENTIONS.md becomes too large:
- Split by category: `CONVENTIONS-engineering.md`, etc.
- Prune deprecated agents automatically
- Add a table of contents at the top
- Consider per-agent files with include directives
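Splitting by category can reuse the same concatenation logic — a sketch assuming the category directories sit in the current directory:

```bash
#!/usr/bin/env bash
# Build one CONVENTIONS file per category instead of a single monolith.
set -euo pipefail

for category in engineering design testing; do
  output="CONVENTIONS-${category}.md"
  echo "# Agency Agents: ${category}" > "$output"
  for agent_file in "$category"/*.md; do
    [[ -f "$agent_file" ]] || continue   # skip unexpanded globs
    printf '\n---\n\n' >> "$output"
    cat "$agent_file" >> "$output"
  done
  echo "Built $output"
done
```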
Performance Optimization for Large Conversions
Parallel Processing:
Use GNU parallel for 100+ agents:
#!/usr/bin/env bash
# convert-parallel.sh — assumes the helper and conversion functions from
# convert.sh are defined (or sourced) before this point
set -euo pipefail
REPO_ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
export OUT_DIR="$REPO_ROOT/integrations"
# Export functions so the bash subshells GNU parallel spawns can see them
export -f get_field get_body to_kebab convert_cursor convert_claude_code
# Find and convert in parallel
find "$REPO_ROOT" -name "*.md" -type f | \
parallel -j 8 --progress '
name=$(get_field "name" {})
slug=$(to_kebab "$name")
echo "Converting: $name"
convert_cursor "{}"
convert_claude_code "{}"
'
echo "Parallel conversion complete!"
Incremental Conversion:
Only convert changed files:
#!/usr/bin/env bash
# convert-incremental.sh
CACHE_FILE="$REPO_ROOT/.conversion-cache"
# Load previous state
declare -A PREV_HASHES NEW_HASHES
if [[ -f "$CACHE_FILE" ]]; then
while IFS='=' read -r file hash; do
PREV_HASHES["$file"]="$hash"
done < "$CACHE_FILE"
fi
# Process each agent
for agent_file in engineering/*.md; do
CURRENT_HASH=$(md5sum "$agent_file" | cut -d' ' -f1)
PREV_HASH="${PREV_HASHES[$agent_file]:-}"
if [[ "$CURRENT_HASH" != "$PREV_HASH" ]]; then
echo "Changed: $agent_file"
convert_cursor "$agent_file"
convert_claude_code "$agent_file"
NEW_HASHES["$agent_file"]="$CURRENT_HASH"
else
echo "Unchanged: $agent_file"
NEW_HASHES["$agent_file"]="$CURRENT_HASH"   # keep unchanged files in the cache
fi
done
# Save cache
for file in "${!NEW_HASHES[@]}"; do
echo "$file=${NEW_HASHES[$file]}"
done > "$CACHE_FILE"
Progress Tracking:
Add a simple progress bar:
#!/usr/bin/env bash
set -euo pipefail
shopt -s globstar nullglob   # ** recursion needs bash 4+ globstar
total_files=$(find "$REPO_ROOT" -name "*.md" -type f | wc -l)
current=0
for agent_file in "$REPO_ROOT"/**/*.md; do
((current++))
percent=$((current * 100 / total_files))
# Progress bar
filled=$((percent / 5))
empty=$((20 - filled))
bar=$(printf '%*s' "$filled" | tr ' ' '#')
spaces=$(printf '%*s' "$empty")
name=$(get_field 'name' "$agent_file")
echo -ne "\r[${bar}${spaces}] ${percent}% - $name"
convert_cursor "$agent_file"
done
echo -ne "\n"
Security Considerations for Shared Agents
Validating Agent Sources:
When downloading agents from external sources:
#!/usr/bin/env bash
# validate-agent.sh
validate_agent() {
local file="$1"
# Check required frontmatter fields
local name=$(get_field 'name' "$file")
local description=$(get_field 'description' "$file")
if [[ -z "$name" ]]; then
echo "ERROR: Missing 'name' field in $file"
return 1
fi
if [[ -z "$description" ]]; then
echo "WARNING: Missing 'description' field in $file"
fi
# Check for suspicious patterns in the body (a crude heuristic, not a sandbox)
local body=$(get_body "$file")
if echo "$body" | grep -Eq 'rm -rf|curl|wget|eval|exec'; then
echo "ERROR: Potentially dangerous patterns in $file"
return 1
fi
echo "VALID: $name"
return 0
}
Sandboxing Agent Execution:
For untrusted agents, run in isolated environments:
- Use Docker containers for agent execution
- Limit file system access with read-only mounts
- Restrict network access to specific domains
- Log all agent actions for audit trails
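A locked-down `docker run` invocation might look like this — a sketch; the image name and mount paths are placeholders, not real Agency artifacts:

```bash
#!/usr/bin/env bash
# Run untrusted agent tooling in an isolated container:
#   --network none        : no network access at all
#   --read-only + --tmpfs : read-only root filesystem with a scratch /tmp
#   :ro volume            : agents cannot be modified from inside
#   --memory / --cpus     : resource limits
docker run --rm \
  --network none \
  --read-only --tmpfs /tmp \
  -v "$PWD/agents:/agents:ro" \
  --memory 512m --cpus 1 \
  untrusted-agent-image:latest \
  cat /agents/api-tester.md
```

For finer-grained network policy (allowing only specific domains), a proxy sidecar or firewall rules are needed on top of this; Docker's `--network` flag is all-or-nothing.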
One agent file. Ten IDEs. Two bash scripts.
That's conversion automation. Write once, convert automatically, install everywhere.
Your turn: add conversion support for your favorite AI tool. Share the script. Make agents portable.
Key Takeaways
- Write once, convert to 10+ formats — A single Markdown file with YAML frontmatter transforms to Claude Code, Cursor, Aider, Windsurf, and 6+ other tools
- Bash parsing handles frontmatter extraction — `get_field()` extracts YAML values, `get_body()` strips frontmatter, `to_kebab()` creates URL-safe slugs
- Tool-specific formats require different transforms — Claude Code copies as-is, Cursor adds description frontmatter, Aider/Windsurf concatenate all agents
- Install scripts copy to correct paths — User-wide tools use `~/.claude/agents/`; project tools use `.cursor/rules/` or project-root files
- Extend with templates for new tools — Define a `convert_your_tool()` function, add it to the main loop, document format requirements
FAQ
What is convert.sh and how does it work?
convert.sh is a bash script that parses YAML frontmatter from agent Markdown files, extracts the body content, and transforms each agent to tool-specific formats. It uses awk for parsing, sed for slug conversion, and heredocs for output generation.
How does frontmatter parsing work in bash?
The get_field() function uses awk to track frontmatter delimiters (---), finds the line matching the field name, and extracts the value. get_body() prints all lines after the second --- delimiter.
Which IDEs and tools are supported?
Claude Code (.md), Cursor (.mdc), Aider (CONVENTIONS.md), Windsurf (.windsurfrules), GitHub Copilot (.md), Antigravity (SKILL.md), OpenClaw (SOUL.md + 2 files), Gemini CLI extensions, OpenCode, and Qwen Code.
How do I add conversion support for a new tool?
Create a convert_yourtool() function that extracts frontmatter fields, transforms the body to your tool’s format, and writes to the correct path. Add the function call to the main conversion loop.
Can I run conversions in parallel for faster processing?
Yes. Use xargs -P or GNU parallel to process multiple agent files simultaneously. For 100+ agents, parallel conversion can reduce runtime from minutes to seconds.
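An `xargs -P` fan-out can be sketched like this — a plain `echo` stands in for a real per-file conversion command, and `-r` (skip empty input) assumes GNU xargs:

```bash
#!/usr/bin/env bash
# Fan conversions out across 4 workers: -0 pairs with find -print0,
# -n 1 passes one file per invocation, -r skips runs with empty input.
set -euo pipefail

find . -maxdepth 1 -name '*.md' -type f -print0 \
  | xargs -0 -r -n 1 -P 4 bash -c 'echo "Converting: $0"'
```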
How do I validate that frontmatter fields exist?
Add validation checks in your conversion function: [[ -z "$name" ]] && echo "Missing name field" && exit 1. Run validation before writing output files.
What if conversion fails for some agents?
Use set -euo pipefail to fail fast on errors. Add error handling with || continue to skip broken files. Log failures to a separate file for debugging.
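A minimal skip-and-log pattern looks like this — `convert_one` is a hypothetical stand-in for the real per-tool conversion calls:

```bash
#!/usr/bin/env bash
# Keep going when one agent is broken; record failures for later review.
set -euo pipefail

FAILED_LOG="conversion-failures.log"
: > "$FAILED_LOG"

convert_one() {
  # Stand-in check: "conversion" fails for files missing a name field
  grep -q '^name:' "$1"
}

for agent_file in ./*.md; do
  [[ -f "$agent_file" ]] || continue
  if ! convert_one "$agent_file"; then
    echo "$agent_file" >> "$FAILED_LOG"
    continue   # skip the broken file, keep converting the rest
  fi
done
echo "Failures logged: $(wc -l < "$FAILED_LOG")"
```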