Building a Satirical AI Chaplain: "Our Lord of Lethality" with React, Vercel, and Gemini API
What happens when you cross the military-industrial complex with dominionist religious fervor? You get Our Lord of Lethality v1.
I recently built this satirical web application to explore highly constrained AI prompt generation wrapped in a heavily themed, immersive UI. The app acts as a "UAV / C2 Terminal" that generates militarized prayers by structurally twisting real Bible verses to justify specific pieces of military ordnance.
Here is a breakdown of how the project was architected, the technologies used, and the prompt engineering tricks required to keep the AI from hallucinating scripture.
https://ourlordoflethality.vercel.app
The Tech Stack
I needed a stack that was fast, secure, and capable of handling serverless API routes without managing a dedicated Node.js server.
- Frontend: React (Vite) + Tailwind CSS + Framer Motion
- Backend: Vercel Serverless Functions (`/api/*`)
- AI Inference: `@google/genai` (Gemini 3.1 Flash-Lite)
- Rate Limiting: Upstash Serverless Redis
Crafting the "Terminal" Aesthetic
The entire UI is built to look like a ruggedized military laptop. We threw away default minimalist component libraries in favor of pure Tailwind CSS sorcery to create an immersive, hostile environment.
- CRT Effects: Heavy use of `mix-blend-overlay`, CSS gradients for scanlines, and custom keyframe animations (`crt-flicker`) give the screen a distinct phosphor glow and instability.
- Monospace Typography: Standard `font-mono`, tightly tracked to replicate dot-matrix and early command-line interfaces.
- Thematic Micro-interactions: Buttons don't just "submit"; they "EXECUTE FIRING SOLUTION." Errors aren't "404s"; they are "SATELLITE DOWNLINK FAILED."
By treating the UI as an actual character in the satire, the user is immediately immersed in the world.
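The `crt-flicker` animation can be registered directly in the Tailwind theme. Here is a minimal sketch; the article names the animation but not its parameters, so the timing and opacity values below are illustrative assumptions:

```typescript
// tailwind.config.ts (sketch; keyframe timings and opacities are illustrative)
export default {
  theme: {
    extend: {
      keyframes: {
        "crt-flicker": {
          "0%, 100%": { opacity: "1" },
          "92%": { opacity: "0.96" }, // brief dips read as phosphor instability
          "96%": { opacity: "0.85" },
        },
      },
      animation: {
        // apply with the `animate-crt-flicker` utility class
        "crt-flicker": "crt-flicker 3s infinite",
      },
    },
  },
};
```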
Taming the LLM: Strict Structural Prompting
The core mechanic of the app relies on the Gemini API. Early iterations had a glaring problem: if you ask an LLM to "write a dark prayer based on a Bible verse," it will confidently hallucinate a Bible verse that never existed to make its job easier.
To fix this, we implemented Payload-First Security TDD for the prompt itself:
- Lowered Temperature: We dropped the temperature to `0.4` to reduce hallucination and force deterministic retrieval of real texts.
- Structural Invariants: We updated the system instruction to treat the AI as a find-and-replace algorithm.
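Both knobs live in the request config passed to `@google/genai`'s `generateContent`. A sketch of how that request can be assembled, assuming the current SDK request shape; the model id here is a stand-in, not the project's actual deployment:

```typescript
// Sketch: request config for @google/genai's generateContent.
// Assumption: "gemini-flash-lite" is a stand-in; substitute your deployed model id.
function buildGenerateRequest(prompt: string) {
  return {
    model: "gemini-flash-lite",
    contents: prompt,
    config: {
      temperature: 0.4,                     // low temperature curbs scripture hallucination
      responseMimeType: "application/json", // force a JSON-only reply
    },
  };
}
```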
Here is what the core prompt structure looks like in our serverless backend:
```typescript
// /api/generate.ts
const prompt = `Generate a dark, gothic, satirical dominionist prayer...

CRITICAL INSTRUCTION: You MUST select a REAL, highly-verifiable, and well-known Bible verse first. The customized prayer MUST be a direct, recognizable structural rewriting of this exact verse, mapped to the military context. Do NOT invent or hallucinate the base scripture.

You must return a JSON object with exactly the following fields:
- invocation: The 'Militarized Form'. This MUST strictly mirror the exact cadence, structure, and pacing of the original scripture.
- citation: The exact Book, Chapter, and Verse of the real Bible verse you chose (e.g., 'Psalm 144:1'). ALWAYS INCLUDE THIS REFERENCE.
- scripture: The exact, real verbatim text.
- twist: The satirical, dominionist 'Exegesis'...`;
```
Because of this strict instruction, the AI outputs a genuinely jarring structural corruption. It forces the output to be a dark mirror of the original text.
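Because the prompt pins down the exact JSON fields, the handler can validate the model's reply before returning it to the client. A defensive sketch with a hypothetical helper name; the field names come straight from the prompt:

```typescript
// Hypothetical guard for the model's JSON payload; field names follow the prompt.
interface OrdnancePrayer {
  invocation: string; // the militarized rewrite
  citation: string;   // e.g. "Psalm 144:1"
  scripture: string;  // the verbatim source verse
  twist: string;      // the satirical exegesis
}

function parsePrayer(raw: string): OrdnancePrayer {
  const data = JSON.parse(raw);
  for (const field of ["invocation", "citation", "scripture", "twist"]) {
    if (typeof data[field] !== "string" || data[field].length === 0) {
      throw new Error(`Model payload missing field: ${field}`);
    }
  }
  return data as OrdnancePrayer;
}
```

Rejecting malformed payloads server-side means the themed UI never has to render a half-empty transmission.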
Security & Serverless Architecture
Because we are calling the Gemini API, we cannot expose the GEMINI_API_KEY to the client.
We abstracted the application logic entirely behind Vercel Serverless Functions. The React frontend simply posts the user's tactical selections (Branch, Role, Weapon) to our /api/generate route.
```typescript
// /api/generate.ts
import type { VercelRequest, VercelResponse } from "@vercel/node";

export default async function handler(req: VercelRequest, res: VercelResponse) {
  // 1. Verify method
  if (req.method !== "POST") return res.status(405).end();
  // 2. Enforce rate limits (Upstash Redis)
  // 3. Ping Gemini with the strict prompt
  // 4. Return the secure payload; the key never leaves the server
}
```
This abstraction ensures that API keys remain completely secret in the Vercel production environment.
Gamified Rate Limiting with Upstash Redis
AI isn't free, so we needed rate limiting. But instead of throwing a generic "429 Too Many Requests" error, we tied the architecture directly into the theme using Upstash Serverless Redis.
We map the rate limit to an "Ordnance Quota." Users only get 5 "prayers" per 24 hours per IP address.
```typescript
import { Redis } from "@upstash/redis";

const redis = Redis.fromEnv(); // reads UPSTASH_REDIS_REST_URL / _TOKEN
const MAX_REQUESTS = 5;

// Inside the handler: one counter per caller IP
const userIp = (req.headers["x-forwarded-for"] as string) ?? "unknown";
const rateLimitKey = `rate_limit:${userIp}`;
const requests = await redis.incr(rateLimitKey);
if (requests === 1) {
  await redis.expire(rateLimitKey, 86400); // start the 24-hour window on first use
}
if (requests > MAX_REQUESTS) {
  return res.status(429).json({
    error: "RATE LIMIT EXCEEDED. ORDNANCE DEPLETED. AWAIT RESUPPLY CYCLE.",
    remaining: 0,
  });
}
```
The frontend checks this limit via a separate /api/ammo endpoint on initialization. If the user runs out of API calls, the UI flashes red, disables the firing mechanism, and reads: ORDNANCE DEPLETED. We turned a backend constraint into a frontend feature!
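The same fixed-window logic can be expressed as a small in-memory class, which is handy for exercising the quota behavior locally without a Redis instance. This is a sketch with hypothetical names; production uses Upstash `INCR`/`EXPIRE` as shown above:

```typescript
// In-memory sketch of the fixed-window quota (hypothetical names).
class FixedWindowLimiter {
  private windows = new Map<string, { count: number; resetAt: number }>();
  private max: number;
  private windowMs: number;

  constructor(max: number, windowMs: number) {
    this.max = max;
    this.windowMs = windowMs;
  }

  hit(key: string, now = Date.now()): { allowed: boolean; remaining: number } {
    const w = this.windows.get(key);
    if (!w || now >= w.resetAt) {
      // First request (or expired window): open a fresh window
      this.windows.set(key, { count: 1, resetAt: now + this.windowMs });
      return { allowed: true, remaining: this.max - 1 };
    }
    w.count += 1;
    return { allowed: w.count <= this.max, remaining: Math.max(0, this.max - w.count) };
  }
}
```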
The "Tactical Net" Sharing Feature
Finally, we wanted users to be able to share their corrupted scriptures. We built a custom ShareModal that mimics a secure uplink transmission. Clicking "BROADCAST TO TACTICAL NET" executes a setTimeout progression, flashing "UPLINKING... Routing through MILSTAR..." before marking the transmission as sent.
It's completely fake and non-functional on the backend, but it's a massive UX win that fits perfectly with the aesthetic.
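The staged progression boils down to a timed loop over status strings. A minimal sketch with hypothetical names; the first two strings come from the UI copy above, while the final "TRANSMISSION SENT" label is illustrative:

```typescript
// Hypothetical helper behind the ShareModal's fake uplink sequence.
const UPLINK_STAGES = [
  "UPLINKING...",
  "Routing through MILSTAR...",
  "TRANSMISSION SENT", // illustrative final label
] as const;

const wait = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function fakeUplink(
  onStatus: (stage: string) => void,
  delayMs = 600
): Promise<void> {
  for (const stage of UPLINK_STAGES) {
    await wait(delayMs); // the pacing is what sells the illusion of a real uplink
    onStatus(stage);
  }
}
```

A React component can feed `onStatus` into a state setter and disable the button until the promise resolves.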
Conclusion
Our Lord of Lethality v1 is a prime example of how taking a very strict, highly opinionated approach to both UX and Prompt Engineering can transform a generic "wrapper" into a compelling, immersive piece of interactive satire. By leaning into Serverless architecture and Redis, it handles API rate limiting and security cleanly, while offering a deeply thematic experience to the end user.
Stay frosty, and praise the payload.
