You take the Blue Pill, the story ends. You go back to writing
400‑line if/else blocks to handle every single JSON schema from
Stripe, Shopify, and GitHub. You keep paying OpenAI $0.50 every
time a user wants to see a bar chart of their own data. You stay in
the "Context Window" prison.
You take the Red Pill, you stay in Wonderland, and I show you how
deep the data profiling goes.
The Problem: The "Generic Function" Trap
Every service, whether Stripe, GitHub, Jira, or your own internal
database, exposes data differently. As a developer, you usually face
two painful choices:
- Choice A: Write a custom parser for every single API provider. (Goodbye, weekend.)
- Choice B: Dump a 5MB JSON payload into an LLM and watch your API bill explode while the model hallucinates math that doesn't exist.
What I wanted was Choice C:
A plug‑and‑play SDK where the LLM acts as the Architect, but a
high‑performance engine (like Polars) acts as the Contractor.
The Redpill Philosophy: Profile, Don't Dump
Most AI tools fail because they try to be too smart. They scan
everything, which is slow and expensive.
Redpillx takes a different approach.
A local Data Profiler inspects a small sample (default: 100
rows) of your data first.
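The idea can be sketched in a few lines of TypeScript. Note that `profileSample` and the `FieldProfile` shape below are my own illustration of the approach, not the SDK's actual internals:

```typescript
// Hypothetical sketch: infer a compact schema from a small sample,
// so only the shape of the data (not the data itself) is sent to the LLM.
type FieldProfile = { name: string; type: string; examples: unknown[] };

function profileSample(
  rows: Record<string, unknown>[],
  sampleSize = 100
): FieldProfile[] {
  const sample = rows.slice(0, sampleSize);
  const fields = new Map<string, FieldProfile>();
  for (const row of sample) {
    for (const [name, value] of Object.entries(row)) {
      const type =
        value === null ? "null" : Array.isArray(value) ? "array" : typeof value;
      const existing = fields.get(name) ?? { name, type, examples: [] };
      // Keep a handful of distinct example values to ground the LLM.
      if (existing.examples.length < 3 && !existing.examples.includes(value)) {
        existing.examples.push(value);
      }
      fields.set(name, existing);
    }
  }
  return [...fields.values()];
}

const profile = profileSample([
  { status: "open", priority: "high" },
  { status: "closed", priority: "low" },
]);
console.log(JSON.stringify(profile));
// A short JSON description like this, not the full payload, is what reaches the LLM.
```

Because the profile is tiny and bounded by the sample size, the prompt cost stays flat no matter how large the underlying dataset grows.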
Benefits
Context Window Freedom
We only send the shape (schema) of your data to the LLM. Whether you
have 10 rows or 1 million rows, the token cost stays the same.
Deterministic Math
The LLM generates a ChartSpec (the instructions). The actual
calculation happens locally using Polars (Python) or our optimized
JavaScript execution engine.
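To make the spec-plus-executor split concrete, here is a minimal sketch. The `ChartSpec` fields and the `executeSpec` helper are illustrative stand-ins, not the SDK's exact schema:

```typescript
// Illustrative ChartSpec: the LLM emits instructions, never numbers.
type ChartSpec = { groupBy: string; aggregate: "count" };

// Deterministic local executor: all math happens here, so results
// cannot be hallucinated by the model.
function executeSpec(spec: ChartSpec, rows: Record<string, unknown>[]) {
  const counts = new Map<string, number>();
  for (const row of rows) {
    const key = String(row[spec.groupBy]);
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return [...counts.entries()].map(([x, y]) => ({ x, y }));
}

const tickets = [{ status: "open" }, { status: "open" }, { status: "closed" }];
console.log(executeSpec({ groupBy: "status", aggregate: "count" }, tickets));
// → [ { x: "open", y: 2 }, { x: "closed", y: 1 } ]
```

The same pattern extends to sums, averages, and other aggregations: the LLM only picks the column and the operation, and the engine does the arithmetic.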
🛠️ Tutorial: Building Your First Dynamic Chart
Let's see how easy it is to exit the simulation.
1. Installation
# For the JS/TS fans
npm install redpillx
# For the Python / Data Science crew
pip install redpillx
2. The "Bring Your Own LLM" Setup
Redpillx doesn't lock you into any specific provider.
You bring the brain (LLM).
Redpillx provides the muscle (execution engine).
import { Redpill } from "redpillx";
import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

const rp = new Redpill()
  .setLlm(async (messages) => {
    const res = await openai.chat.completions.create({
      model: "gpt-4o-mini",
      messages,
    });
    return {
      content: res.choices[0].message.content,
    };
  })
  .build();
3. Generate & Execute
Point Redpillx at any JSON data and ask a question in plain
English.
const myData = {
  tickets: [
    { status: "open", priority: "high" },
    // ...
  ],
};

// 1. The Architect creates the plan (Chart Spec)
const { spec } = await rp.generateSpec(
  myData,
  "Show me ticket count by status"
);

// 2. The Executor performs the calculation locally
const result = rp.execute(spec, myData);
console.log(result.data);
/*
Output:
[
  { x: "open", y: 42, labelX: "Status", labelY: "Count" }
]
*/
🧬 Why This Works
Because the Spec is separate from the Data, it becomes reusable.
If your data updates every minute, you don't need to call the LLM
again.
Just run:
rp.execute(spec, newData)
And you're done.
- ⚡ Sub‑millisecond execution
- 🧠 LLM used only once
- 💰 Zero extra tokens
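Since the spec is plain JSON, it can also be cached per question and reused across every data refresh. A minimal sketch of that pattern (the `getCachedSpec` helper is my own, not part of the SDK):

```typescript
// Hypothetical spec cache: call the LLM once per question, then reuse
// the serialized spec for every subsequent data refresh.
const specCache = new Map<string, string>();

function getCachedSpec(question: string, generate: () => object): object {
  const cached = specCache.get(question);
  if (cached) return JSON.parse(cached); // cache hit: zero LLM tokens
  const spec = generate(); // cache miss: one LLM call
  specCache.set(question, JSON.stringify(spec)); // specs are plain JSON
  return spec;
}
```

With this in place, a dashboard polling fresh data every minute pays the LLM cost exactly once per chart, no matter how long it runs.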
🤝 Credits & Inspiration
This project stands on the shoulders of giants:
- PyGWalker: a massive inspiration for how interactive data exploration should feel.
- OpenCode: for supporting the spirit of open-source collaboration.
- OpenRouter: for making it easy to test the SDK against dozens of models (Llama, Claude, GPT) with a single API.
📦 Get Started
The project is fully open‑source and ready for contributions.
JavaScript SDK: GitHub -- red-pill-js | NPM
Python SDK: GitHub -- red-pill-py | PyPI
Final Question
Which pill will you take?
Let me know in the comments how you're dealing with the "Dashboard
Tax" in your apps 👇