This is a submission for the OpenClaw Challenge.
## What I Built
It's a Tuesday morning in Abeokuta, Nigeria. Chief Bamidele Adeyemi is 62, retired, hypertensive, and type-2 diabetic. He's on amlodipine, metformin, and empagliflozin. He sees Dr. Chuks every three months for about four minutes. For the other three months minus four minutes, he is on his own.
By 3pm he has had nothing but tea and a biscuit since breakfast. He WhatsApps his AI: "I'm feeling small dizzy and shaky." Twenty seconds later, four things happen at once:
- His chat replies with the mild-hypoglycemia protocol: "3 to 4 glucose tablets, half a glass of orange juice, or 3 biscuits, now. I'll wait. We'll re-test in 15."
- The bedside Arduino on his nightstand goes solid red. The buzzer plays the alert pattern. The LCD reads `LOW SUGAR / TAKE SUGAR NOW`.
- His daughter Funmi, a nurse in Manchester, gets a calm Telegram message: "FYI, your dad had a mild hypo (glucose 68, recovered). Second this month. I've drafted a note for Dr. Chuks. No action needed from you tonight."
- Dr. Chuks's clinician dashboard gains a same-day note draft, ready to read at his next break.
That is Olumide.
Olumide is a self-hosted, multi-agent chronic-disease companion that lives in WhatsApp. It is built around real African use cases:
Use case 1 — Chief Bamidele in Abeokuta, southwest Nigeria. Hypertension affects ~46% of African adults, and fewer than 7% are controlled. Most of the failure is not clinical; it is the silence between visits. Olumide fills that silence. It runs morning and evening check-ins, logs every BP reading and dose, recognises hypoglycemia in conversation, drives a bedside Arduino for adherence, escalates to family on the right channel with the right tone, and quietly builds the doctor's pre-visit pack so the next four minutes actually move things forward.
Use case 2 — Mama Aisha in Kano, northern Nigeria, and her daughter Halima in Toronto. Mama Aisha is 68, diabetic, and has lived alone since her husband passed. Halima sends money home and worries every day. With Olumide, Halima sets up the gateway on a small home server in the family's Kano house. Mama Aisha keeps using WhatsApp like she always has: voice notes in Hausa, photos of her glucometer. When her sugar spikes after a wedding meal, Olumide handles the coaching in Hausa. Halima gets a weekly summary on Sunday morning, in English, and a calm WhatsApp ping if something genuinely needs her. The diaspora has been sending money home for decades; Olumide is the first time they have been able to send actual care.
Both use cases run on the same code, the same gateway, the same Arduino. The only difference is the patient profile YAML.
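To make that concrete, a patient profile might look something like this. This is an illustrative sketch only: the field names are my assumptions, not the repo's actual schema, and the values come from the story above.

```yaml
# Illustrative sketch only: field names are assumptions, not the real schema.
patient:
  name: "Chief Bamidele Adeyemi"
  age: 62
  conditions: [hypertension, type-2-diabetes]
  medications:
    - {name: amlodipine, dose: "5mg"}
    - {name: metformin}
    - {name: empagliflozin}
doctor:
  name: "Dr. Chuks"
circle:
  - {name: Funmi, relation: daughter, channel: telegram, scope: summaries_and_alerts}
```

Swapping this file for a Mama Aisha equivalent (Hausa, a Toronto daughter, a weekly summary) is the only change between the two use cases.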
Olumide is an open-source personal-health tool, not a medical device. It does not diagnose, does not prescribe, does not titrate. It tracks, reminds, surfaces patterns, and routes the patient back to their real doctor with three months of ground truth in their hand. The scope is in the system prompt, the refusal rules are in the skills, and the capability to "change a dose" simply does not exist in the toolset.
## How I Used OpenClaw
OpenClaw is the entire runtime. I did not build a parallel platform; I composed Olumide out of OpenClaw primitives.
Code:

- Software: github.com/salimcodes/olumide
- Hardware: github.com/salimcodes/olumide-hardware
### The multi-agent architecture
The core of the system is `olumide/orchestrator/openclaw.py`, which classifies the severity of every incoming signal with an LLM call, picks the right agent, and dispatches its JSON-emitted actions in parallel via `asyncio.gather`. Six agents are registered at startup:
| Agent | Role |
|---|---|
| `PrimaryCareAgent` | Warm, patient-facing orchestrator. Greets, checks in, hands off. |
| `ClinicalReasoningAgent` | Strict-JSON tier classifier (1–4) that emits `actions[]` for parallel dispatch. Triages symptoms, identifies red flags, calls protocols. |
| `MedicationSafetyAgent` | Drug-interaction checks, refill timing, NAFDAC authenticity verification. |
| `FamilyCircleAgent` | Composes consent-scoped updates per circle member and channel. |
| `ClinicLiaisonAgent` | Builds pre-visit packs and drafts clinician notes for Dr. Chuks. |
| `CrisisResponseAgent` | Tier-4 escalation: alerts, transport, family, on-call. |
Each agent extends a thin OpenClaw base wrapper (`olumide/agents/base.py`) that handles the LLM call. The patient profile (`config/patient_bamidele.yaml`) is injected into every system prompt as ground truth (meds, conditions, doctor, circle, language, fasting status), so no agent ever has to "remember" who Bamidele is.
### Tools the agents call
When the Clinical Reasoning Agent emits an action like `{"type": "device_alert", "reason": "HYPO"}`, the orchestrator dispatches it to the right tool:
- `olumide/tools/device.py` → `reminder`/`escalate`/`lcd`/`log_dose`: talks to the Arduino bridge.
- `olumide/tools/clinician.py` → `draft_clinician_note`: writes to `dashboard/clinician_notes.json`, which Dr. Chuks's dashboard polls.
- `olumide/tools/circle.py` → `notify_circle_member`: formats the message per the recipient's consent scope and channel.
- `olumide/tools/communication.py` → `send_whatsapp`: the live Meta Cloud API path.
All four fire in parallel. By the time Bamidele has finished reading his reply, the LED is already red, Funmi already has her message, and the dashboard already has the note.
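That fan-out can be sketched in a few lines. This is a minimal illustration of dispatching an agent's `actions[]` list with `asyncio.gather`; the `TOOL_HANDLERS` registry and the handler behaviour here are my own simplifications, not the repo's actual API.

```python
import asyncio

# Hypothetical tool handlers keyed by action type (illustrative names only).
TOOL_HANDLERS = {
    "device_alert": lambda a: f"device:{a['reason']}",
    "notify_circle": lambda a: f"circle:{a['member']}",
    "draft_note": lambda a: f"note:{a['summary']}",
}

async def dispatch_action(action: dict) -> str:
    # Each action is independent; a real handler would await serial/HTTP I/O here.
    handler = TOOL_HANDLERS[action["type"]]
    return handler(action)

async def dispatch_all(actions: list[dict]) -> list[str]:
    # Fire every action the agent emitted at the same time, not one by one.
    return await asyncio.gather(*(dispatch_action(a) for a in actions))

actions = [
    {"type": "device_alert", "reason": "HYPO"},
    {"type": "notify_circle", "member": "funmi"},
    {"type": "draft_note", "summary": "mild hypo, recovered"},
]
results = asyncio.run(dispatch_all(actions))
print(results)  # ['device:HYPO', 'circle:funmi', 'note:mild hypo, recovered']
```

The switch from sequential `await`s to a single `gather` is what makes the LED, the family ping, and the clinician note land together.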
### Severity classification + override
The orchestrator's severity classifier is an LLM call, but I learned not to trust it for clinical numbers. So `app.py` has a small `_vision_severity_override()` that bypasses the LLM for unambiguous cases: glucose <70 or >250 forces URGENT, BP ≥180/120 forces CRISIS, and so on. This was one of the biggest reliability wins in the build: keep the LLM for nuance, and use deterministic rules for the things you cannot afford to be creative about.
### The hardware as a tool node
The Arduino is a peripheral nerve, not the brain. The firmware (`olumide-hardware/firmware/olumide_firmware.ino`) speaks a small line-delimited JSON protocol over USB serial at 115200 baud:
```
// device → host
{"t":"EVT","e":"BTN_SHORT","ts":1745000000}
{"t":"EVT","e":"RFID","tag":"AMLO_5MG","ts":1745000100}
{"t":"TELE","temp":28.4,"hum":62,"lux":412,"ts":1745000060}

// host → device
{"t":"CMD","id":"c1","c":"REMIND","label":"Amlodipine 5mg","color":"GREEN","beep":"SHORT"}
{"t":"CMD","id":"c2","c":"ALERT","reason":"HYPO"}
{"t":"CMD","id":"c3","c":"AWAIT_ACK","timeout":60}
```
The kit I had (Uno, 1602 LCD over I2C, RC522 RFID reader, RGB LED, active and passive buzzers, DHT11, DS1302 RTC, photoresistor, and a button) composes into a bedside device that:
- Logs medication doses when an RFID-tagged pill bottle is tapped.
- Surfaces reminders on the LCD with ambient colour-coded status on the RGB LED (green = on track, yellow = something due, red = attention).
- Triggers a check-in (button short-press) or a panic event (long-press) that wakes the agent through `POST /webhook/device`.
- Dims itself at night via the photoresistor: a small touch, but a big quality-of-life difference for an elder.
- Holds 24 hours of pre-loaded reminders in local cache so brief gateway/USB outages don't break adherence.
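On the host side, the framing is simple enough to sketch. The real bridge reads these lines from USB serial at 115200 baud (e.g. via a serial library); this illustration just parses a captured buffer, and the helper name is mine, not the repo's.

```python
import json


def parse_device_lines(raw: str) -> list[dict]:
    """Parse line-delimited JSON frames as emitted by the firmware.

    Illustrative helper: in the real bridge, lines arrive one at a
    time over the serial port rather than as a single buffer.
    """
    frames = []
    for line in raw.splitlines():
        line = line.strip()
        if not line:
            continue  # tolerate blank lines between frames
        frames.append(json.loads(line))
    return frames


buffer = (
    '{"t":"EVT","e":"BTN_SHORT","ts":1745000000}\n'
    '{"t":"TELE","temp":28.4,"hum":62,"lux":412,"ts":1745000060}\n'
)
frames = parse_device_lines(buffer)
print([f["t"] for f in frames])  # ['EVT', 'TELE']
```

One JSON object per line keeps the firmware parser trivial and makes the protocol easy to replay from a log file.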
The host-side bridge for development is `tests/fake_bridge.py` — a Python simulator that renders the same JSON commands in colour in a terminal, so the demo can run with or without the physical board.
The wiring diagram, full pin map, protocol spec, and troubleshooting guide are all in the hardware repo.
### The endpoints the demo touches

| Endpoint | Purpose |
|---|---|
| `GET /sim` | WhatsApp-style 3-panel demo console |
| `POST /sim/message` | Send a message as Bamidele, get reply + agent trace |
| `POST /sim/upload` | Send a photo (BP monitor, glucometer) for vision analysis |
| `POST /sim/reset` | Wipe clinician notes + circle log between takes |
| `GET /dashboard/` | Dr. Chuks's live clinician dashboard |
| `GET /webhook` | WhatsApp Cloud API verification |
| `POST /webhook` | WhatsApp Cloud API messages (production path) |
| `POST /webhook/device` | Arduino RFID taps and button presses |
The `/sim` console is the heart of the demo. It shows three panels: Bamidele's chat (left), Funmi's family alerts (right), and the multi-agent log (bottom). Type a message, or use a quick-action button, and watch agent calls, device commands, family notifications, and clinician notes fan out in real time.
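Driving the console from a script is one request. This sketch only builds the request object so it can be inspected; the host/port and the `{"text": ...}` payload shape are guesses for illustration, not the documented API.

```python
import json
import urllib.request

# Hypothetical local gateway address; the endpoint path matches the table above.
req = urllib.request.Request(
    "http://localhost:8000/sim/message",
    data=json.dumps({"text": "I'm feeling small dizzy and shaky"}).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would actually send it; here we just inspect it.
print(req.get_method(), req.full_url)  # POST http://localhost:8000/sim/message
```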
### Image understanding for vitals
`olumide/ingestion/vision.py` analyses photos the patient sends (a BP monitor readout, a glucometer screen) and synthesises a structured patient message that the agents process as if Bamidele had typed the values. So when he photographs his Omron after a morning measurement, the agent sees: "My BP just measured 138/86 with pulse 72." The `_vision_to_message()` function in `app.py` deliberately appends "This looks abnormal sir, what should I do?" when the values cross clinical thresholds, so the red-flag keyword matchers in the clinical agent fire reliably.
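The synthesis step might look like this. The function name echoes the one in the post, but the field names and the specific thresholds (140/90 for BP, 70–180 for glucose) are my assumptions for illustration.

```python
def vision_to_message(reading: dict) -> str:
    """Turn structured vision output into a first-person patient message.

    Illustrative sketch: field names and thresholds are assumptions.
    """
    if reading["kind"] == "bp":
        msg = (
            f"My BP just measured {reading['systolic']}/{reading['diastolic']} "
            f"with pulse {reading['pulse']}."
        )
        abnormal = reading["systolic"] >= 140 or reading["diastolic"] >= 90
    else:  # glucometer reading
        msg = f"My glucose just measured {reading['value']}."
        abnormal = reading["value"] < 70 or reading["value"] > 180
    if abnormal:
        # The appended question makes the clinical agent's red-flag
        # keyword matchers fire reliably.
        msg += " This looks abnormal sir, what should I do?"
    return msg


print(vision_to_message({"kind": "bp", "systolic": 138, "diastolic": 86, "pulse": 72}))
# My BP just measured 138/86 with pulse 72.
```

Phrasing the values as a patient message means the downstream agents need no special "image" code path at all.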
## Demo
The 4-minute flow:
- Morning routine. "Good morning Olumide" → warm greeting addressing him as "sir," anchored in his profile and yesterday's logbook entry.
- Vitals report. "BP 138/86, fasting glucose 122" → interpretation against his 14-day trend, with the small drift flagged but not alarmed.
- Symptom triage (the money shot). "I'm feeling small dizzy and shaky" → triage interview → "glucose 68" → Tier 3 hypoglycemia.
Within seconds, in parallel:
- The chat replies with the `mild_hypo_protocol`.
- The Arduino (or `fake_bridge.py`) prints `CMD: ESCALATE`, with red LED + alert buzzer + LCD lines `LOW SUGAR` / `TAKE SUGAR NOW`.
- Funmi's panel gets a Tier 3 family alert.
- The dashboard gains a same-day clinician note draft for Dr. Chuks.
- The reveal. Switch to the multi-agent log at the bottom of `/sim`. Show severity classification → agent selected → tier → actions dispatched → reasoning trace. Six agents, twelve tool calls, twenty seconds, all OpenClaw primitives, no custom orchestration runtime.
- The Mama Aisha use case. Swap `config/patient_bamidele.yaml` for a Mama Aisha profile. Same code, Hausa voice notes, daughter on Telegram instead of WhatsApp, weekly summary in English. Demonstrates the framework, not the persona.
Repos:

- `github.com/salimcodes/olumide`: gateway, agents, tools, demo console
- `github.com/salimcodes/olumide-hardware`: Arduino firmware, wiring, protocol
## What I Learned
Right-sizing is harder than scaling up. The first version of this idea was a continental, HMO-funded chronic-disease platform. Compressing it down to one patient on one laptop with one Arduino on the bedside table was uncomfortable, and it turned out to be the version that actually demonstrates the OpenClaw thesis. The grand version is still possible later. The small one had to come first.
The skill prompts are where most of the engineering lives. I went in expecting to write clever code. I ended up writing careful prose. Getting `clinical.md` to gather the right symptom information, recognise the red flags, refuse to titrate doses, and emit clean JSON that the orchestrator can dispatch is a different muscle than coding, and it's the layer that determines whether the system is safe or dangerous.
Don't trust the LLM with numbers it can't argue with. The severity classifier is an LLM call, and it's right most of the time. But "most of the time" is not good enough when glucose 54 needs to be CRISIS and BP 178/118 needs to be URGENT every single time. `_vision_severity_override()` in `app.py` is twenty lines of if/else that prevents the most dangerous failure mode in the system. Some things should be deterministic.
Parallel action dispatch is the OpenClaw thesis in one mechanism. The moment I switched from sequential `await` calls to `asyncio.gather()` for the action list, the demo went from "feels like an app" to "feels alive." The buzzer beeps while the daughter's phone is buzzing while the dashboard is updating while the chat is replying. That simultaneity is what people remember.
Self-hosting is a feature, not a constraint. When the patient's data lives on their own laptop in plain files they can read with cat, an entire class of trust questions evaporates. The architecture itself becomes the privacy story. You can hand someone a tool that touches their health data because there's nowhere else for the data to go.
Hardware is a peripheral nerve, not the product. The Arduino sees opaque RFID UIDs and reports button presses. It doesn't know that `04A1B2C3D4` means "amlodipine 5mg"; the mapping lives in the gateway profile. That separation is what lets the firmware be MIT-licensed and forkable without inheriting any clinical-scope concerns.
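That separation amounts to a lookup on the gateway side. A minimal sketch, using the UID from the paragraph above; the table structure and helper name are hypothetical, standing in for whatever the gateway profile actually stores.

```python
# Hypothetical gateway-side mapping from opaque RFID UIDs to medications.
# The device never sees this table; it only reports the raw UID.
RFID_MEDS = {
    "04A1B2C3D4": {"name": "amlodipine", "dose_mg": 5},
}


def resolve_tag(uid: str) -> str:
    """Resolve a raw RFID UID to a human-readable medication label."""
    med = RFID_MEDS.get(uid)
    if med is None:
        return "unknown tag"  # unmapped bottle: log it, don't guess
    return f"{med['name']} {med['dose_mg']}mg"


print(resolve_tag("04A1B2C3D4"))  # amlodipine 5mg
```

Because the firmware only ever ships UIDs, forking the hardware repo never drags clinical semantics along with it.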
Wire RC522 to 3.3V or it dies. Twice.
## ClawCon Michigan
No.