A practical story about the missing protocol between AI and physical devices — and the open-source project building it.
The scenario that started everything
Imagine your grandmother lives with your family. One afternoon, no one is home. She falls in the bedroom and can't get up.
Your home has a smart lock, a camera with AI fall detection, a connected phone hub, and an alarm system. All of them are online. All of them are certified Matter devices. And none of them can coordinate a response — because no standard exists that lets an AI say "ensure her safety" and have every device figure out its role automatically.
The camera can detect the fall. But it can't tell the lock to open. The lock can open, but it doesn't know there's an emergency. The phone hub can call 911, but it doesn't know when to. Each device speaks its own command language, and the AI speaks in goals.
That gap — between AI intent and physical action — is what DoSync is built to close.
The core problem with existing protocols
Matter, Zigbee, and Z-Wave are excellent at what they do: letting apps control devices. They define command clusters like OnOff, LevelControl, and Lock that work across manufacturers. That's genuinely valuable.
But they were designed for human-initiated control via apps. When an AI enters the picture, these protocols become a bottleneck:
# What existing protocols understand
lock.unlock()
light.set_brightness(100)
thermostat.set_temperature(21)
# What an AI actually thinks in
"ensure the safety of the person who fell"
"prepare the home for bedtime"
"nobody is home — save energy"
Someone has to translate. Today, that translation is custom code written per-device, per-platform, per-scenario. It doesn't scale. It breaks when you add a new device. And it completely fails in emergencies where milliseconds matter.
Introducing DoSync
DoSync is an open communication protocol (Apache 2.0) that lets AI systems interact with physical home devices using semantic intent — expressing what they want to achieve, not how to achieve it.
The key insight: instead of the AI issuing commands, devices declare their capabilities, and a semantic resolver matches intent to capability automatically.
// A device announces what it can do (Capability Manifest)
{
"device_id": "lock-frontdoor-01",
"tags": ["door-lock", "entrance", "emergency"],
"actuators": [
{ "id": "unlock", "type": "unlock" },
{ "id": "lock", "type": "lock" }
],
"emergency_capable": true
}
// The AI sends an intent — not a command
{
"intent": "ensure_safety",
"urgency": "emergency",
"context": {
"trigger": "fall_detected",
"location": "bedroom",
"emergency_number": "911"
}
}
// DoSync resolves this automatically to:
// → lock-frontdoor-01: unlock (emergency_capable = true)
// → phone-hub-01: call 911
// → alarm-01: activate emergency pattern
// → family phones: push notification
// All in parallel. No hardcoded rules.
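The resolution step sketched above can be expressed in a few lines. This is an illustrative sketch, not the actual DoSync resolver: the `Device` class, the `INTENT_ACTIONS` table, and the sample registry below are assumptions made for the example.

```python
# Hypothetical sketch of intent -> capability matching; names are assumptions,
# not the DoSync API.
from dataclasses import dataclass

@dataclass
class Device:
    device_id: str
    tags: list
    actuators: list          # e.g. [{"id": "unlock", "type": "unlock"}]
    emergency_capable: bool = False

# Assumed mapping from intents to the actuator types that can serve them.
INTENT_ACTIONS = {
    "ensure_safety": ["unlock", "call", "alarm", "notify"],
}

def resolve(intent: str, urgency: str, registry: list) -> list:
    """Return (device_id, actuator_type) pairs that can serve the intent."""
    wanted = set(INTENT_ACTIONS.get(intent, []))
    plan = []
    for dev in registry:
        if urgency == "emergency" and not dev.emergency_capable:
            continue  # emergencies only recruit emergency-capable devices
        for act in dev.actuators:
            if act["type"] in wanted:
                plan.append((dev.device_id, act["type"]))
    return plan

registry = [
    Device("lock-frontdoor-01", ["door-lock"], [{"id": "unlock", "type": "unlock"}], True),
    Device("phone-hub-01", ["phone"], [{"id": "call", "type": "call"}], True),
    Device("light-hall-01", ["light"], [{"id": "dim", "type": "set_brightness"}]),
]
print(resolve("ensure_safety", "emergency", registry))
# → [('lock-frontdoor-01', 'unlock'), ('phone-hub-01', 'call')]
```

Note how the light drops out of the plan: it declares no emergency-capable flag, so an emergency intent never recruits it, while new emergency-capable devices join the plan automatically with no hardcoded rules.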
The 5-layer architecture
DoSync is organized in 5 layers, similar to OSI but designed for AI-to-device communication:
Layer 5 — Intent: AI expresses goals in natural language or JSON
Layer 4 — Semantic: intent → device action mapping via capability matching
Layer 3 — Registry: devices self-declare capabilities on network join
Layer 2 — Secure channel: mTLS, local PKI, zero-trust — no internet required
Layer 1 — Transport: WiFi · BLE · Zigbee · Z-Wave · Thread · Ethernet
The transport-agnostic HAL (Hardware Abstraction Layer) means the same protocol runs over any physical medium. A manufacturer implements DoSync once and supports any AI system.
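To make the HAL idea concrete, here is a minimal sketch of what a transport adapter interface could look like. The names (`TransportAdapter`, `send`, `receive`, `LoopbackAdapter`) are assumptions for illustration, not the DoSync API.

```python
# Illustrative transport adapter interface; one subclass per physical medium
# (WiFi, BLE, Zigbee, ...) while the layers above stay unchanged.
from abc import ABC, abstractmethod

class TransportAdapter(ABC):
    @abstractmethod
    def send(self, device_id: str, payload: bytes) -> None: ...

    @abstractmethod
    def receive(self):
        """Return the next (device_id, payload) frame."""

class LoopbackAdapter(TransportAdapter):
    """In-memory adapter, useful for tests: echoes frames back in order."""
    def __init__(self):
        self._queue = []

    def send(self, device_id, payload):
        self._queue.append((device_id, payload))

    def receive(self):
        return self._queue.pop(0)

hub = LoopbackAdapter()
hub.send("lock-frontdoor-01", b'{"intent": "ensure_safety"}')
print(hub.receive())
# → ('lock-frontdoor-01', b'{"intent": "ensure_safety"}')
```

The design pay-off is that the semantic layers never see bytes-on-a-wire details: swapping BLE for Thread means swapping one adapter class.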
What's built today
The protocol is not theoretical. There's a working reference implementation:
git clone https://github.com/giulianireg-spec/dosync-protocol
cd dosync-protocol
pip install fastapi uvicorn
# Start the hub
PYTHONPATH=. uvicorn server:app --host 0.0.0.0 --port 47200 --reload
# Run the full demo — 7 scenarios
PYTHONPATH=. python3 examples/demo_full.py
# Run the certification suite
python3 certify.py --host localhost --port 47200 --tier emergency
The demo output looks like this:
══════════════════════════════════════════════════════════════
Scenario 1 — Fall detected · Emergency
══════════════════════════════════════════════════════════════
▶ Camera emits event: fall_detected [EMERGENCY]
▶ AI resolves intent: ensure_safety [EMERGENCY]
✓ [lock-frontdoor-01] unlock → {"status": "unlocked", "duration_seconds": 300}
✓ [alarm-main-01] alarm → {"status": "activated", "pattern": "emergency"}
✓ [phone-family-01] call → {"status": "calling", "number": "911"}
✓ SUCCESS — 7 parallel actions in < 100ms
And the certification CLI:
── Tier EMERGENCY — Override and audit log ─────────────
✓ Emergency intent executes without confirmation
✓ Emergency-capable devices participate in response
✓ Audit log exists and has entries
✓ SHA-256 chain integrity verified
✓ Emergency event recorded in audit log
── Result ──────────────────────────────────────────────
Tests passed: 16 · Tests failed: 0
✓ CERTIFIED — DoSync EMERGENCY
Fingerprint: fb083f5740978b9a38c448bdc4fe3090…
The 7 working scenarios
The demo covers situations that no existing protocol handles automatically:
| Scenario | Trigger | DoSync response |
|---|---|---|
| Fall detected | Camera AI | 911 called · Door unlocked · Alarm · Family notified |
| Smoke / CO | Detector over threshold | 3 phases: Alert → Evacuate → Unlock for responders |
| Fridge failure | Compressor off, temp rising | Family SMS before food spoils |
| Nobody home | Phone WiFi + power meter | Lights off · Thermostat eco · Alarm armed |
| Laundry done | Appliance cycle complete | Remind family before clothes wrinkle |
| Good morning | First motion of the day | Blinds 80% · Coffee on · Weather shown |
| Bedtime | Scheduler at 21:30 | Lights dim to 10% · Blinds close |
The smoke scenario is worth highlighting because it introduces phased execution — unlike the other scenarios, which run fully in parallel, a fire evacuation needs a specific order:
PhasedActionPlan(phases=[
Phase("ALERT — immediate notification", delay_after_ms=2000, actions=[
PhaseAction("alarm-main-01", "alarm", {"pattern": "fire"}),
PhaseAction("phone-family-01", "call", {"number": "100"}),
PhaseAction("phone-family-01", "notify", {"urgency": "emergency"}),
]),
Phase("EVACUATION — visual guidance", delay_after_ms=3000, actions=[
PhaseAction("lights-main-01", "set_brightness", {"brightness": 100}),
PhaseAction("lock-backdoor-01", "unlock", {"duration_seconds": 600}),
]),
Phase("ACCESS — entry for responders", actions=[
PhaseAction("lock-frontdoor-01", "unlock", {"duration_seconds": 600}),
PhaseAction("camera-exterior-01", "record", {"reason": "fire_emergency"}),
]),
])
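A minimal sketch of how such a plan could be driven: actions within a phase fire concurrently, and phases run in order with the declared delay between them. The executor below mirrors the field names in the snippet above but is an assumption for illustration, not the DoSync implementation.

```python
# Illustrative phased executor: parallel within a phase, sequential across
# phases. Class and field names follow the PhasedActionPlan snippet above
# but are assumptions.
import asyncio
from dataclasses import dataclass

@dataclass
class PhaseAction:
    device_id: str
    action: str
    params: dict

@dataclass
class Phase:
    name: str
    actions: list
    delay_after_ms: int = 0

log = []

async def dispatch(a: PhaseAction):
    # In a real hub this would call the device over its transport adapter.
    log.append((a.device_id, a.action))

async def run_plan(phases: list):
    for phase in phases:
        # All actions in a phase fire in parallel...
        await asyncio.gather(*(dispatch(a) for a in phase.actions))
        # ...then the plan waits before starting the next phase.
        await asyncio.sleep(phase.delay_after_ms / 1000)

asyncio.run(run_plan([
    Phase("ALERT", [PhaseAction("alarm-main-01", "alarm", {"pattern": "fire"})], 100),
    Phase("ACCESS", [PhaseAction("lock-frontdoor-01", "unlock", {"duration_seconds": 600})]),
]))
print(log)
# → [('alarm-main-01', 'alarm'), ('lock-frontdoor-01', 'unlock')]
```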
Presence inference — not detection
One of the less obvious design decisions: DoSync doesn't treat presence as a binary sensor. It uses an OccupancyEngine that aggregates weighted signals from multiple context providers:
# Phone leaves WiFi network (weight: 0.7)
hub.update_presence(PresenceSignal(
device_id="phone-rodrigo-01",
signal_type=ContextSignalType.PRESENCE,
present=False,
confidence=0.7,
))
# Hub infers: occupied=False | confidence=100% | signals=1
state = hub.get_occupancy()
When a smartwatch is added, its GPS signal (weight 0.9) combines with the phone's WiFi signal for a more robust inference. If the phone's battery dies but the smartwatch is still home, the system correctly infers that someone is there. The AI learns patterns over time — this is the bridge to FamilyOS, the generational AI project that DoSync is designed to power.
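The fusion rule can be sketched as a weighted vote. The function below is illustrative only; the real OccupancyEngine's internals are not shown here, so the 0.5 cutoff and the averaging rule are assumptions.

```python
# Illustrative weighted-vote fusion of presence signals; thresholds and the
# averaging rule are assumptions, not the OccupancyEngine internals.
def infer_occupancy(signals: list) -> tuple:
    """Each signal is (present, confidence). Returns (occupied, confidence)."""
    if not signals:
        return False, 0.0
    total = sum(w for _, w in signals)
    # Share of total confidence voting "present": confident signals count more.
    score = sum(w for present, w in signals if present) / total
    occupied = score >= 0.5
    confidence = score if occupied else 1 - score
    return occupied, round(confidence, 2)

# Phone left WiFi (absent, 0.7) but smartwatch GPS says home (present, 0.9):
print(infer_occupancy([(False, 0.7), (True, 0.9)]))
# → (True, 0.56)
```

With only the phone signal the engine would report absent; the higher-weight smartwatch signal flips the inference, which is exactly the dead-phone-battery case described above.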
The certification model
Following the Matter approach, DoSync uses self-certification with three tiers:
| Tier | Requirements |
|---|---|
| Basic | Connects, authenticates, publishes capability manifest |
| Standard | Responds to intents, sends events |
| Emergency | Emergency override + tamper-evident SHA-256 audit log |
The CLI generates a signed dosync-cert.json report. No manual approval needed for Basic and Standard. Emergency tier requires human review.
dosync-certify --host 192.168.1.50 --tier standard --output cert.json
The tamper-evident audit log is worth a separate mention: every entry includes the SHA-256 hash of the previous one, similar to a blockchain. If someone modifies any historical entry, the chain breaks from that point onward and the system detects it immediately.
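A hash chain like this fits in a few lines. The sketch below is illustrative, not the DoSync audit-log format: the entry fields and the all-zeros genesis value are assumptions.

```python
# Illustrative SHA-256 hash chain: each entry stores the hash of the previous
# one, so editing any historical entry breaks verification from there on.
import hashlib
import json

def entry_hash(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append(log: list, action: dict) -> None:
    prev = log[-1]["hash"] if log else "0" * 64  # assumed genesis value
    entry = {"action": action, "prev": prev}
    entry["hash"] = entry_hash({"action": action, "prev": prev})
    log.append(entry)

def verify(log: list) -> bool:
    prev = "0" * 64
    for e in log:
        if e["prev"] != prev:
            return False  # chain link broken
        if e["hash"] != entry_hash({"action": e["action"], "prev": e["prev"]}):
            return False  # entry contents were modified
        prev = e["hash"]
    return True

audit = []
append(audit, {"device": "lock-frontdoor-01", "act": "unlock"})
append(audit, {"device": "phone-family-01", "act": "call"})
print(verify(audit))                       # → True
audit[0]["action"]["act"] = "lock"         # tamper with history...
print(verify(audit))                       # → False: the chain detects it
```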
What's different from Matter, Zigbee, and Home Assistant
Matter solves device interoperability for app-controlled scenarios. It has no semantic layer, no emergency override, and no concept of AI intent. DoSync doesn't replace Matter — a device can be Matter-certified for app control and DoSync-certified for AI interaction simultaneously.
Home Assistant is a platform, not a protocol. It recently added MCP support for natural language, which is directionally similar, but it's adapting an existing command-based system rather than designing from first principles for AI communication.
DoSync is the only open protocol designed specifically for AI-to-device semantic communication, with transport agnosticism, emergency override, capability-based discovery, and a certifiable standard that any manufacturer can implement.
What's next
The Raspberry Pi hardware kit is on order. The next milestone is a physical demo: a real door lock opening in response to a semantic emergency intent. That demo, combined with the working certification CLI, is the artifact we're taking to hardware manufacturers and IoT investors.
The roadmap:
- Shipped — SQLite persistence · 7 working scenarios · REST API · Certification CLI 16/16
- Q2 2026 — GPIO adapter for Raspberry Pi · Physical door lock demo
- Q3 2026 — Home Assistant bridge · BLE transport adapter
- Q4 2026 — 3 certified partner devices · First manufacturer outreach
- 2027+ — FamilyOS integration · Generational AI memory layer
Try it now
git clone https://github.com/giulianireg-spec/dosync-protocol
cd dosync-protocol
pip install fastapi uvicorn
PYTHONPATH=. uvicorn server:app --host 0.0.0.0 --port 47200 --reload
# Open http://localhost:47200/docs
The interactive API docs let you register devices, fire intents, and see the semantic resolver work in real time — no hardware required.
If you're building IoT firmware, smart home devices, or AI home assistants, I'd love to hear your feedback. The RFC process is open and transport adapter contributions are welcome.
GitHub: github.com/giulianireg-spec/dosync-protocol
License: Apache 2.0
Contact: giulianireg@gmail.com
DoSync is part of a larger project: FamilyOS — a private, local, generational AI for the home. The best inheritance we can leave our children is knowledge. DoSync is the protocol that lets the home itself become part of that inheritance.