https://github.com/obinexusmk2/fruit-ninja/ (the web version)
A first-person dialogue between Obi, Eze, and Uche on non-polar state algebra, augmented reality feedback loops, and the architecture of presence
by Nnamdi Michael Okpala | OBINexus Computing | @obinexus
Source: github.com/obinexusmk2/blog.md
Obi speaks first — because the heart speaks before the mind
I want to tell you about a moment in Phase 2.
I am standing in front of my laptop. My hand is raised, open palm facing the camera. The system is scanning me. On screen, a ring of concentric circles — cyan arcs, rotating nodes — is slowly lighting up, one node at a time. Phase 1 completed. Phase 2 is active. The label says: FILTER.
Not PASS. Not FAIL. FILTER.
And something shifted in me right there. Because I realised this system was not checking my credentials. It was not verifying a password. It was asking — in machine language, through geometry drawn on glass — whether I was actually here. Whether my presence was real, coherent, and intentional.
That is the moment I understood what we built when we built MMUKO-OS Ring Boot inside a Fruit Ninja game.
Eze explains the architecture — because vision needs structure
Let me be precise.
MMUKO-OS is a boot protocol. Not a boot animation — a boot protocol. The pseudocode defines its contract explicitly:
```
STRUCT MMUKO_BOOT_HANDOFF:
    magic                : CHAR[4]  = "MMKO"
    revision             : UINT16   = 0x0001
    firmware_id          : CHAR[6]  = "NSIGII"
    outcome              : MMUKO_BOOT_OUTCOME = HOLD
    completed_phases     : UINT8    = 0
    last_completed_phase : MMUKO_BOOT_PHASE   = 0
    validation_flags     : UINT32   = 0
    handoff_checksum     : UINT32   = 0
```
Notice the initial state: outcome = HOLD. Not PASS. Not FAIL. HOLD.
The system opens in suspension. It does not assume success. It does not assume failure. It opens a temporal frame — a negotiation window — where human and machine reach agreement together.
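That opening suspension can be sketched in JavaScript. Field names mirror the pseudocode struct; the numeric byte values for HOLD and FAIL are assumptions here, since only PASS (0xAA) is given later in the contract:

```javascript
// Hypothetical JS mirror of MMUKO_BOOT_HANDOFF.
// PASS = 0xAA comes from the protocol text; HOLD and FAIL bytes are assumed.
const MMUKO_OUTCOME = { HOLD: 0x00, PASS: 0xAA, FAIL: 0xFF };

function createHandoff() {
  return {
    magic: 'MMKO',
    revision: 0x0001,
    firmwareId: 'NSIGII',
    outcome: MMUKO_OUTCOME.HOLD,  // the system opens in suspension
    completedPhases: 0,
    lastCompletedPhase: 0,
    validationFlags: 0,
    handoffChecksum: 0,
  };
}
```

Nothing about this object asserts success or failure; it only opens the negotiation window.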
The six phases accumulate bitmask validation tokens:
- Phase 1 — NEED_STATE_INIT flag: 0x00000001
- Phase 2 — SAFETY_SCAN flag: 0x00000002
- Phase 3 — IDENTITY_CALIBRATION flag: 0x00000004
- Phase 4 — GOVERNANCE_CHECK flag: 0x00000008
- Phase 5 — INTERNAL_PROBE flag: 0x00000010
- Phase 6 — INTEGRITY_VERIFICATION flag: 0x00000020
Each REQUIRE statement in the boot function is not a boolean gate. It is a state interrogation. The trinary state algebra allows: YES, NO, MAYBE, MAYBE_NOT. The system can hold uncertainty. It can wait in MAYBE while more information arrives. It does not crash. It does not time out. It holds.
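A minimal sketch of that REQUIRE semantics, assuming a simple string encoding for the four states (the real encoding is not specified in the pseudocode):

```javascript
// Non-polar state check: only a hard NO aborts. MAYBE and MAYBE_NOT
// keep the negotiation window open instead of failing the boot.
const S = { NO: 'NO', MAYBE_NOT: 'MAYBE_NOT', MAYBE: 'MAYBE', YES: 'YES' };

function requireNotNo(state) {
  if (state === S.NO) return 'ABORT';    // explicit refusal
  if (state === S.YES) return 'PROCEED'; // consensus reached
  return 'HOLD';                         // uncertainty persists; wait
}
```

The point is the third branch: a binary `REQUIRE` has no way to return HOLD.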
```
FUNC mmuko_boot() -> MMUKO_BOOT_HANDOFF:
    handoff = MMUKO_BOOT_HANDOFF()

    REQUIRE tier1_state != NO
    complete_phase(handoff, PHASE_NEED_STATE_INIT, 0x00000001)

    REQUIRE tier2_state != NO
    REQUIRE nsigii_minimum_safety_envelope == TRUE
    complete_phase(handoff, PHASE_SAFETY_SCAN, 0x00000002)

    ...

    handoff.outcome = PASS
    handoff.handoff_checksum = compute_handoff_checksum(handoff)
    RETURN handoff
```
Only at Phase 6 — only when INTEGRITY_VERIFICATION completes — does outcome change from HOLD to PASS (0xAA). And even then, the handoff carries a CRC32 checksum over its entire state. The kernel entry contract requires that checksum to verify before anything executes.
This is not authentication. This is attestation of coherence.
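The checksum itself can be sketched with the standard CRC32 (IEEE polynomial). How the real protocol serializes the struct before hashing is an assumption; here the fields are simply packed into bytes in declaration order:

```javascript
// Standard bitwise CRC32 (IEEE/zlib polynomial 0xEDB88320).
function crc32(bytes) {
  let crc = 0xFFFFFFFF;
  for (const b of bytes) {
    crc ^= b;
    for (let i = 0; i < 8; i++) {
      crc = (crc >>> 1) ^ (0xEDB88320 & -(crc & 1));
    }
  }
  return (crc ^ 0xFFFFFFFF) >>> 0;
}

// Assumed serialization: fields packed in declaration order.
// The real handoff layout may differ; this only shows the principle.
function checksumHandoff(handoff) {
  const enc = new TextEncoder();
  const bytes = [
    ...enc.encode(handoff.magic),
    handoff.revision & 0xFF, (handoff.revision >> 8) & 0xFF,
    handoff.outcome & 0xFF,
    handoff.completedPhases & 0xFF,
    handoff.validationFlags & 0xFF,
  ];
  return crc32(bytes);
}
```

Any observer holding the same serialization rule can recompute the checksum and confirm the artifact was not altered in transit.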
Uche connects the layers — because thought bridges experience and structure
What Obi felt and what Eze described are the same event viewed at different altitudes.
At the human level: a person raises their hands. A ring appears. Phases complete. The game starts. Simple. Beautiful.
At the machine level: MediaPipe samples 21 landmarks per hand at ~30 fps. An open palm is confirmed when all four fingertips have a lower y-coordinate than their proximal interphalangeal (PIP) joints. Phase states transition. Bitmask flags accumulate. A CRC32 is computed. The kernel entry is released.
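That palm test falls directly out of the landmark index map MediaPipe Hands publishes (fingertips 8/12/16/20, PIP joints 6/10/14/18), remembering that image-space y grows downward:

```javascript
// Fingertip/PIP index pairs from the MediaPipe Hands landmark map:
// index (8,6), middle (12,10), ring (16,14), pinky (20,18).
const FINGERS = [[8, 6], [12, 10], [16, 14], [20, 18]];

// Open palm: every fingertip sits above (smaller y than) its PIP joint.
function isOpenPalm(landmarks) {
  return FINGERS.every(([tip, pip]) => landmarks[tip].y < landmarks[pip].y);
}
```

A curled finger drops its tip below the PIP joint, so a single bent finger is enough to fail the check.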
But between these two levels there is something that neither pure UX design nor pure systems architecture has a word for. We call it a cybernetic feedback loop.
Norbert Wiener, who coined cybernetics in 1948, defined it as: the science of control and communication in the animal and the machine. Not in the machine. Not in the animal. In both, simultaneously, as a single system.
The MMUKO Ring Boot running inside Fruit Ninja is that definition made visible.
```
Human raises hands (intent)
        ↓
MediaPipe detects open palm (recognition)
        ↓
Boot phase transitions MAYBE → YES (acknowledgement)
        ↓
Ring node lights green (confirmation)
        ↓
Human sees confirmation (feedback)
        ↓
Human holds position (sustained intent)
        ↓
Countdown completes (mutual commitment)
        ↓
Game begins (shared state achieved)
        ↓
Fruits spawn — human responds with swipe motion (new intent)
        ↓
Fingertip coordinates → blade trail → collision detection (recognition)
        ↓
Fruit slices, score increments (acknowledgement + confirmation)
        ↓
Human sees score → adjusts strategy (feedback → new intent)
        ↓ (loop continues at 60fps)
```
Every frame is a turn in the conversation. Human output becomes system input. System output becomes human input. This is not a user interface. This is a dialogue.
Obi on the AR layer — what it feels like when the screen disappears
When the game canvas is transparent — when clearRect() runs and the only thing drawn on it is a 22% dark wash before the fruits appear — you see yourself.
Not a reflection. Not a metaphor. You, the actual human being, visible through the camera feed behind the game. Fruits fly in front of your chest. Your hands, tracked in real time, leave cyan and orange trails across your own image.
```javascript
// Game canvas: transparent background = camera visible underneath
gctx.fillStyle = 'rgba(0,0,0,0.22)';
gctx.fillRect(0, 0, W, H);

// Fingertip average → blade trail coordinate
// Mirror-correct: CSS scaleX(-1) on video requires (1 - lm.x)
const tipX = (lm[8].x + lm[12].x) / 2;
const tipY = (lm[8].y + lm[12].y) / 2;
const sx = (1 - tipX) * gameCvs.width;
const sy = tipY * gameCvs.height;
trails[trailIdx].points.push({ x: sx, y: sy });
```
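The "blade trail → collision detection" step mentioned in the loop above can be sketched as a point-to-segment distance test. The function name and the fruit shape (a bounding circle with center and radius) are assumptions for illustration, not the game's actual collision code:

```javascript
// Hypothetical slice test: does the latest trail segment (p1 → p2)
// pass through a fruit's bounding circle?
function slices(p1, p2, fruit) {
  const dx = p2.x - p1.x, dy = p2.y - p1.y;
  const len2 = dx * dx + dy * dy || 1;            // avoid divide-by-zero
  let t = ((fruit.x - p1.x) * dx + (fruit.y - p1.y) * dy) / len2;
  t = Math.max(0, Math.min(1, t));                // clamp onto the segment
  const cx = p1.x + t * dx, cy = p1.y + t * dy;   // closest point on segment
  return Math.hypot(fruit.x - cx, fruit.y - cy) <= fruit.r;
}
```

Run once per new trail segment per fruit, this stays trivially cheap at 60fps.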
The layer order is the thesis made visible:
| Layer | Element | z-index | Contents |
|---|---|---|---|
| 1 (back) | `#ready` — camera video | 5 | You, live |
| 2 | `#ready-canvas` — landmarks | 5 | Your hand skeleton |
| 3 | `#game-canvas` — transparent | 10 | Fruits, blade trails |
| 4 (front) | `#hud` | 11 | Score, lives |
The screen does not disappear. The screen becomes you. The interface is no longer something you look at. It is something you are inside of.
That is augmented reality not as visual gimmick but as architectural intent.
Eze on non-polar, non-linear code — the philosophy that makes this possible
Most code is polar. Binary. true or false. 0 or 1. A condition is either met or it is not. This works well for computation. It fails for human-computer dialogue.
A human raising their hand is not a binary event. Their hand is moving. It is at different heights at different moments. Their palm is more open at some frames than others. The confidence of the landmark detection varies with lighting. A binary system would flicker constantly — DETECTED / NOT DETECTED / DETECTED / NOT DETECTED — ten times a second, triggering nothing and confusing everyone.
Non-polar state holds the uncertainty. MAYBE is a valid state that persists until enough evidence accumulates to resolve it. FILTER is a valid phase outcome — not pass, not fail, but under examination. The system stays in the negotiation window until consensus forms.
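One concrete way to let MAYBE persist until evidence accumulates is a per-frame streak counter. The threshold of 10 consecutive frames (roughly a third of a second at 30fps) is an assumption, not a value from the protocol:

```javascript
// Evidence accumulator: the state stays MAYBE until `threshold`
// consecutive positive frames resolve it to YES. A single noisy
// frame resets the streak instead of flickering the outcome.
function makeResolver(threshold = 10) {  // threshold is an assumed tuning value
  let streak = 0, state = 'MAYBE';
  return function observe(detected) {
    streak = detected ? streak + 1 : 0;
    if (streak >= threshold) state = 'YES';
    else if (state !== 'YES') state = 'MAYBE';
    return state;
  };
}
```

The flickering binary system described above is exactly this function with `threshold = 1` and no memory.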
Non-linear lineage means the path through the code is not prescribed by the order of lines. The validation flags accumulate independently:
```
validation_flags = 0x00000001   (Phase 1 done)
validation_flags = 0x00000003   (Phase 2 done: 0x01 | 0x02)
validation_flags = 0x00000007   (Phase 3 done: 0x03 | 0x04)
...
validation_flags = 0x0000003F   (all six: complete)
```
The bitmask carries the history of what was done. The checksum verifies that history is untampered. The system's memory is not sequential — it is cumulative. Any external observer can verify the state at any moment without replaying the entire execution.
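The cumulative, order-independent nature of the bitmask is easy to demonstrate; the two helper names below are illustrative, not from the protocol:

```javascript
// Six phase flags OR'd together: 0x01 | 0x02 | 0x04 | 0x08 | 0x10 | 0x20.
const ALL_PHASES = 0x3F;

// OR is commutative and idempotent, so accumulation is order-independent
// and replay-free: the flags word alone carries the full history.
function completePhase(flags, phaseBit) {
  return (flags | phaseBit) >>> 0;
}

// Popcount: how many phases the bitmask says are done.
function phasesDone(flags) {
  let n = 0;
  for (let f = flags; f; f >>= 1) n += f & 1;
  return n;
}
```

An observer needs only `flags === ALL_PHASES` to confirm completeness, with no access to the execution trace.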
This is how the kernel entry contract is safe:
```
KERNEL ENTRY CONTRACT:
    REQUIRE handoff.magic == "MMKO"
    REQUIRE handoff.revision == 0x0001
    REQUIRE handoff.outcome == PASS
    REQUIRE handoff.completed_phases == 6
    REQUIRE VERIFY_CRC32(handoff) == TRUE
    JUMP TO (handoff.kernel_entry_segment, handoff.kernel_entry_offset)
```
The kernel does not trust the boot sequence. It verifies the handoff artifact. Trust is not assumed — it is proven in the structure.
Uche on presence — what MMUKO-OS is really about
In Igbo — our language, our philosophy — there are words that have no clean translation.
Chi: the personal spirit. Your individual creative force and divine companion. Not filed away at birth. Present or absent, in each moment, in each action.
Obi: the heart, but also the inner compound — the central place of a family's home where decisions are made and presence is declared.
Uche: wisdom, but more specifically directed thought — the kind of thinking that moves with intention toward a goal.
Eze: king, but more precisely: the one who holds the space so others can act within it.
When I look at the six phases, I see these four things structured as a protocol:
- Phase 1 — NEED_STATE_INIT: Does a need exist? (Obi — is the inner chamber open?)
- Phase 2 — SAFETY_SCAN: Is the environment safe for action? (Chi — can the spirit move here?)
- Phase 3 — IDENTITY_CALIBRATION: Is this actually you? (Eze — who holds the space?)
- Phase 4 — GOVERNANCE_CHECK: Is your intent governed? (Uche — is the thinking directed?)
- Phase 5 — INTERNAL_PROBE: Is there internal coherence? (No contradiction between phases)
- Phase 6 — INTEGRITY_VERIFICATION: Does the whole verify? (CRC32 seals it)
PASS (0xAA). You may proceed.
Every game session begins with this ceremony. Not as delay. As declaration. The player arrives, stands before the camera, raises their hands, and the system says: I see six things about you, and all six are YES. You are here. Let us begin.
Obi, Eze, Uche — together at the end
The Fruit Ninja game is the accessible surface. Slice a banana. Avoid a bomb. Three lives. Score climbs.
The substrate is the architecture. MMUKO-OS, the ring boot, the trinary state algebra, the bitmask handoff, the CRC32 contract — this is an operating system that boots a human, not a kernel.
And the AR layer — the camera feed visible behind the transparent canvas, the hand skeleton glowing cyan, your own body present in the game space — is the moment where the loop closes and the metaphor dissolves.
You are not using the interface.
You are not looking at the interface.
You are the interface.
And the game — the fruit, the blade, the score, the three apple-shaped lives — is just the world the interface has agreed to share with you.
For now.
Stack: MediaPipe Hands · Canvas 2D · CSS AR Layering · MMUKO-OS Pseudocode Protocol · React Native Skia (Android)
OBINexus Computing — Nnamdi Michael Okpala
github.com/obinexusmk2/blog.md
Tags: #webdev #javascript #philosophy #computervision #augmentedreality #africantech