Most people use keyboards, mice, or touchscreens.
I decided that was too boring.
So I built a web-based Fruit Ninja clone that you control using your hands in front of a camera — no controllers, no touch, just motion.
This isn’t just a game. It’s an experiment in what I call a “cybernetic interface” — where your body becomes the input device.
🚀 The Project
👉 Repo: https://github.com/obinexusmk2/fruit-ninja
👉 Web directory: /www
This version runs directly in the browser and uses hand tracking + gesture input to let you slice fruit in real time.
No downloads required. Just:
- Open the app
- Enable your camera
- Raise your hands
- Start slicing
Yes, you will look ridiculous. That’s part of the experience.
🧠 What Makes This Different?
Traditional games:
- Tap
- Click
- Press buttons
This project:
- Tracks your hands
- Maps motion → input
- Uses gestures as commands
You’re not pressing a button to slice fruit —
you’re performing the action physically.
That’s the key idea behind the “cybernetic interface” concept.
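As a rough sketch of what "motion → input" can mean (this is an illustration, not the project's actual code), a swipe can be classified from two consecutive hand positions:

```javascript
// Hypothetical sketch: classify a swipe gesture from two hand positions.
// Positions are normalized camera coordinates in [0, 1]; the distance
// threshold is an assumed tuning value, not taken from the repo.
function classifySwipe(prev, curr, minDistance = 0.08) {
  const dx = curr.x - prev.x;
  const dy = curr.y - prev.y;
  const distance = Math.hypot(dx, dy);
  if (distance < minDistance) return "none"; // too small to count
  // The dominant axis decides the direction.
  if (Math.abs(dx) > Math.abs(dy)) {
    return dx > 0 ? "right" : "left";
  }
  return dy > 0 ? "down" : "up";
}
```

The point isn't the geometry; it's that the "button press" has been replaced by a physical motion the system interprets.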
🎮 How It Works
From the gameplay demo:
- Raise both hands to start calibration
- The system detects your left hand and right hand
- Once both are visible, a countdown begins
- The game starts
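That calibration flow boils down to a tiny state machine. A minimal sketch, with assumed names and timings (not the repo's actual implementation):

```javascript
// Hypothetical calibration state machine: wait until both hands are
// visible, run a countdown, then start the game.
const COUNTDOWN_FRAMES = 90; // ~3 seconds at 30 fps (assumed)

function createCalibration() {
  return { state: "waiting", framesLeft: COUNTDOWN_FRAMES };
}

// Call once per frame with the current hand-detection result.
function updateCalibration(cal, leftVisible, rightVisible) {
  if (cal.state === "waiting") {
    if (leftVisible && rightVisible) cal.state = "countdown";
  } else if (cal.state === "countdown") {
    if (!(leftVisible && rightVisible)) {
      // A hand dropped out of frame: restart the wait.
      cal.state = "waiting";
      cal.framesLeft = COUNTDOWN_FRAMES;
    } else if (--cal.framesLeft <= 0) {
      cal.state = "playing";
    }
  }
  return cal.state;
}
```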
Gameplay rules:
- Slice fruit → +1 point
- Hit a bomb → instant game over
- Miss fruit → lose a life (3 lives total)
- Use both hands for better performance
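Those rules map directly onto a small piece of game state. A sketch (names assumed, not the repo's code):

```javascript
// Hypothetical game state implementing the rules above:
// slice = +1 point, bomb = instant game over, miss = lose one of 3 lives.
function createGame() {
  return { score: 0, lives: 3, over: false };
}

function onSlice(game) { if (!game.over) game.score += 1; }
function onBomb(game)  { game.over = true; }
function onMiss(game)  {
  if (!game.over && --game.lives <= 0) game.over = true;
}
```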
As noted during testing:
“The faster you go, the less accurate it is — so you have to balance speed and control.”
So yes, you can panic-slap the air… but the game will punish you for it.
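One common way to trade a little responsiveness for a lot of stability is to smooth the tracked positions. A sketch using exponential smoothing (the smoothing factor is an assumed tuning value, not something taken from the repo):

```javascript
// Hypothetical exponential smoothing for hand positions: a higher alpha
// follows fast motion more closely, a lower alpha filters out jitter.
function createSmoother(alpha = 0.5) {
  let last = null;
  return function smooth(point) {
    if (last === null) {
      last = { x: point.x, y: point.y };
    } else {
      last.x = alpha * point.x + (1 - alpha) * last.x;
      last.y = alpha * point.y + (1 - alpha) * last.y;
    }
    return { x: last.x, y: last.y };
  };
}
```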
🧩 The “Cybernetic Interface” Idea
This project explores a simple idea:
What if input devices disappear?
Instead of:
- Keyboard → typing
- Mouse → clicking
You get:
- Hands → interaction
- Motion → commands
It’s not full cybernetics (no brain implants yet, calm down),
but it’s a step toward natural human input systems.
Your body becomes part of the computation loop.
⚙️ Tech Overview
The web version (/www) includes:
- Hand tracking (camera-based)
- Real-time gesture mapping
- Canvas/graphics rendering
- Game loop with physics-like interactions
- UI overlay for calibration and feedback
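The "physics-like interactions" can be sketched as a fixed-timestep update that launches fruit upward and lets gravity pull it back down. The constants here are assumed for illustration, not taken from the repo:

```javascript
// Hypothetical projectile update for one fruit, in normalized screen
// units (y grows downward, as on a canvas).
const GRAVITY = 2.0; // downward acceleration (screen heights per s^2), assumed
const DT = 1 / 60;   // fixed timestep for a 60 fps loop

function stepFruit(fruit) {
  fruit.vy += GRAVITY * DT; // gravity accelerates downward (+y)
  fruit.x += fruit.vx * DT;
  fruit.y += fruit.vy * DT;
  return fruit;
}
```

In the real game loop, a step like this would run every frame for each on-screen fruit before the canvas redraw.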
Also:
- Supports two-hand input
- Includes a calibration system
- Uses visual hand landmarks for tracking
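On top of those landmarks, a slice can be detected by treating the fingertip's movement between two frames as a line segment and checking whether it passes through a fruit. A hypothetical sketch using point-to-segment distance:

```javascript
// Hypothetical slice test: does the fingertip's path from p0 to p1
// pass within `radius` of the fruit's center? All coordinates are
// normalized screen positions; this is an illustration, not repo code.
function sliceHits(p0, p1, fruit, radius) {
  const dx = p1.x - p0.x, dy = p1.y - p0.y;
  const lenSq = dx * dx + dy * dy;
  // Project the fruit center onto the segment, clamped to [0, 1].
  let t = 0;
  if (lenSq > 0) {
    t = ((fruit.x - p0.x) * dx + (fruit.y - p0.y) * dy) / lenSq;
    t = Math.max(0, Math.min(1, t));
  }
  const cx = p0.x + t * dx, cy = p0.y + t * dy;
  return Math.hypot(fruit.x - cx, fruit.y - cy) <= radius;
}
```

Checking the path rather than a single point matters: a fast swipe can jump clear across a fruit between frames without any one sample landing inside it.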
😅 Challenges
Because of course it’s not perfect:
- Accuracy drops with fast motion
- Lighting affects detection
- Camera positioning matters
- One-hand mode is… chaotic
But honestly? That’s part of the fun.
🔥 Why This Matters
This isn’t about cloning Fruit Ninja.
It’s about exploring:
- Touchless interaction
- Gesture-based control systems
- Human-centered input design
Today it's slicing fruit. Tomorrow, the same ideas could apply to:
- Controlling apps
- Navigating interfaces
- Gaming without controllers
Or at the very least…
looking like a wizard in front of your laptop.
🎯 Try It Yourself
Clone the repo and run the web version:

```shell
git clone https://github.com/obinexusmk2/fruit-ninja
cd fruit-ninja/www
npm install
npm run dev
```
Then:
- Open in browser
- Allow camera
- Raise both hands
- Embrace the chaos
🧪 Final Thoughts
This project is an early experiment.
It’s messy, imperfect, and occasionally makes you look like you're fighting invisible bees.
But it proves something important:
We don’t need traditional controllers to interact with software.
Sometimes, all you need…
is your hands and questionable confidence.