Shivam Kumar

Flappy Hand in 30 min with copilot cli

GitHub Copilot CLI Challenge Submission

This is a submission for the GitHub Copilot CLI Challenge

What I Built

I built FlappyHand, a hands-free interactive game where you control the character using hand gestures captured by your webcam. Inspired by the classic Flappy Bird, this game uses computer vision to track your hand state:

  • Open Palm: The character flaps and flies upward.
  • Closed Fist: The character descends due to gravity.
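The flap/descend mechanic above can be sketched as a small per-frame physics update. This is a minimal illustration with made-up constants (gravity, flap impulse, fall-speed cap), not the game's actual tuning:

```typescript
// Sketch of the gesture-driven physics step.
// All constants are illustrative, not the game's real values.
const GRAVITY = 0.5;      // downward acceleration per frame
const FLAP_IMPULSE = -8;  // upward velocity while the palm is open (y grows downward)
const MAX_FALL_SPEED = 12;

interface Bird {
  y: number;
  velocity: number;
}

function step(bird: Bird, palmOpen: boolean): Bird {
  // Open palm: flap upward; closed fist: let gravity pull the bird down.
  const velocity = palmOpen
    ? FLAP_IMPULSE
    : Math.min(bird.velocity + GRAVITY, MAX_FALL_SPEED);
  return { y: bird.y + velocity, velocity };
}
```

In a real game loop, `step` would run once per `requestAnimationFrame` tick with the latest gesture reading.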

The application is built with Next.js and React, leveraging MediaPipe for real-time hand tracking directly in the browser. It features a scoring system, a high-score leaderboard, and smooth canvas-based rendering.
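MediaPipe's hand model reports 21 landmarks per hand in normalized coordinates (y grows downward), with fingertips at indices 8/12/16/20 and the corresponding PIP joints at 6/10/14/18. A common heuristic for open-palm vs. fist is to count how many fingertips sit above their PIP joints; here is a hypothetical sketch of that idea (not the project's actual classifier):

```typescript
// Hypothetical open-palm vs. fist heuristic over MediaPipe's 21 hand
// landmarks. Indices follow MediaPipe's hand model; thresholds are guesses.
interface Landmark {
  x: number;
  y: number;
}

function isOpenPalm(landmarks: Landmark[]): boolean {
  const fingers: [number, number][] = [
    [8, 6],   // index finger: tip vs. PIP joint
    [12, 10], // middle
    [16, 14], // ring
    [20, 18], // pinky
  ];
  // A finger counts as extended when its tip is above its PIP joint
  // (smaller y in normalized image coordinates).
  const extended = fingers.filter(([tip, pip]) => landmarks[tip].y < landmarks[pip].y);
  // Treat the hand as an open palm when most fingers are extended.
  return extended.length >= 3;
}
```

The thumb is deliberately ignored here since its extension is better judged along the x-axis and depends on handedness.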

Demo

Web app: https://sk-flappy.vercel.app/

Simple rule: open your palm to fly up, close your fist to descend.


My Experience with GitHub Copilot CLI

Building this project with GitHub Copilot CLI was an incredibly streamlined experience.

I really enjoyed using it: the workflow was smooth, and it handled tasks reliably. I started the project with only a vague prompt, and after three or four minor tweaks, the full app was up and running, effectively turning my idea into reality.

I used different models at different stages of the process:

  • For refining the idea and creating a solid plan, I used Gemini 3 Pro (Preview).
  • For development, I used Claude Opus 4.5.
  • For tuning, I used GPT-5.2-Codex.

Top comments (1)

Ryan Lay

CV controllers are under-explored! Cool project!