DEV Community

Shahen

Building a Horror Typing Game Where Blinking Kills You

The Pitch

Imagine you're typing as fast as you can, trying to complete a horror story. But there's a catch. Every time you blink, a monster teleports closer. And you can't stop blinking.

Core Pillars

  • A full typing game with character-by-character validation
  • Daily AI-generated stories and typing challenges
  • A Next.js frontend with real-time webcam face detection
  • Calibration system for accurate blink detection
  • A Unity WebGL game for a more immersive experience
  • Two-way communication between React and Unity
  • FMOD audio integration for that horror atmosphere
  • Leaderboard with real-time player high scores

The Tech Stack

Frontend:

  • Next.js 15 with App Router
  • React 19
  • TypeScript
  • Tailwind CSS v4
  • shadcn/ui components
  • Zustand for state management
  • react-unity-webgl package
  • Google MediaPipe

Backend:

  • Convex backend
  • Authentication
  • Anthropic AI SDK

Game Engine:

  • Unity 6
  • FMOD Studio for audio

Development Tools:

  • Kiro IDE with specs, steering rules, agent hooks, and MCP servers

The Team

We’re two brothers from Mauritius with a shared passion for gaming and programming. Over the years, we’ve spent countless hours experimenting, learning, and building, whether it’s software, web platforms, or the games we’ve always dreamed of creating. Our goal is simple: make things people genuinely enjoy using and playing.


Why Kiro Specs Are a Game Changer

Hackathons are controlled chaos - limited time, big ideas, and the constant temptation to just “wing it.” That works… right up until you’re 12 hours deep, your blink detector won’t stop firing false positives, your Unity build keeps crashing, and you can’t remember why past-you thought that monster controller architecture was a good idea.

Then we started using Kiro specs. Here’s what changed.

Every feature we built followed the same pattern:

  1. requirements.md - What are we building? What does "done" look like?
  2. design.md - How are we building it? What are the algorithms?
  3. tasks.md - What are the exact steps to implement it?

This might sound like overkill for a hackathon, but it actually saved us time. When you're sleep-deprived at 2 AM and trying to remember how the Eye Aspect Ratio calculation works, having it documented in design.md is a lifesaver.

Real Example: The Blink Detector Spec

Let's look at how we specced out the blink detection system. This was one of our most complex features.

From requirements.md:

### Requirement 3: Eye Aspect Ratio Calculation

**User Story:** As a developer, I want the system to calculate Eye Aspect Ratio (EAR) 
from facial landmarks, so that eye openness can be quantified.

#### Acceptance Criteria

1. THE system SHALL use MediaPipe face mesh landmark indices for left eye 
   (33, 133, 159, 145, 160, 144) and right eye (362, 263, 386, 374, 385, 380)
2. WHEN facial landmarks are detected, THE system SHALL calculate EAR using 
   the formula: (vertical1 + vertical2) / (2.0 * horizontal)
3. THE system SHALL calculate Euclidean distance in 3D space (x, y, z) for landmark pairs

Notice the EARS format (Easy Approach to Requirements Syntax), with its WHEN/SHALL structure. It's precise. It's testable. When we implemented the hook, we knew exactly what success looked like.
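For illustration, here's a sketch of how that requirement could translate into code. The pairings below (33/133 as the left eye's horizontal corners, 159/145 and 160/144 as its vertical pairs, with the mirrored set for the right eye) are our reading of the spec, and the `Landmark` shape is assumed to match MediaPipe's normalized face-mesh output; this is not the game's actual hook.

```typescript
interface Landmark { x: number; y: number; z: number; }

// Landmark indices from the spec; the horizontal/vertical grouping is our interpretation.
const LEFT_EYE  = { horizontal: [33, 133],  vertical1: [159, 145], vertical2: [160, 144] };
const RIGHT_EYE = { horizontal: [362, 263], vertical1: [386, 374], vertical2: [385, 380] };

// Euclidean distance in 3D space, per acceptance criterion 3.
function dist3d(a: Landmark, b: Landmark): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// EAR = (vertical1 + vertical2) / (2.0 * horizontal), per acceptance criterion 2.
function eyeAspectRatio(landmarks: Landmark[], eye: typeof LEFT_EYE): number {
  const v1 = dist3d(landmarks[eye.vertical1[0]], landmarks[eye.vertical1[1]]);
  const v2 = dist3d(landmarks[eye.vertical2[0]], landmarks[eye.vertical2[1]]);
  const h  = dist3d(landmarks[eye.horizontal[0]], landmarks[eye.horizontal[1]]);
  return (v1 + v2) / (2.0 * h);
}
```

An open eye yields a roughly constant EAR; as the lids close, the vertical distances shrink toward zero while the horizontal distance stays put, so the ratio drops sharply, which is exactly the signal a blink detector needs.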

From design.md:

### P4: Blink Detection State Machine
**Property:** A valid blink follows the state machine: open → closing → closed (2-15 frames) → opening → open
**Verification:** State transitions follow defined rules, blink count only increments on complete cycle
**Covers:** AC4.1, AC4.2, AC4.3

The design doc gave us the algorithm. We didn't have to figure out blink detection logic on the fly. We had already thought through edge cases like "what if someone holds their eyes closed for 3 seconds?" (that's drowsiness, not a blink).
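A minimal sketch of that state machine, assuming a fixed EAR threshold (in the real game the threshold comes from the calibration step), with the 2–15 frame window from the design doc:

```typescript
type BlinkState = "open" | "closing" | "closed" | "opening";

const EAR_THRESHOLD = 0.21;   // assumed value; the game derives this from calibration
const MIN_CLOSED_FRAMES = 2;  // shorter dips are camera noise
const MAX_CLOSED_FRAMES = 15; // longer holds are drowsiness, not a blink

class BlinkDetector {
  state: BlinkState = "open";
  blinkCount = 0;
  private closedFrames = 0;

  // Call once per frame with the current (e.g. averaged left/right) EAR.
  update(ear: number): void {
    const eyesClosed = ear < EAR_THRESHOLD;
    switch (this.state) {
      case "open":
        if (eyesClosed) this.state = "closing";
        break;
      case "closing":
        // Require a second closed frame before committing to "closed".
        if (eyesClosed) { this.state = "closed"; this.closedFrames = 1; }
        else this.state = "open";
        break;
      case "closed":
        if (eyesClosed) {
          this.closedFrames++;
          if (this.closedFrames > MAX_CLOSED_FRAMES) {
            // Eyes held shut too long: treat as drowsiness, don't count a blink.
            this.state = "open";
            this.closedFrames = 0;
          }
        } else {
          this.state = "opening";
        }
        break;
      case "opening":
        if (!eyesClosed) {
          // Blink count only increments on a complete open→closed→open cycle.
          if (this.closedFrames >= MIN_CLOSED_FRAMES) this.blinkCount++;
          this.state = "open";
          this.closedFrames = 0;
        } else {
          this.state = "closed";
          this.closedFrames++;
        }
        break;
    }
  }
}
```

The payoff of the intermediate states is robustness: a single noisy frame below the threshold never gets past "closing", so the count only moves on a genuine full cycle.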

To be continued.
