empsoft
I Built a Focus-Scoring Browser Using Camera AI, Claude, and MediaPipe

Social media and online content had already saturated our attention. Then AI arrived and multiplied the volume of information and parallel tasks by orders of magnitude. In this world, I wanted to bring the value of focus I learned from meditation practice into a tool anyone could use — a browser that watches how you browse the web and shows you, in real time, how focused you are.

That's Gaze Browser — a mobile browser that scores your concentration from 0 to 100 using your front camera and AI face analysis.

The Problem I Wanted to Solve

We check our phones dozens of times a day. We open a browser to look something up, get distracted by a notification, click a link, and 20 minutes later we've forgotten what we originally searched for.

I wanted to build something that gently reminds you: "Hey, are you still focused?" — not by blocking content, but by reflecting your own behavior back at you.

How It Works

Gaze Browser uses Google MediaPipe FaceLandmarker to detect 478 facial landmark points in real time via your front camera. It then analyzes:

  • Gaze direction — Are you looking at the screen?
  • Blink rate — Are you alert or drowsy?
  • Head pose — Are you tilting or turning away?
  • Facial muscle tension — Are you engaged or relaxed?
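As a concrete example of one of these signals, blink detection is commonly done with the eye aspect ratio (EAR): the ratio of the eyelid's vertical openings to the eye's width, computed from six of the eye landmarks. This is a minimal Kotlin sketch of that standard technique — the landmark indices and the 0.2 threshold are illustrative assumptions, not the app's actual values:

```kotlin
import kotlin.math.hypot

// One normalized landmark as produced by a face-landmark model (x, y in [0, 1]).
data class Landmark(val x: Float, val y: Float)

// Eye aspect ratio (EAR): vertical eyelid openings divided by eye width.
// Falls toward 0 as the eye closes; a sustained low value indicates a blink.
// Expects 6 landmarks ordered: corner, upper, upper, corner, lower, lower.
fun eyeAspectRatio(p: List<Landmark>): Float {
    require(p.size == 6) { "expected 6 eye landmarks" }
    fun d(a: Landmark, b: Landmark) =
        hypot((a.x - b.x).toDouble(), (a.y - b.y).toDouble()).toFloat()
    // (p[1], p[5]) and (p[2], p[4]) are the vertical pairs; p[0], p[3] the corners.
    return (d(p[1], p[5]) + d(p[2], p[4])) / (2f * d(p[0], p[3]))
}

// 0.2 is a commonly cited EAR threshold; the real cutoff would be tuned per user.
fun isBlinking(ear: Float, threshold: Float = 0.2f): Boolean = ear < threshold
```

Counting how often `isBlinking` fires per minute gives the blink rate that feeds the alertness signal.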

These signals are combined using principles from cognitive science and behavioral analysis to produce a single focus score from 0 to 100, displayed as a real-time floating widget while you browse.
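One simple way to combine such signals is a weighted blend of normalized sub-scores, clamped to 0–100. The sketch below is purely illustrative — the weights, sub-score definitions, and field names are my assumptions, not Gaze Browser's actual tuning:

```kotlin
// Each sub-score is pre-normalized to 0..1 by earlier pipeline stages.
data class FocusSignals(
    val gazeOnScreen: Float,  // fraction of recent frames with gaze on screen
    val blinkScore: Float,    // 1.0 = alert blink rate, 0.0 = drowsy
    val headPoseScore: Float, // 1.0 = facing the screen squarely
    val tensionScore: Float   // engagement inferred from facial muscle activity
)

// Hypothetical weights; a real system would tune these against labeled sessions.
fun focusScore(s: FocusSignals): Int {
    val raw = 0.40f * s.gazeOnScreen +
              0.20f * s.blinkScore +
              0.25f * s.headPoseScore +
              0.15f * s.tensionScore
    return (raw * 100f).toInt().coerceIn(0, 100)
}
```

In practice a score like this would also be smoothed over a short window so the floating widget doesn't flicker frame to frame.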

Key Technical Decisions

  • Native AI execution: Swift on iOS, Kotlin on Android — no web-based ML overhead
  • Flutter for UI: Cross-platform UI with platform channels for loose coupling between AI and interface
  • Personal calibration: The app learns your baseline focus posture during an initial calibration phase
  • Fully offline: All AI processing happens on-device. No data leaves your phone.
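To make the calibration idea concrete, here is a minimal sketch of one possible approach: collect samples of a signal during a warm-up phase, then score live readings relative to that personal baseline. The class and its linear normalization are hypothetical, not the app's actual calibration logic:

```kotlin
// Hypothetical personal-baseline calibrator: warm-up samples establish a
// baseline, and later readings are scored relative to it.
class Calibrator {
    private val samples = mutableListOf<Float>()

    fun addSample(value: Float) {
        samples += value
    }

    // Mean of the warm-up samples; 0 until any sample arrives.
    val baseline: Float
        get() = if (samples.isEmpty()) 0f else samples.sum() / samples.size

    // Map a live reading to 0..1 relative to the baseline:
    // 1.0 at or above baseline, falling off linearly below it.
    fun normalize(value: Float): Float =
        if (baseline == 0f) 0f else (value / baseline).coerceIn(0f, 1f)
}
```

A per-signal calibrator like this lets the same scoring weights work for users with very different resting postures and blink rates.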

The Tech Stack

| Layer | Technology |
| --- | --- |
| AI/ML | Google MediaPipe FaceLandmarker |
| iOS | Native Swift + Platform Channels |
| Android | Native Kotlin + Platform Channels |
| UI Framework | Flutter |
| Testing | Flutter integration_test + Fastlane |
| Dev Partner | Anthropic Claude (Claude Code) |

Building with Claude as My Dev Partner

I built Gaze Browser solo, but not alone. Claude (via Claude Code) was my constant development partner throughout the project. From architecture decisions to debugging platform channel issues between Flutter and native code, Claude helped me move fast without sacrificing quality.

Some specific ways Claude helped:

  • Designing the scoring algorithm based on cognitive science literature
  • Debugging iOS-specific MediaPipe integration issues
  • Writing and reviewing Flutter widget tests
  • Optimizing the real-time rendering pipeline to maintain 60fps while running face analysis

What I Learned

  1. Face analysis is surprisingly expressive — even small changes in blink rate or micro-expressions correlate with attention shifts
  2. Gamification works — showing a score triggers positive reinforcement and self-awareness
  3. Privacy-first AI is possible — running everything on-device means zero data compromise
  4. Solo development with AI assistance is a new paradigm — Claude Code made a solo project feel like a team effort

Try It Out

Gaze Browser is available now on the App Store:

👉 Download Gaze Browser

I'd love to hear your feedback, feature requests, or questions about the technical implementation. Drop a comment below!
