Anastasiia Chestnykh
I found out EMDR apps charge $80/month for a CSS animation. So I built a free one with eye tracking.

A few months ago I started looking into EMDR therapy. It was helping, but every app I found was either full of ads, painfully basic, or charging $80/month for what amounts to a dot moving across a screen. Then I remembered I actually know how to build things.

So I started building my own tool. And then something unexpected happened.

I saw a video of an Indian classical dancer performing expressive eye movements. I noticed one of them looked exactly like EMDR bilateral stimulation. I dug deeper and found Drishti Bheda, a system of 8 eye movement patterns from the Natya Shastra, an ancient Indian text on performing arts. One of those patterns is essentially identical to what EMDR therapists use today. That's a 2000-year-old tradition and a 1989 clinical discovery arriving at the same movement independently.

That gave me a hypothesis: if one pattern turned out to be therapeutically useful, maybe the others have something to offer too. Pure experiment. I'm honest about this in the app: every pattern is labeled "researched", "preliminary", or "hypothesis" so users know exactly what's proven and what's exploratory.

What I built

Saccada is a browser-based eye movement practice tool with 12 patterns. All sound is synthesized in real time via the Web Audio API (bilateral panning, binaural beats, a tanpura drone; no audio files). I integrated webcam eye tracking using MediaPipe, so the app adapts dot speed to your actual gaze and builds a heatmap showing how well you followed the pattern. Every session can be customized: duration, speed, background, sound. Keyboard controls and a fullscreen mode for complete immersion.
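
To give a feel for the bilateral panning, here's a minimal sketch of the idea: a pure function sweeps a pan value between the left ear (-1) and right ear (+1), and the browser would feed that into a `StereoPannerNode`. The names and the sweep shape here are illustrative assumptions, not Saccada's actual code.

```typescript
// Pan position in [-1, 1] at time t (seconds) for a given sweep frequency (Hz).
// -1 = fully left, +1 = fully right; a sine gives a smooth left-right oscillation.
function panAt(t: number, sweepHz: number): number {
  return Math.sin(2 * Math.PI * sweepHz * t);
}

// In the browser this would drive the audio graph, roughly:
//   const ctx = new AudioContext();
//   const osc = ctx.createOscillator();            // carrier tone
//   const panner = new StereoPannerNode(ctx);
//   osc.connect(panner).connect(ctx.destination);
//   // per animation frame:
//   //   panner.pan.setValueAtTime(panAt(ctx.currentTime, 0.5), ctx.currentTime);
```

At 0.5 Hz the sound completes one full left-right-left cycle every two seconds, which is in the typical range for bilateral stimulation pacing.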

Privacy was non-negotiable. These are deeply personal sessions. I'm recording my emotional responses, my notes, my gaze data. None of that goes to anyone's server. Everything stays in IndexedDB on your device. Zero accounts, zero telemetry, zero data collection.

I also added Trataka, a yogic candle flame gazing practice. The flame is procedurally generated on Canvas. Ideally you'd do this with a real candle, but a digital version is better than nothing and you can practice anywhere.
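
The flicker itself can be sketched as a pure function sampled per frame; summing two sines at incommensurate frequencies keeps the sway from looking obviously periodic. This is an illustrative assumption about the approach, not Saccada's actual flame code.

```typescript
// Horizontal sway of the flame tip, in pixels, at time t (seconds).
// Two incommensurate frequencies avoid a visibly repeating flicker.
function flameSway(t: number, amplitude = 4): number {
  return amplitude * (0.6 * Math.sin(t * 7.3) + 0.4 * Math.sin(t * 11.9));
}

// A Canvas render loop would then draw the flame as a teardrop whose tip
// is displaced each frame, e.g.:
//   ctx.quadraticCurveTo(cx + flameSway(t), tipY, cx, baseY);
```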

The Sleep REM pattern is a hypothesis built on a hypothesis. The neuroscientist Robert Stickgold proposed in 2002 that EMDR works because it mimics REM sleep processing. That theory hasn't been confirmed. The better-supported explanation is that eye movements tax working memory, which degrades the vividness of recalled memories. But the REM connection intrigued me as a design direction for a pre-sleep pattern. Personally, I've been sleeping noticeably better after doing the REM pattern before bed. Anecdote, not evidence. I say that upfront.

What happened after I started using my own app

Some context: I have ADHD, CPTSD, and long-standing problems with memory and concentration. That's why I was looking into EMDR in the first place.

Since I started practicing EMDR patterns, intense emotional states stopped hijacking me the way they used to. And since I started testing the other patterns from the Indian tradition, old memories have been surfacing. Not flashbacks. Pleasant, long-forgotten things from early childhood. Maybe coincidence. Maybe placebo. But even if it's placebo, it works for me.

I haven't focused on the concentration patterns yet, so I can't speak to that. What I used most: EMDR Classic, Pralokita (same bilateral movement, different sound), Avalokita, Sleep REM, and especially Anuvritta (rapid vertical saccades). Anuvritta became my favorite and it's the most speculative pattern in the whole app.

Stack

React 19, TypeScript, Vite, Canvas 2D (requestAnimationFrame), Web Audio API (all sounds synthesized, zero audio files), MediaPipe Face Landmarker (gaze-tracking pipeline), Zustand, Dexie.js/IndexedDB, Tailwind + shadcn/ui, PWA via vite-plugin-pwa. Feature-Sliced Design architecture. Deployed on Vercel.

The hardest engineering challenge was running Canvas animation, WebAudio synthesis, and MediaPipe gaze prediction simultaneously without frame drops. Each runs on its own update cycle, and coordinating them while keeping React out of the rendering hot path took some careful architecture.
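
The coordination idea can be sketched like this: a plain mutable store that the render loop writes to (so React never re-renders per frame), plus a stride so the expensive gaze prediction only runs every Nth frame. All names here are illustrative assumptions, not Saccada's actual architecture.

```typescript
// Shared per-frame state, mutated outside React's render cycle.
interface FrameStore {
  dotX: number;          // current dot position from the animation
  gazeX: number | null;  // latest gaze estimate, updated less often
}

const store: FrameStore = { dotX: 0, gazeX: null };

// Decide whether an expensive subsystem runs on this frame.
function shouldRun(frame: number, stride: number): boolean {
  return frame % stride === 0;
}

// Browser hot path (no React state writes here), roughly:
//   let frame = 0;
//   function tick(t: number) {
//     store.dotX = Math.sin(t / 1000);        // Canvas animation: every frame
//     if (shouldRun(frame, 3)) {
//       // store.gazeX = predictGaze(video);  // MediaPipe: every 3rd frame
//     }
//     frame++;
//     requestAnimationFrame(tick);
//   }
```

React components that need the values (e.g. the post-session heatmap) can read the store on their own cadence, which keeps reconciliation entirely out of the 60fps path.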

Try it

The app is completely free and open-source (GPLv3), and it always will be.

🔗 Live
🔗 GitHub

If you find this useful or interesting, a ⭐ on GitHub means a lot for a solo project. Feedback welcome.
