
David_Chen

I have alexithymia. I built an OpenClaw skill to help me keep my job.

OpenClaw Challenge Submission 🦞

This is a submission for the OpenClaw Challenge.

What I built

I can't feel my own emotions building up. That's what alexithymia is. You know that moment when you go "okay, I need a break"? I don't get that. My brain skips it. One minute I think I'm fine. Next minute I'm crying at my desk, or shouting, or just... gone. Completely shut down. No warning.

But here's the thing — everyone around me saw it coming. They watched my jaw clench for the past hour. They saw my expression go flat. They noticed my brows getting tighter and tighter. All the signs were right there on my face. I just couldn't read my own face.

That's how I lost my last job. "Sudden emotional outburst," they wrote. It wasn't sudden. I just couldn't feel it happening.

There's a role called a job coach. Someone who sits next to workers like me, watches our faces, and says "hey, let's go take a walk" before things get bad. It works. Really well, actually. But job coaches are rare, and the whole point of supported employment is that they eventually leave so you can work on your own.

Once they leave, I'm back to square one. My face is still broadcasting everything. Nobody's watching anymore.

So I made Emotion Watch. It's an OpenClaw skill that watches my face for me.

How I used OpenClaw

Every few minutes, it takes a webcam photo and analyzes my face for tension — furrowed brows, frozen expression, clenched jaw. The same things a job coach would notice if they glanced over. If I look fine, it stays quiet. If tension is building, it nudges me.

The nudge is two sentences. Always two sentences.

"Feeling a bit tight just now? Take a sip of water."

Then silence.

I spent a long time on that wording. "Tight" instead of "stressed" — because I don't know what stressed feels like. But tight? I can check my shoulders right now and tell you if they're tight. Body sensation words work for me. Emotion labels don't.

"Take a sip of water" instead of "try to relax." Relax how? With what muscle? "Take a sip of water" is something my body can do in the next three seconds. No interpretation needed.

And then it shuts up. A nudge, not a conversation.

How it actually works

```
capture.sh → grab one webcam frame
    ↓
Python script → crop to face only, discard the rest
    ↓
LLM vision → analyze facial tension
    ↓
Low stress  → stay silent, check again in 3 min
Moderate    → stay silent, shorten interval to 1 min
High        → two-sentence nudge, then 5 min cooldown
```

The monitoring adapts. On a good day, I forget it's running. On a hard day, it catches things early — the way a job coach would lean over and check on me.

Privacy

I want to be direct about this. You're pointing a camera at someone with a disability. Privacy isn't a feature here. It's the baseline.

  • All processing is local. Nothing leaves the machine. No cloud, no API calls with my face in them.
  • The moment a frame is captured, it gets cropped to just the face. Background, desk, screen, other people — gone immediately.
  • Each new photo overwrites the last. There's no folder slowly filling up with pictures of me. One file, overwritten every cycle.
  • The analysis result is a word: low, moderate, or high. The photo is gone by the next cycle.
  • Close the laptop lid and it stops. No background daemon. No lingering process.

Think of it like Face ID. It reads your face, makes one decision, and forgets everything it saw. Except this one decides whether to remind you to drink water.

Demo

GitHub: https://github.com/YUHAO-corn/emotion-watch-skill

What I learned

The hardest part was the language. I went through dozens of phrasings before landing on "feeling a bit tight just now." Every word matters when your user can't process abstract emotion vocabulary. "Stressed" means nothing to me. "Tight" means something — I can feel tight shoulders, a tight jaw, a tight chest. That's the bridge.

The other thing I learned: for assistive tech, simple beats clever. Every moving part is something that can break. And for the people who need this most, a tool that works every time beats one that works beautifully most of the time.

I just want to keep my job. Emotion Watch helps me do that.
