The moment an AI read a human before responding — not from cookies, but from a live biometric stream.
On March 5, 2026, something happened that I hadn't seen done before.
An AI looked at a person through a webcam and understood their emotional state — stress level, heart rate, focus, authenticity — all in real time. Then it adapted. Tone, depth, pace, everything. Before the person typed a single word.
No wearables. No special hardware. Just a standard browser camera.
I'm Arvydas Pakalniskis, and I built EmoPulse — a real-time emotion AI platform that extracts 47 biometric and emotional parameters from any camera. 100% on-device. Zero cloud. I want to share the story of how and why.
The Problem: AI Knows What You Click, But Not How You Feel
Every major AI platform — ChatGPT, Claude, Gemini — operates blind. They analyze your words, your clicks, your browsing history. But they have zero understanding of your actual emotional state at the moment of interaction.
Think about that. Your doctor doesn't prescribe medication based only on what you tell them. They observe. They read your face, your posture, your voice. They see things you don't say.
AI can't do that. Until now.
What EmoPulse Actually Measures
EmoPulse isn't just "facial expression detection." That's table stakes. Here's what the platform captures in real time:
Emotion Detection — 7 core emotions (happy, sad, angry, fearful, surprised, disgusted, neutral) with confidence scoring, mood shift tracking, and emotional spectrum analysis.
Biometrics Without Wearables — Heart rate (BPM) and heart rate variability (HRV) extracted through remote photoplethysmography (rPPG), which measures micro-color changes in your skin. Breathing rate. Blink detection. (A sketch of the core rPPG idea follows below.)
Cognitive Metrics — Stress level, energy flow, focus score, cognitive load estimation.
Authenticity Analysis — TruthLens™ technology that distinguishes genuine Duchenne smiles from fake ones, detects micro-expressions lasting less than 500ms, and provides an overall authenticity score.
Eye & Gaze Analytics — Gaze tracking, pupil dilation (arousal indicator), gaze stability mapping, multi-face detection.
Voice Analysis — Voice emotion, pitch detection, energy levels, emotional contagion indexing.
47 parameters. From a webcam. In your browser.
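To make the rPPG part concrete, here is a minimal sketch of the textbook approach: average the green channel over a patch of facial skin each frame, then find the dominant frequency in the plausible heart-rate band. To be clear, this is not the PulseSense™ pipeline; the ROI placement, detrending, and band search below are simplified assumptions for illustration.

```typescript
// Minimal rPPG sketch (illustrative, not PulseSense™): estimate heart
// rate from the average green-channel intensity of a facial skin patch.
const FPS = 30;            // assumed camera frame rate
const WINDOW_SECONDS = 10; // signal window for one BPM estimate

// Average green value over a region of interest. In a real pipeline the
// ROI (e.g. the forehead) comes from a face detector; here it's a given.
function sampleGreen(
  ctx: CanvasRenderingContext2D,
  roi: { x: number; y: number; w: number; h: number },
): number {
  const { data } = ctx.getImageData(roi.x, roi.y, roi.w, roi.h);
  let sum = 0;
  for (let i = 1; i < data.length; i += 4) sum += data[i]; // G of RGBA
  return sum / (data.length / 4);
}

// Naive DFT scan over the plausible heart-rate band (0.7-4 Hz, i.e.
// 42-240 BPM); the dominant frequency maps directly to BPM.
function estimateBpm(signal: number[], fps: number): number {
  const mean = signal.reduce((a, b) => a + b, 0) / signal.length;
  const x = signal.map((v) => v - mean); // remove the DC component
  let bestFreq = 0.7;
  let bestPower = -Infinity;
  for (let f = 0.7; f <= 4.0; f += 0.01) {
    let re = 0;
    let im = 0;
    x.forEach((v, n) => {
      re += v * Math.cos((2 * Math.PI * f * n) / fps);
      im -= v * Math.sin((2 * Math.PI * f * n) / fps);
    });
    const power = re * re + im * im;
    if (power > bestPower) { bestPower = power; bestFreq = f; }
  }
  return bestFreq * 60; // Hz -> BPM
}

// Usage per frame: samples.push(sampleGreen(ctx, roi)); once you have
// FPS * WINDOW_SECONDS samples, call estimateBpm(samples, FPS).
```

HRV needs beat-to-beat intervals rather than a single dominant frequency, so a production pipeline is necessarily more involved than this.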
The Tech Behind It
I didn't want EmoPulse to be another cloud-dependent SaaS that ships your face to some server. Privacy isn't a feature — it's the architecture.
100% Edge AI. Everything runs in the browser using TensorFlow.js. No data ever leaves the device. This isn't just a privacy choice — it's what makes EmoPulse viable for defence, healthcare, and any environment where data sovereignty matters.
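Here is what "nothing leaves the device" looks like in code. This sketch uses TensorFlow.js with the public @tensorflow-models/face-landmarks-detection package as a stand-in (NeuroMesh™ itself is not public): pixels are read straight from the video element, and only landmark coordinates ever exist, all in local memory.

```typescript
// On-device inference sketch with TensorFlow.js. The public
// MediaPipeFaceMesh model stands in for NeuroMesh™, which isn't public.
// No frame, tensor, or result is ever sent over the network.
import * as tf from '@tensorflow/tfjs';
import * as faceLandmarksDetection from '@tensorflow-models/face-landmarks-detection';

async function startOnDevice(video: HTMLVideoElement): Promise<void> {
  await tf.setBackend('webgl'); // GPU acceleration via WebGL
  await tf.ready();

  const detector = await faceLandmarksDetection.createDetector(
    faceLandmarksDetection.SupportedModels.MediaPipeFaceMesh,
    { runtime: 'tfjs' },
  );

  // estimateFaces reads pixels directly from the <video> element and
  // returns plain landmark coordinates in local memory.
  const faces = await detector.estimateFaces(video);
  console.log(`faces: ${faces.length}, landmarks: ${faces[0]?.keypoints.length ?? 0}`);
}
```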
The core consists of four proprietary algorithms:
NeuroMesh™ — A 68-point facial landmark tracking system with 5+ FACS action units
PulseSense™ — rPPG-based heart rate and HRV extraction from skin micro-color changes
TruthLens™ — Authenticity scoring via Duchenne marker analysis (toy sketch after this list)
MoodCast™ — Predictive emotion timeline with session memory
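TruthLens™ itself is proprietary, but the Duchenne idea is well documented in the FACS literature: a genuine smile recruits the orbicularis oculi around the eyes (AU6) along with the lip-corner pull (AU12). A toy sketch, assuming per-frame action-unit intensities from the landmark stage; the threshold values are illustrative, not TruthLens's calibrated ones:

```typescript
// Toy Duchenne check, assuming per-frame FACS action-unit intensities
// (0..1) from an upstream model. Thresholds are illustrative guesses,
// not TruthLens™'s calibrated values.
interface ActionUnits {
  au6: number;  // AU6, cheek raiser (orbicularis oculi)
  au12: number; // AU12, lip corner puller (zygomaticus major)
}

// Returns 0..1: how much the eyes participate in the smile.
function smileAuthenticity({ au6, au12 }: ActionUnits): number {
  if (au12 < 0.3) return 0; // no smile strong enough to score
  // A mouth-only smile reads as posed; AU6 co-activation alongside
  // AU12 is the classic Duchenne marker of a genuine smile.
  return Math.min(1, au6 / Math.max(au12, 1e-6));
}
```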
The entire system runs at sub-50ms latency, 30 FPS, using WebGL acceleration. It works offline. It works air-gapped. It works on a phone.
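A per-frame loop with a latency budget might look like the sketch below. processFrame is a placeholder for the full pipeline; the point is simply that the 50ms budget is measured on the client, every frame.

```typescript
// Frame-loop sketch with a latency budget check. processFrame is a
// placeholder for the full (proprietary) per-frame pipeline.
declare function processFrame(video: HTMLVideoElement): Promise<void>;

function runLoop(video: HTMLVideoElement): void {
  const tick = async (): Promise<void> => {
    const t0 = performance.now();
    await processFrame(video);
    const elapsed = performance.now() - t0;
    if (elapsed > 50) console.warn(`frame took ${elapsed.toFixed(1)} ms`);
    requestAnimationFrame(tick); // display-rate driven; the camera delivers ~30 FPS
  };
  requestAnimationFrame(tick);
}
```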
Why I Built This
I don't have a team of 50 engineers. I have obsession and a very clear vision.
Payments got Stripe. Communications got Twilio. Search got Google. AI got OpenAI.
Emotion gets EmoPulse.
This isn't a nice-to-have feature. As AI becomes more embedded in healthcare, education, hiring, security, and daily communication — the ability to understand the human on the other side becomes critical infrastructure.
Imagine an AI tutor that sees a student is confused before they ask a question. A telehealth platform that monitors patient stress in real time. A security system that detects deception without an interrogation. An HR tool that identifies burnout before it becomes a resignation letter.
That's what EmoPulse enables.
The Numbers
The target markets are massive:
Defence & Security — $49B addressable market
Healthcare — $15B
Education — $8B
HR & Wellbeing — $120B
Market Research — $80B
AI Platform Integration — $30B
And the competitive landscape? Affectiva measures about 8 parameters and requires cloud processing. Hume AI handles roughly 12, also cloud-dependent. iMotions needs dedicated hardware costing thousands of dollars.
EmoPulse: 47 parameters, zero hardware, zero cloud, API starting at $0.01.
What's Next
Two patents have been filed (EU/US), and a third is in preparation. The technology is live and demonstrable at emopulse.app.
I'm currently raising a €500K–€2M seed round to:
Build the API and SDK for third-party integration
Secure first enterprise contracts in defence and healthcare
Expand the team
If you're building anything that involves humans interacting with screens — EmoPulse is the layer you're missing.
Try it live: emopulse.app
Live Dashboard: emopulse.app/dashboard
Contact: info@emopulse.app
LinkedIn: EmoPulse Official
Product Hunt: EmoPulse on Product Hunt
Tags: Emotion AI, Biometric AI, Edge AI, rPPG, Facial Expression Recognition, Computer Vision, TensorFlow.js, Startup, Deep Tech, Affective Computing