<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Varun Dasharadhi</title>
    <description>The latest articles on DEV Community by Varun Dasharadhi (@varundasharadhi).</description>
    <link>https://dev.to/varundasharadhi</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3910370%2Fd21a7290-afa0-4dc9-87d1-96720a656d3c.png</url>
      <title>DEV Community: Varun Dasharadhi</title>
      <link>https://dev.to/varundasharadhi</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/varundasharadhi"/>
    <language>en</language>
    <item>
      <title>How I built an emotion-reading AI in 24 hours using Claude + Hume EVI</title>
      <dc:creator>Varun Dasharadhi</dc:creator>
      <pubDate>Sun, 03 May 2026 13:33:04 +0000</pubDate>
      <link>https://dev.to/varundasharadhi/how-i-built-an-emotion-reading-ai-in-24-hours-using-claude-hume-evi-3ggf</link>
      <guid>https://dev.to/varundasharadhi/how-i-built-an-emotion-reading-ai-in-24-hours-using-claude-hume-evi-3ggf</guid>
<description>&lt;h2&gt;The Idea&lt;/h2&gt;

&lt;p&gt;What if AI could actually feel what you're feeling — not just read your words?&lt;/p&gt;

&lt;p&gt;That was the spark behind EmpathIQ, built solo in 24 hours for the Replit 10 Buildathon.&lt;/p&gt;

&lt;h2&gt;What It Does&lt;/h2&gt;

&lt;p&gt;EmpathIQ combines:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;👁️ Facial emotion detection via webcam&lt;/li&gt;
&lt;li&gt;🎙️ Vocal emotion analysis via Hume EVI&lt;/li&gt;
&lt;li&gt;🤖 Claude API responses calibrated to both signals&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The result? An AI that responds to how you actually feel — not just what you type.&lt;/p&gt;

&lt;h2&gt;The Feature That Surprised Me Most&lt;/h2&gt;

&lt;p&gt;Smart Glasses mode 🥽&lt;/p&gt;

&lt;p&gt;Point the camera at someone ELSE. EmpathIQ reads THEIR emotion and gives YOU real-time coaching on what to say.&lt;/p&gt;

&lt;p&gt;Angry person in front of you?&lt;br&gt;
→ "Lower your voice and acknowledge their concern without arguing"&lt;/p&gt;

&lt;p&gt;The future vision is Meta Ray-Ban integration — real-time emotional coaching in every room you walk into.&lt;/p&gt;

&lt;h2&gt;The Tech&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;React + Vite&lt;/li&gt;
&lt;li&gt;face-api.js — facial emotion detection&lt;/li&gt;
&lt;li&gt;Hume EVI — vocal emotion AI&lt;/li&gt;
&lt;li&gt;Claude API — emotionally calibrated responses&lt;/li&gt;
&lt;li&gt;Recharts — emotion timeline chart&lt;/li&gt;
&lt;li&gt;Tailwind CSS&lt;/li&gt;
&lt;/ul&gt;
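&lt;p&gt;The post doesn't include the calibration code, but the idea of feeding both signals into a Claude call can be sketched roughly like this. Everything here is illustrative: &lt;code&gt;buildEmotionPrompt&lt;/code&gt;, the &lt;code&gt;EmotionReading&lt;/code&gt; shape, and the prompt wording are hypothetical, not EmpathIQ's actual code.&lt;/p&gt;

```typescript
// Hypothetical sketch: fold the face and voice readings into the system
// prompt sent to the Claude API. Names and wording are illustrative only.
interface EmotionReading {
  label: string;      // e.g. "angry", "calm"
  confidence: number; // 0..1
}

function buildEmotionPrompt(face: EmotionReading, voice: EmotionReading): string {
  // When the two signals disagree, lean on whichever is more confident.
  const dominant = face.confidence >= voice.confidence ? face : voice;
  return [
    "You are an empathetic assistant.",
    `Facial emotion: ${face.label} (${Math.round(face.confidence * 100)}%).`,
    `Vocal emotion: ${voice.label} (${Math.round(voice.confidence * 100)}%).`,
    `Calibrate your tone for someone who seems ${dominant.label}.`,
  ].join("\n");
}
```

&lt;p&gt;A string like this would go in the system prompt of an ordinary Claude API request, so the model shapes its reply around the stated emotional context rather than the text alone.&lt;/p&gt;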

&lt;h2&gt;The Hardest Part&lt;/h2&gt;

&lt;p&gt;Combining two real-time emotion signals (face + voice) into one coherent reading without lag or conflicts. The fusion panel took several iterations to get right.&lt;/p&gt;
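&lt;p&gt;The actual fusion panel code isn't shown in the post, but one minimal way such a blend could look is a weighted average over per-emotion score maps, assuming face-api.js and Hume EVI each emit scores in the 0–1 range. The &lt;code&gt;fuseEmotions&lt;/code&gt; helper, the weights, and the &lt;code&gt;"neutral"&lt;/code&gt; fallback are all my assumptions, not the project's implementation.&lt;/p&gt;

```typescript
// Illustrative fusion sketch (not EmpathIQ's actual code): blend per-emotion
// scores from the face and voice models into a single reading, weighting
// voice slightly higher, then pick the top-scoring emotion.
type Scores = Record<string, number>;

function fuseEmotions(
  face: Scores,
  voice: Scores,
  voiceWeight = 0.6 // assumed weighting; a real system would tune this
): { label: string; score: number } {
  const fused: Scores = {};
  const labels = new Set([...Object.keys(face), ...Object.keys(voice)]);
  for (const label of labels) {
    // Missing labels count as 0, so the two models don't need identical sets.
    fused[label] =
      (1 - voiceWeight) * (face[label] ?? 0) + voiceWeight * (voice[label] ?? 0);
  }
  // Argmax over the blended scores; fall back to "neutral" if everything is 0.
  return Object.entries(fused).reduce(
    (best, [label, score]) => (score > best.score ? { label, score } : best),
    { label: "neutral", score: 0 }
  );
}
```

&lt;p&gt;Running both models on their own timers and fusing only the latest reading from each is one common way to avoid the lag and conflict problems described above.&lt;/p&gt;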

&lt;h2&gt;What I'd Do Differently&lt;/h2&gt;

&lt;p&gt;Start on the voice mode earlier — the EVI integration took longer than expected and nearly missed the 24-hour deadline.&lt;/p&gt;

&lt;h2&gt;What's Next&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Apple Watch pulse + biometric fusion&lt;/li&gt;
&lt;li&gt;Meta smart glasses integration&lt;/li&gt;
&lt;li&gt;Clinical/therapy version (HIPAA compliant)&lt;/li&gt;
&lt;li&gt;Mobile app&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Try It&lt;/h2&gt;

&lt;p&gt;🔗 Live: &lt;a href="https://empathiq-studio--varundasharadhi.replit.app" rel="noopener noreferrer"&gt;https://empathiq-studio--varundasharadhi.replit.app&lt;/a&gt;&lt;br&gt;
🎬 Demo: &lt;a href="https://www.loom.com/share/ee3177d34b40404487115fca5f8366ed" rel="noopener noreferrer"&gt;https://www.loom.com/share/ee3177d34b40404487115fca5f8366ed&lt;/a&gt;&lt;br&gt;
⭐ GitHub: &lt;a href="https://github.com/VarunDasharadhi/Empathiq-Studio" rel="noopener noreferrer"&gt;https://github.com/VarunDasharadhi/Empathiq-Studio&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Would love your feedback! 🙏&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzjat9urae8uhap8my32y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzjat9urae8uhap8my32y.png" alt=" " width="800" height="508"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flmbgefy9va2msu896jtb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flmbgefy9va2msu896jtb.png" alt=" " width="800" height="512"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>buildinpublic</category>
      <category>react</category>
      <category>showdev</category>
    </item>
  </channel>
</rss>
