<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Anna Jambhulkar</title>
    <description>The latest articles on DEV Community by Anna Jambhulkar (@anna2612).</description>
    <link>https://dev.to/anna2612</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3673752%2Fc3098925-13f7-4ea3-b35f-4c71f06ba989.jpg</url>
      <title>DEV Community: Anna Jambhulkar</title>
      <link>https://dev.to/anna2612</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/anna2612"/>
    <language>en</language>
    <item>
      <title>Why I’m building a Windows-first emotional AI assistant (lessons so far)</title>
      <dc:creator>Anna Jambhulkar</dc:creator>
      <pubDate>Mon, 22 Dec 2025 13:21:10 +0000</pubDate>
      <link>https://dev.to/anna2612/why-im-building-a-windows-first-emotional-ai-assistant-lessons-so-far-1iii</link>
      <guid>https://dev.to/anna2612/why-im-building-a-windows-first-emotional-ai-assistant-lessons-so-far-1iii</guid>
      <description>&lt;p&gt;Most AI products today are optimized for speed, accuracy, and scale.&lt;/p&gt;

&lt;p&gt;And that makes sense.&lt;/p&gt;

&lt;p&gt;But while using AI tools daily, I kept running into the same feeling:&lt;br&gt;
every interaction felt stateless. Every session started from zero.&lt;br&gt;
No memory. No continuity. No sense of knowing the user.&lt;/p&gt;

&lt;p&gt;That’s where my curiosity started.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The problem I noticed&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Modern AI assistants are impressive, but they behave like strangers who forget you every day.&lt;/p&gt;

&lt;p&gt;You explain your preferences again.&lt;br&gt;
You restate context again.&lt;br&gt;
You rebuild workflows again.&lt;/p&gt;

&lt;p&gt;From a technical perspective, this is fine.&lt;br&gt;
From a human perspective, it feels broken.&lt;/p&gt;

&lt;p&gt;Humans don’t work in isolated prompts — we work in continuity.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why Windows-first (and not cloud-first)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One decision I made early was to build this as a Windows-first assistant, not a browser tab or a purely cloud-based tool.&lt;/p&gt;

&lt;p&gt;Why?&lt;/p&gt;

&lt;p&gt;Because a personal computer is still the most intimate computing device we own:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;It holds our files&lt;/li&gt;
&lt;li&gt;It reflects our workflows&lt;/li&gt;
&lt;li&gt;It stays with us for years&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Building locally (or at least desktop-native) allows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Better context awareness&lt;/li&gt;
&lt;li&gt;Stronger privacy boundaries&lt;/li&gt;
&lt;li&gt;Tighter integration with daily work&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Instead of AI being “somewhere on the internet”, it becomes present.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Emotional AI ≠ pretending to be human&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A common misconception:&lt;br&gt;
emotional AI means making the assistant sound emotional.&lt;/p&gt;

&lt;p&gt;That’s not what I’m exploring.&lt;/p&gt;

&lt;p&gt;For me, emotional AI is about:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Remembering preferences&lt;/li&gt;
&lt;li&gt;Maintaining interaction history&lt;/li&gt;
&lt;li&gt;Adapting tone and behavior over time&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It’s not about fake empathy.&lt;br&gt;
It’s about continuity.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What I’ve learned so far (the hard parts)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Memory is expensive — technically and ethically&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Storing memory isn’t just a database problem.&lt;br&gt;
You need to decide:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What’s worth remembering?&lt;/li&gt;
&lt;li&gt;What should be forgotten?&lt;/li&gt;
&lt;li&gt;Who controls that memory?&lt;/li&gt;
&lt;/ul&gt;
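&lt;p&gt;To make those trade-offs concrete, here is a minimal, hypothetical sketch of a local memory store where retention is an explicit policy and deletion is user-controlled. The names (&lt;code&gt;MemoryStore&lt;/code&gt;, &lt;code&gt;ttl_seconds&lt;/code&gt;) are illustrative, not the project’s actual API:&lt;/p&gt;

```python
import time

# Hypothetical sketch: every stored fact carries an explicit retention
# policy, so "what to forget" is a design decision, not an accident.

class MemoryStore:
    def __init__(self):
        self._items = {}  # key -> (value, stored_at, ttl_seconds)

    def remember(self, key, value, ttl_seconds=None):
        """Store a fact; ttl_seconds=None means 'until the user forgets it'."""
        self._items[key] = (value, time.time(), ttl_seconds)

    def forget(self, key):
        """User-controlled deletion: the user, not the model, owns memory."""
        self._items.pop(key, None)

    def recall(self, key, now=None):
        """Return a value only while its retention window is still open."""
        now = time.time() if now is None else now
        item = self._items.get(key)
        if item is None:
            return None
        value, stored_at, ttl = item
        if ttl is not None and now - stored_at > ttl:
            self._items.pop(key)  # expired memories are dropped on read
            return None
        return value

store = MemoryStore()
store.remember("preferred_tone", "concise")          # long-lived preference
store.remember("last_open_file", "notes.txt", 3600)  # expires after an hour
```

&lt;p&gt;Even in a toy like this, the three questions above show up as code: &lt;code&gt;ttl_seconds&lt;/code&gt; answers what should be forgotten, and &lt;code&gt;forget()&lt;/code&gt; answers who controls the memory.&lt;/p&gt;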

&lt;p&gt;&lt;strong&gt;2. “Personal” quickly becomes “creepy” if done wrong&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;There’s a very thin line between helpful continuity and overreach.&lt;br&gt;
Designing that boundary is more important than model choice.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Developers underestimate emotion in tools&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Many devs (myself included) initially think users only care about features.&lt;br&gt;
In reality, how a tool makes you feel over time strongly affects retention.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why I’m sharing this early&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This project is still in a tech-trial stage.&lt;br&gt;
I’m intentionally sharing before everything is “perfect”.&lt;/p&gt;

&lt;p&gt;Because the most valuable insights so far haven’t come from metrics —&lt;br&gt;
they’ve come from conversations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;A question for builders here&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;When you think about the tools you use daily:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Do you value memory and continuity?&lt;/li&gt;
&lt;li&gt;Or do you prefer tools to stay stateless and predictable?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Where do you personally draw the line?&lt;/em&gt;&lt;br&gt;
I’d love to learn from real experiences, not just theory.&lt;/p&gt;

&lt;p&gt;Thanks for reading 🙏&lt;/p&gt;

</description>
      <category>ai</category>
      <category>saas</category>
      <category>productivity</category>
      <category>automation</category>
    </item>
  </channel>
</rss>
