southy404
I just gave my local AI desktop companion access to the outside world (Telegram, Discord, Email…)

For the past few weeks, I've been building a local-first AI desktop companion that lives on your screen.

It can:

  • see your screen
  • understand your context
  • execute actions on your system

But it had one big limitation:

It only lived on your desktop

So I changed that.


🌐 Introducing: Blob Connectors

I just added a new layer to OpenBlob:

👉 Blob Connectors

A lightweight Python bridge that connects your local AI to the outside world:

  • Telegram
  • Discord
  • Slack
  • Email

🧠 What this actually means

You can now do things like:

  • send "open spotify" via Telegram → Spotify opens on your PC
  • ask a question in Discord → your local model answers
  • send an email → get a contextual AI reply
  • control your desktop from anywhere

And the important part:

It’s still local-first


⚙️ How it works

All channels go through the same pipeline:

```
Telegram / Discord / Slack / Email
              │
        Blob Connectors (Python)
              │
    ┌─────────┴─────────┐
    │                   │
OpenBlob running?    Ollama fallback
(localhost)         (local model)
              │
        Command Router
              │
      Desktop action
```
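The last hop in that pipeline, the Command Router, can be sketched roughly like this. The command names, launch logic, and function names here are purely illustrative, not OpenBlob's actual implementation:

```python
import shutil
import subprocess

# Hypothetical command table: maps normalized text to a desktop launch
# command. Real routing would be richer (fuzzy matching, arguments, ...).
COMMANDS = {
    "open spotify": ["spotify"],
    "open browser": ["firefox"],
}

def route(text: str) -> str:
    """Look up a desktop action for the incoming text and launch it."""
    cmd = COMMANDS.get(text.strip().lower())
    if cmd is None:
        return f"no desktop action for: {text!r}"
    if shutil.which(cmd[0]) is None:
        return f"{cmd[0]} is not installed on this machine"
    subprocess.Popen(cmd)  # fire-and-forget: launch the app
    return f"launched {cmd[0]}"
```

Unknown commands fall through to the model instead of erroring out, which is what makes the Ollama fallback branch in the diagram possible.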

Everything becomes a normalized Message object.

No matter where it comes from.
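A normalized message could look something like the following. The field names are my own sketch of the idea, not OpenBlob's actual schema:

```python
import time
from dataclasses import dataclass, field

# Illustrative sketch of a channel-agnostic message -- the point is that
# Telegram, Discord, Slack, and Email all collapse into one shape.
@dataclass
class Message:
    channel: str    # "telegram", "discord", "slack", "email"
    sender: str     # platform-specific user identifier
    text: str       # the raw message content
    timestamp: float = field(default_factory=time.time)

msg = Message(channel="telegram", sender="user123", text="open spotify")
```

Once everything downstream only sees `Message`, the router and the model never need to know which platform a request came from.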


🔌 Why this matters

This is not just “adding integrations”.

This is the first real step towards:

an AI system that exists beyond a single interface

Now OpenBlob is:

  • not just UI-bound
  • not just voice-bound
  • not just desktop-bound

It becomes a distributed interface to your own system


🧩 Built for extension

Each connector implements the same interface:

```python
class MyConnector(BlobConnector):
    async def receive_message(self, raw) -> Message | None: ...
    async def send_response(self, original: Message, response: str) -> None: ...
    async def start(self) -> None: ...
```
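To make the shape concrete, here's a toy connector that just echoes messages back. `EchoConnector` is hypothetical, and the base class and `Message` type are stubbed so the sketch runs standalone; in the real repo they come from the connector package:

```python
from __future__ import annotations

import asyncio
from dataclasses import dataclass

# Minimal stand-ins for illustration only.
@dataclass
class Message:
    channel: str
    sender: str
    text: str

class BlobConnector:
    """Stub base class standing in for OpenBlob's connector interface."""

# Hypothetical connector: a template for new platforms, not part of OpenBlob.
class EchoConnector(BlobConnector):
    def __init__(self) -> None:
        self.sent: list[str] = []

    async def receive_message(self, raw) -> Message | None:
        # Normalize raw platform input; drop empty messages.
        text = str(raw).strip()
        if not text:
            return None
        return Message(channel="echo", sender="local", text=text)

    async def send_response(self, original: Message, response: str) -> None:
        self.sent.append(response)

    async def start(self) -> None:
        msg = await self.receive_message("open spotify")
        if msg:
            await self.send_response(msg, f"echo: {msg.text}")

conn = EchoConnector()
asyncio.run(conn.start())
print(conn.sent)  # -> ['echo: open spotify']
```

A real connector's `start()` would open a long-lived session (a bot polling loop, an IMAP watcher, ...) instead of a single round trip, but the three-method contract stays the same.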

So adding new platforms is trivial:

  • WhatsApp
  • Matrix
  • iMessage (maybe 👀)
  • anything with an API

🔒 Still local-first

Important:

  • runs on your machine
  • uses your local models (Ollama)
  • no required cloud backend
  • transparent behavior

If OpenBlob is offline:

→ it automatically falls back to local reasoning
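The offline check can be as simple as probing the local port before routing. This is a sketch under assumptions: the port number and backend names are illustrative, not OpenBlob's actual defaults:

```python
import socket

def pick_backend(host: str = "127.0.0.1", port: int = 8765) -> str:
    """Return which backend should handle the request.

    If nothing is listening on the OpenBlob port (port is a guess here),
    fall back to a local Ollama model.
    """
    try:
        with socket.create_connection((host, port), timeout=0.5):
            return "openblob"
    except OSError:
        return "ollama"
```

Because both backends run on the same machine, the fallback changes which model answers, never where your data goes.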


🚧 Current state

  • works across all channels
  • still early
  • structure is stabilizing
  • lots of room for improvement

🔮 What this unlocks next

This connector layer enables things like:

  • shared memory across all channels
  • persistent conversations
  • multi-agent systems
  • calendar / tool integrations
  • real remote control of your system

🤝 If you want to build with me

This is probably the best moment to jump in.

You can:

  • build new connectors
  • improve routing / memory
  • design better UX
  • experiment with AI behaviors

👉 https://github.com/southy404/openblob


💡 Final thoughts

This is mainly an infrastructure update.

By introducing a connector layer and a normalized message interface, OpenBlob becomes:

  • easier to extend
  • easier to integrate
  • less tied to a single UI

It’s a small surface change — but a significant internal shift.
