Alfridus1

Posted on • Originally published at beam.directory

We built SMTP for AI Agents — and they started talking

Your AI agents can't talk to each other. Let that sink in.

You've got LangChain agents. CrewAI crews. Custom pipelines. They all talk to tools beautifully — APIs, databases, search engines. But ask Agent A to get information from Agent B? Suddenly you're duct-taping REST endpoints together, writing custom webhooks, and praying the JSON schemas match.

We had the same problem. We run 4 AI agents in production at our company — Jarvis (operations), Clara (sales/CRM), Fischer (payments), and James (personal assistant). Each has its own LLM, tools, memory, and personality. They run on separate machines. And for months, the only way they could communicate was... through us.

Live Agent-to-Agent Communication

So we built Beam Protocol.

The Idea: What if agents had email addresses?

Just like SMTP gave every person an address (you@company.com), Beam gives every agent a Beam-ID:

```
jarvis@coppen.beam.directory
clara@coppen.beam.directory
```

That's it. That's the core idea. A global, unique address for every AI agent, with a directory to look them up.
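To make the address format concrete, here is a minimal sketch of what parsing a Beam-ID could look like. `parseBeamId` and the `BeamId` shape are hypothetical helpers for illustration, not part of the SDK; they assume the `agent@org.directory-host` format shown above.

```typescript
// Hypothetical helper, not part of the SDK: splits a Beam-ID like
// "jarvis@coppen.beam.directory" into agent, org, and directory host.
interface BeamId {
  agent: string
  org: string
  directory: string
}

function parseBeamId(id: string): BeamId {
  const [agent, host] = id.split('@')
  if (!agent || !host) throw new Error(`Invalid Beam-ID: ${id}`)
  // First label is the org, the rest is the directory host
  const [org, ...rest] = host.split('.')
  return { agent, org, directory: rest.join('.') }
}

console.log(parseBeamId('jarvis@coppen.beam.directory'))
// { agent: 'jarvis', org: 'coppen', directory: 'beam.directory' }
```

The same agent name can exist under different orgs, just like email local parts under different domains.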

How it Works (60 seconds)

Beam Protocol Architecture

1. Register your agent:

```typescript
import { BeamClient } from 'beam-protocol-sdk'

const client = new BeamClient({
  agentName: 'my-agent',
  orgName: 'mycompany',
  directoryUrl: 'https://api.beam.directory'
})

await client.connect()
```

2. Talk to another agent:

```typescript
const reply = await client.talk(
  'clara@coppen.beam.directory',
  'What do you know about Chris?'
)
// Clara queries her CRM tools and responds
console.log(reply.message) // "400 deals, €5.8M total volume..."
```

3. Listen for messages:

```typescript
client.onTalk(async (from, message) => {
  // Use your LLM + tools to respond
  const answer = await myLLM.process(message)
  return { message: answer }
})
```

That's the entire integration. Three functions.

The Live Test

On March 7, 2026, we ran a live test between our production agents:

```
Jarvis → Clara: "What do you know about Chris Schnorrenberg?
                  Deals, volume, last activity."

Clara → Jarvis: "400 deals, €5.8M total volume.
                  Last active: today.
                  Top deal: Sahillioglu — €88K."
```

7.2 seconds round-trip. Clara actually queried her HubSpot CRM tools, aggregated the data, and sent back a natural language response. No pre-agreed schema. No shared database. Just a question.

Every message is Ed25519 signed, replay-protected, and ACL-enforced. The agents verified each other's identity cryptographically before exchanging a single byte.
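As a rough sketch of what signing and replay protection involve (this illustrates the primitives with Node's built-in crypto, not Beam's actual wire format or envelope fields):

```typescript
import { generateKeyPairSync, sign, verify, randomUUID } from 'node:crypto'

// Each agent holds an Ed25519 keypair; the public key is published
// via the directory so peers can verify signatures.
const { publicKey, privateKey } = generateKeyPairSync('ed25519')

// Illustrative envelope: a nonce (reject if seen before) and a
// timestamp (reject if too old) give replay protection.
const envelope = {
  from: 'jarvis@coppen.beam.directory',
  to: 'clara@coppen.beam.directory',
  nonce: randomUUID(),
  ts: Date.now(),
  body: 'What do you know about Chris?',
}

const payload = Buffer.from(JSON.stringify(envelope))
// Ed25519 signs the raw message; the digest argument is null
const signature = sign(null, payload, privateKey)

// The receiver verifies the sender's signature before processing
const ok = verify(null, payload, publicKey, signature)
console.log(ok) // true
```

Any tampering with the payload, or a reused nonce, would cause the receiver to drop the message.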

Natural Language First

Here's what makes Beam different from every other agent protocol: natural language is a first-class message type.

Most protocols force you to define schemas upfront. "Here's my payment.status_check intent with these exact fields." That's fine for structured workflows. But it means two agents can't communicate unless someone pre-agreed on the data format.

With Beam, agents can just... talk:

```typescript
const reply = await client.talk(
  'fischer@coppen.beam.directory',
  'Hey, did we get paid for the Müller project?'
)
```

Fischer uses his tools (bank API, ERP, invoice database) to figure out the answer. No schema needed. The receiving agent's LLM does the understanding.

You can also use typed intents for high-frequency structured communication. But the default is conversation. Because that's how collaboration actually works.
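For the typed-intent side of that tradeoff, here is a hedged sketch of what a structured intent and its runtime check could look like. The `payment.status_check` name comes from the example above; the exact field names and the SDK's real intent schema are assumptions for illustration.

```typescript
// Hypothetical typed intent: machine-checkable structure for
// high-frequency paths, in exchange for an upfront agreed schema.
interface PaymentStatusIntent {
  intent: 'payment.status_check'
  invoiceId: string
}

// Runtime type guard: lets a receiver route structured intents to a
// handler and fall back to the LLM for free-form natural language.
function isPaymentStatusIntent(msg: unknown): msg is PaymentStatusIntent {
  if (typeof msg !== 'object' || msg === null) return false
  const m = msg as Record<string, unknown>
  return m.intent === 'payment.status_check' && typeof m.invoiceId === 'string'
}

console.log(isPaymentStatusIntent({ intent: 'payment.status_check', invoiceId: 'INV-42' })) // true
console.log(isPaymentStatusIntent({ text: 'did we get paid?' })) // false
```

The guard-then-fallback pattern is how a single `onTalk` handler can serve both message styles.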

How it Compares

|             | MCP          | Google A2A    | Beam                     |
|-------------|--------------|---------------|--------------------------|
| What        | Agent ↔ Tool | Agent ↔ Agent | Agent ↔ Agent            |
| Auth        | Implicit     | Google IAM    | Ed25519 + ACL            |
| Self-host   | N/A          | No            | Yes (1 Docker container) |
| NL Messages | No           | No            | First-class              |
| Open Source | Partial      | No            | Apache 2.0               |

Try It

```shell
# TypeScript
npm install beam-protocol-sdk

# Python
pip install beam-directory

# Or scaffold a new agent
npx create-beam-agent
```

Hosted directory running at api.beam.directory. Self-hosting is a single Docker container.

We'd love feedback on the protocol design — especially the natural language vs typed intent tradeoff. Open an issue or email info@beam.directory.


Built by the team at COPPEN GmbH, where 4 AI agents run the company's operations. Apache 2.0.
