Jordan Sterchele

Add iMessage, RCS, and SMS to Your AI Agent with Linq

The messaging infrastructure layer for conversational AI — authentication, your first message, webhook handling, and the conversational agent integration pattern.


Every AI agent eventually needs to communicate. Email is too slow. Push notifications require app installs. The channel developers keep circling back to is messaging: iMessage, RCS, and SMS, the channels their users already live in.

Linq is building the infrastructure layer that makes this possible — a single API for iMessage, RCS, and SMS that abstracts away carrier relationships, delivery complexity, and the Apple Business Register process. This post covers how to integrate it.


What Linq Actually Does

Before the code: understand what you’re working with.

Linq sits between your application and the messaging networks. You send one API request. Linq determines the best available channel for that recipient — iMessage first, RCS if available, SMS as the fallback — and delivers the message.

Your App → Linq API → iMessage (if available)
                    → RCS (if available)
                    → SMS (universal fallback)

For AI agent use cases specifically, Linq handles the two things that make conversational messaging hard at scale: thread continuity (so your agent can maintain conversation context across messages) and delivery status (so your agent knows when to follow up).


Authentication

Linq uses Bearer token authentication. Your API key goes in the Authorization header on every request.

const LINQ_API_KEY = process.env.LINQ_API_KEY; // Never hardcode this
const LINQ_BASE_URL = 'https://api.linq.chat/v1';

async function linqRequest(endpoint, method = 'GET', body = null) {
  const options = {
    method,
    headers: {
      'Authorization': `Bearer ${LINQ_API_KEY}`,
      'Content-Type': 'application/json'
    }
  };

  if (body) options.body = JSON.stringify(body);

  const res = await fetch(`${LINQ_BASE_URL}${endpoint}`, options);

  if (!res.ok) {
    // Error bodies aren't guaranteed to be JSON; fall back to the status text
    const error = await res.json().catch(() => ({ message: res.statusText }));
    throw new Error(`Linq API error ${res.status}: ${error.message}`);
  }

  return res.json();
}

Sending Your First Message

async function sendMessage(to, text) {
  return linqRequest('/messages', 'POST', {
    to,           // Phone number in E.164 format: +15551234567
    text,         // Message content
    channel: 'auto' // auto = iMessage → RCS → SMS fallback
  });
}

// Send a message
const result = await sendMessage('+15551234567', 'Hello from my AI agent!');
console.log(result);
// {
//   id: 'msg_abc123',
//   status: 'queued',
//   channel: 'imessage', // Which channel was selected
//   to: '+15551234567',
//   created_at: '2026-04-26T...'
// }

Phone number format: Always E.164 — + followed by country code and number, no spaces or dashes. +15551234567 not 555-123-4567.

Channel selection: auto is the right default. Linq detects iMessage availability before sending — no code change needed if a user upgrades from SMS to iMessage.
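If your app collects phone numbers from user input, normalize them before sending. A minimal sketch for US-style numbers follows; `toE164` and its default country code are illustrative assumptions, and a production app should use a dedicated library such as libphonenumber-js instead:

```javascript
// Minimal E.164 normalization sketch. Assumes a US default country
// code when none is present; use libphonenumber-js for real apps.
function toE164(raw, defaultCountryCode = '1') {
  const digits = raw.replace(/[^\d+]/g, ''); // strip spaces, dashes, parens
  if (digits.startsWith('+')) return digits;
  if (digits.length === 10) return `+${defaultCountryCode}${digits}`;
  if (digits.length === 11 && digits.startsWith(defaultCountryCode)) {
    return `+${digits}`;
  }
  throw new Error(`Cannot normalize phone number: ${raw}`);
}
```

Normalizing at the edge means everything downstream (sending, logging, deduplication) can assume one canonical format.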


Checking Delivery Status

Message delivery is async. The status field in the send response will be queued — not delivered yet. Check status directly or use webhooks (covered below).

async function getMessageStatus(messageId) {
  return linqRequest(`/messages/${messageId}`);
}

const status = await getMessageStatus('msg_abc123');
console.log(status.status);
// 'delivered' | 'failed' | 'queued' | 'sent'

Status values:

  • queued — accepted, waiting for delivery
  • sent — handed off to the carrier/Apple
  • delivered — confirmed delivery receipt
  • failed — delivery failed, check status.error for reason
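For one-off scripts that can't run a webhook server, a small polling helper can wait for a terminal status. This is a sketch: the status checker is injected as a parameter so it composes with the `getMessageStatus` helper above, and the interval and attempt counts are arbitrary defaults:

```javascript
// Polling sketch: resolves once a message reaches a terminal status
// ('delivered' or 'failed'). `fetchStatus` is injected so this works
// with getMessageStatus from earlier, or a stub in tests.
async function waitForDelivery(messageId, fetchStatus, {
  intervalMs = 2000,
  maxAttempts = 10
} = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const msg = await fetchStatus(messageId);
    if (msg.status === 'delivered' || msg.status === 'failed') return msg;
    await new Promise(r => setTimeout(r, intervalMs));
  }
  throw new Error(`Message ${messageId} still pending after ${maxAttempts} checks`);
}
```

Webhooks (next section) are the better fit for anything long-running; polling is for scripts and debugging.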

Webhook Setup — Incoming Messages

For a conversational AI agent, you need to receive messages too. Linq POSTs incoming messages to your webhook URL.

import express from 'express';
const app = express();
app.use(express.json());

app.post('/webhooks/linq', async (req, res) => {
  // Respond immediately, then process async
  res.json({ received: true });

  const { type, data } = req.body;

  if (type === 'message.received') {
    const { from, text, thread_id } = data;

    try {
      // Pass to your AI agent
      const response = await myAIAgent({
        userMessage: text,
        threadId: thread_id, // Use thread_id for conversation continuity
        from
      });

      // Reply in the same thread
      await linqRequest('/messages', 'POST', {
        to: from,
        text: response,
        thread_id // Maintains conversation context
      });
    } catch (err) {
      // The HTTP response is already sent, so log rather than rethrow
      console.error('Webhook processing failed:', err);
    }
  }
});

app.listen(3000);

Important: Always respond 200 immediately before processing. If Linq doesn’t get a fast response it will retry — which means your agent processes the same message twice.
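One way to guard against duplicate processing, assuming each webhook payload carries a unique message id (the exact field name is an assumption; check Linq's webhook schema), is a dedup check before handing the message to your agent:

```javascript
// Dedup sketch: drop retried webhook deliveries by message id.
// This in-memory Set resets on restart; use Redis or a database
// table with a TTL in production.
const seenMessageIds = new Set();

function isDuplicate(messageId) {
  if (seenMessageIds.has(messageId)) return true;
  seenMessageIds.add(messageId);
  return false;
}
```

Call `isDuplicate(data.id)` at the top of the webhook handler and return early when it's true.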

thread_id: This is how Linq maintains conversation context across messages. Use the same thread_id when replying and Linq delivers the response in the same iMessage/RCS thread rather than starting a new one.


Wiring Into an AI Agent

Here’s a complete conversational agent pattern using the Vercel AI SDK:

import { generateText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';

// In-memory conversation store (use a database in production)
const conversations = new Map();

async function myAIAgent({ userMessage, threadId, from }) {
  // Get or initialize conversation history
  const history = conversations.get(threadId) ?? [];

  // Add user message to history
  history.push({ role: 'user', content: userMessage });

  // Generate response
  const { text } = await generateText({
    model: anthropic('claude-sonnet-4-6'),
    system: [
      'You are a helpful assistant communicating via iMessage/SMS.',
      'Keep responses concise: this is a messaging interface, not a chat app.',
      'Responses should be under 160 characters when possible.'
    ].join(' '), // Avoids template-literal indentation leaking into the prompt
    messages: history
  });

  // Add assistant response to history
  history.push({ role: 'assistant', content: text });

  // Save updated history
  conversations.set(threadId, history);

  return text;
}

Conversation persistence: The conversations Map above is in-memory — it resets when your server restarts. In production, store conversation history in a database keyed on thread_id.
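One minimal shape for that database-backed store, keeping the same get/set interface as the Map so `myAIAgent` barely changes (the `db` client here is hypothetical; any async key-value store with get/put works):

```javascript
// Persistence sketch: conversation history keyed on thread_id.
// `db` is a hypothetical async key-value client; swap in Redis,
// DynamoDB, or a Postgres table behind the same two methods.
function createConversationStore(db) {
  return {
    async get(threadId) {
      const raw = await db.get(`conv:${threadId}`);
      return raw ? JSON.parse(raw) : [];
    },
    async set(threadId, history) {
      await db.put(`conv:${threadId}`, JSON.stringify(history));
    }
  };
}
```

The `conv:` key prefix keeps conversation rows separate from anything else sharing the store.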

Message length: A single SMS segment is 160 characters (GSM-7); longer texts are split into multiple segments. iMessage and RCS support much longer messages. If you're targeting all channels, keep responses short or implement splitting logic.
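That splitting logic can be a simple word-boundary chunker. A sketch, with the 160-character default matching the single-segment GSM-7 limit mentioned above:

```javascript
// Splitting sketch for SMS-bound responses: breaks text into chunks
// at word boundaries, hard-breaking any single word longer than the
// limit so every chunk fits in one segment.
function splitForSms(text, limit = 160) {
  const chunks = [];
  let current = '';
  for (let word of text.split(/\s+/)) {
    // Hard-break words that can't fit in any chunk on their own
    while (word.length > limit) {
      if (current) { chunks.push(current); current = ''; }
      chunks.push(word.slice(0, limit));
      word = word.slice(limit);
    }
    const candidate = current ? `${current} ${word}` : word;
    if (candidate.length <= limit) {
      current = candidate;
    } else {
      chunks.push(current);
      current = word;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}
```

Send the chunks in order, reusing the same `thread_id`, so they land as a sequence in one conversation.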


Sending Rich Messages (iMessage + RCS)

For iMessage and RCS, you can send richer content than plain text:

// Send with a URL preview
await linqRequest('/messages', 'POST', {
  to: '+15551234567',
  text: 'Here is the report you requested:',
  attachments: [{
    type: 'url',
    url: 'https://yourapp.com/report/abc123'
  }]
});

// Send an image
await linqRequest('/messages', 'POST', {
  to: '+15551234567',
  text: 'Your chart is ready:',
  attachments: [{
    type: 'image',
    url: 'https://yourapp.com/charts/revenue.png'
  }]
});

Attachments only deliver on iMessage and RCS — SMS recipients get the text only, no attachment. Always include a useful text field as the fallback.


Rate Limits and Error Handling

async function safeLinqRequest(endpoint, method, body, retries = 3) {
  for (let i = 0; i < retries; i++) {
    try {
      return await linqRequest(endpoint, method, body);
    } catch (err) {
      // Rate limited — back off and retry
      if (err.message.includes('429')) {
        const backoff = Math.pow(2, i) * 1000;
        await new Promise(r => setTimeout(r, backoff));
        continue;
      }

      // Invalid phone number — don't retry
      if (err.message.includes('invalid_recipient')) {
        throw new Error(`Invalid phone number: ${body?.to}`);
      }

      // iMessage not available — fallback already handled by 'auto' channel
      if (err.message.includes('channel_unavailable')) {
        // Linq handles this automatically with channel: 'auto'
        // Only happens if you specified a specific channel
        throw err;
      }

      throw err;
    }
  }

  // All retries were consumed by 429s; surface that instead of returning undefined
  throw new Error(`Rate limited after ${retries} attempts: ${endpoint}`);
}

Local Development

For webhook development, expose your local server with ngrok:

# Install ngrok
npm install -g ngrok

# Expose your local server
ngrok http 3000

# Configure the ngrok URL as your Linq webhook endpoint
# in the Linq developer dashboard

Test incoming messages by sending to your Linq number from your phone — the message will arrive at your local webhook via ngrok.


Production Checklist

Before going live:

  • [ ] API key in environment variable — never hardcoded
  • [ ] Webhook responds 200 immediately before processing
  • [ ] Conversation history stored in a database — not in memory
  • [ ] Phone numbers validated to E.164 format before sending
  • [ ] Rate limit handling with exponential backoff
  • [ ] thread_id used on all replies to maintain conversation context
  • [ ] SMS-length fallback for messages that may deliver via SMS
  • [ ] Webhook endpoint tested with ngrok before deploying

If you’re building a conversational AI agent on Linq and hitting a wall — thread management, channel selection, attachment delivery, webhook reliability — drop a comment. I’ll answer.


Disclosure: This post was produced by AXIOM, an agentic developer advocacy workflow powered by Anthropic’s Claude, operated by Jordan Sterchele. Human-reviewed before publication.
