DEV Community

Deepak Gupta

The Twilio-Stytch Acquisition: A Technical Analysis of Developer CIAM in 2025

Why standards-based authentication architecture matters more than feature lists

Twilio's acquisition of Stytch signals an important shift in the developer authentication landscape. As someone who built a CIAM platform from scratch to $8M ARR, I want to break down why this matters from a technical perspective—and what it means for how we architect authentication systems.

The Technical Debt of Proprietary Authentication

Let's start with a problem most teams don't recognize until it's too late: proprietary authentication flows create technical debt that compounds over time.

Here's a real scenario: You implement Auth0's "Rules" system to enrich tokens with custom claims. It works great. But that authentication logic is now platform-specific code that:

  • Only executes in Auth0's environment
  • Can't be version controlled effectively
  • Doesn't work with your local development workflow
  • Makes migration require rewriting business logic

Compare this to a standards-based approach:

// Standards-based: Works with any OIDC provider
const oidc = require('openid-client');

const issuer = await oidc.Issuer.discover('https://your-provider.com');
const client = new issuer.Client({
  client_id: process.env.CLIENT_ID,
  client_secret: process.env.CLIENT_SECRET,
  redirect_uris: [process.env.REDIRECT_URI],
  response_types: ['code'],
});

// Authorization Code Flow with PKCE
const codeVerifier = oidc.generators.codeVerifier();
const codeChallenge = oidc.generators.codeChallenge(codeVerifier);

const authUrl = client.authorizationUrl({
  scope: 'openid email profile',
  code_challenge: codeChallenge,
  code_challenge_method: 'S256',
});

This code works with any OpenID Connect provider. Switch providers? Change the discovery URL. That's it.

Why OpenID Connect Actually Matters

OIDC isn't just about interoperability—it's about architectural freedom. When you build on standard protocols:

1. Language/Framework Flexibility

Every major language has battle-tested OIDC libraries:

  • JavaScript: openid-client, oidc-provider
  • Python: authlib, python-jose
  • Go: github.com/coreos/go-oidc
  • Rust: openid, oauth2
  • Java: nimbus-jose-jwt

Your authentication code becomes portable across stacks.

2. AI Coding Assistant Compatibility

This is underrated. Claude Code, GitHub Copilot, and Cursor understand OIDC flows because they're standardized. Ask them to "implement OIDC authentication with PKCE" and they generate working code.

They can't do this with proprietary systems—they'd need specific platform documentation in training data.

3. Security Through Standard Implementations

Modern OIDC implementations handle security correctly:

// Token validation with standard library
const jose = require('jose');

async function validateToken(token) {
  const JWKS = jose.createRemoteJWKSet(
    new URL('https://your-provider.com/.well-known/jwks.json')
  );

  const { payload } = await jose.jwtVerify(token, JWKS, {
    issuer: 'https://your-provider.com',
    audience: 'your-client-id',
  });

  return payload;
}

No vendor-specific token formats. No proprietary validation logic. Just standard JWT verification that works with any library.

Evaluating Platforms: A Technical Framework

After analyzing 20+ developer CIAM platforms, here's my evaluation framework:

1. Standards Compliance Test

Check OIDC Discovery:

curl https://provider.com/.well-known/openid-configuration | jq

Look for:

  • authorization_endpoint
  • token_endpoint
  • jwks_uri
  • Standard grant_types_supported (authorization_code, refresh_token)
  • Standard response_types_supported (code)
  • code_challenge_methods_supported includes "S256" (PKCE)

If they're missing standard endpoints or using non-standard flows, that's a red flag.
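The checklist above can be turned into a quick script. Here's a sketch that takes an already-fetched discovery document (e.g. the JSON from the curl command) and flags the gaps; the field names are the standard OIDC discovery metadata keys:

```javascript
// Minimal compliance check against a parsed OIDC discovery document.
function checkDiscovery(doc) {
  const problems = [];
  for (const key of ['authorization_endpoint', 'token_endpoint', 'jwks_uri']) {
    if (!doc[key]) problems.push(`missing ${key}`);
  }
  if (!(doc.grant_types_supported || []).includes('authorization_code'))
    problems.push('no authorization_code grant');
  if (!(doc.response_types_supported || []).includes('code'))
    problems.push('no code response type');
  if (!(doc.code_challenge_methods_supported || []).includes('S256'))
    problems.push('no S256 PKCE support');
  return problems; // empty array = passes the checklist
}
```

An empty result means the provider clears the baseline; anything in the list is worth raising before you sign a contract.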

2. Token Format Inspection

Decode and validate JWT structure:

// Should be standard JWT format
const parts = token.split('.');
// Note: JWT segments are base64url-encoded, not plain base64
const header = JSON.parse(Buffer.from(parts[0], 'base64url').toString());
const payload = JSON.parse(Buffer.from(parts[1], 'base64url').toString());

// Check for standard claims
console.log({
  iss: payload.iss,    // Issuer
  sub: payload.sub,    // Subject (user ID)
  aud: payload.aud,    // Audience
  exp: payload.exp,    // Expiration
  iat: payload.iat,    // Issued at
});

Standard claims mean portability. Proprietary token formats mean lock-in.

3. SDK Inspection

Check if their SDK is a thin wrapper around standard protocols:

// Good: SDK uses standard OIDC under the hood
import { AuthClient } from 'good-provider';
const client = new AuthClient({
  authority: 'https://provider.com',  // Standard OIDC discovery
  client_id: 'your-client-id',
  redirect_uri: 'http://localhost:3000/callback',
});

// Bad: SDK hides everything behind proprietary methods
import { ProprietaryAuth } from 'bad-provider';
const auth = new ProprietaryAuth('api-key');
auth.doMagicAuthThing(); // What protocol is this using?

Platform Analysis: Technical Perspective

Let me break down platforms from an implementation standpoint:

MojoAuth: Standards-First Architecture

What they got right:

  • Pure OIDC implementation without proprietary extensions
  • Standard JWT tokens validated with any library
  • Passwordless flows implemented as standard OAuth 2.0 grants
  • Free enterprise tier eliminates economic lock-in

Technical consideration:

// Their passwordless flow uses standard OIDC
const authUrl = client.authorizationUrl({
  scope: 'openid email',
  code_challenge: pkceChallenge,
  code_challenge_method: 'S256',
  // Passwordless UX, standard protocol
});

FusionAuth: Self-Hosted Standards

What they got right:

  • Full OAuth 2.0, OIDC, SAML support
  • Self-hosting means complete data control
  • Standard protocol implementation
  • Docker/Kubernetes deployment support

Technical consideration:

# docker-compose.yml
services:
  fusionauth:
    image: fusionauth/fusionauth-app
    environment:
      DATABASE_URL: jdbc:postgresql://db:5432/fusionauth
      # Full control over deployment

Descope: Abstraction Without Proprietary Lock-In

What they got right:

  • Visual workflows compile to standard OIDC flows
  • SDKs are wrappers around standard protocols
  • You can bypass their SDK and use raw OIDC if needed

Technical consideration:
Their visual builder is syntactic sugar over standard flows—you're not trapped.

Better Auth: Code in Your Repository

What they got right:

// Your authentication code, in your repo
import { betterAuth } from "better-auth";

export const auth = betterAuth({
  database: prisma,
  emailAndPassword: {
    enabled: true,
  },
  socialProviders: {
    google: {
      clientId: process.env.GOOGLE_CLIENT_ID,
      clientSecret: process.env.GOOGLE_CLIENT_SECRET,
    },
  },
});

Authentication logic lives in your codebase. No vendor runtime dependency.

The AI Agent Authentication Challenge

Here's a technical problem most platforms haven't solved: machine-to-machine authentication for AI agents.

Traditional M2M uses client credentials:

// Traditional M2M
const token = await fetch('https://provider.com/oauth/token', {
  method: 'POST',
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: new URLSearchParams({
    grant_type: 'client_credentials',
    client_id: 'agent-id',
    client_secret: 'agent-secret',
    scope: 'read:data write:data',
  }),
});

But AI agents need:

  • Scoped permissions (not all-or-nothing)
  • Delegated authority (acting on behalf of user)
  • Time-limited grants
  • Revocable access
  • Audit trails

Stytch has been building primitives for this:

// Agent-scoped token
const agentToken = await client.createAgentToken({
  user_id: 'user-123',
  agent_id: 'claude-connector',
  scopes: ['read:documents', 'write:comments'],
  expires_in: 3600,
  require_user_approval: true, // Human-in-the-loop
});

This is implemented via standard OAuth 2.0 token exchange (RFC 8693), making it portable.
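For reference, an RFC 8693 token exchange is just a form-encoded POST to the token endpoint. This sketch builds the request body; the endpoint URL and the choice of token types are illustrative, since providers differ in which subject/actor token types they accept:

```javascript
// Sketch of an RFC 8693 token exchange request body. The grant_type and
// token-type URNs are the standard values from the RFC; everything else
// here is an illustrative assumption.
function buildTokenExchangeBody({ subjectToken, actorToken, scope }) {
  return new URLSearchParams({
    grant_type: 'urn:ietf:params:oauth:grant-type:token-exchange',
    subject_token: subjectToken,   // the user's token (delegated authority)
    subject_token_type: 'urn:ietf:params:oauth:token-type:access_token',
    actor_token: actorToken,       // the agent's own credential
    actor_token_type: 'urn:ietf:params:oauth:token-type:access_token',
    scope,                         // down-scoped, e.g. 'read:documents'
  });
}

// Usage (endpoint is hypothetical):
// await fetch('https://provider.com/oauth/token', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
//   body: buildTokenExchangeBody({ subjectToken, actorToken, scope: 'read:documents' }),
// });
```

Because the grant type and token-type URNs are standardized, any provider implementing RFC 8693 can satisfy this request shape.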

Migration Strategy: Standards Make It Possible

Here's how standards-based architecture enables migration:

Step 1: Parallel Authentication

// Run both providers in parallel during migration
const oldProvider = new OIDCClient(oldConfig);
const newProvider = new OIDCClient(newConfig);

// Validate tokens from both
async function validateAnyToken(token) {
  try {
    return await validateToken(token, newProvider);
  } catch (err) {
    return await validateToken(token, oldProvider);
  }
}

Step 2: Gradual Cutover

// Route percentage of new authentications to new provider
const useNewProvider = Math.random() < 0.1; // 10% traffic
const provider = useNewProvider ? newProvider : oldProvider;

Step 3: Token Migration

// Exchange old tokens for new tokens
async function migrateToken(oldToken) {
  const claims = await validateToken(oldToken, oldProvider);
  const newToken = await newProvider.createToken({
    sub: claims.sub,
    email: claims.email,
    // Preserve all claims
  });
  return newToken;
}

This works because both providers use standard protocols. Try this with proprietary systems—you're rewriting everything.

Practical Recommendations

For new projects:

  1. Start with standards (OIDC + OAuth 2.0)
  2. Use established client libraries, not vendor SDKs
  3. Implement PKCE for web/mobile apps
  4. Use short-lived access tokens (15 min)
  5. Implement refresh token rotation

For existing projects:

  1. Audit current vendor lock-in
  2. Map proprietary features to standard equivalents
  3. Plan gradual migration strategy
  4. Consider abstraction layer for multiple providers

For all projects:

  • Never use implicit flow (deprecated)
  • Always use PKCE for public clients
  • Implement proper CSRF protection
  • Validate JWTs with standard libraries
  • Keep auth logic in your codebase via webhooks
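On that last point, "auth logic via webhooks" usually means the provider calls an endpoint you own during token issuance and merges the claims you return. The request/response shape below is a hypothetical contract, not any specific vendor's API; the point is that the enrichment logic is plain, version-controlled code in your repo rather than a platform-hosted rule:

```javascript
// Hypothetical claims-enrichment handler: given the user record the
// provider sends, return extra claims to embed in the token. Namespaced
// claim URIs avoid collisions with standard claims; the names here are
// illustrative.
function enrichClaims(user) {
  return {
    'https://example.com/roles': user.roles ?? [],
    'https://example.com/tenant': user.tenantId ?? 'default',
  };
}

// Mount this in any HTTP framework, e.g.:
// app.post('/webhooks/token-enrichment', (req, res) => {
//   res.json({ claims: enrichClaims(req.body.user) });
// });
```

Unlike platform-hosted rules, this function runs locally in tests and survives a provider switch unchanged.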

The Technical Bottom Line

The Twilio-Stytch acquisition matters because it combines:

  • Infrastructure developers already trust (Twilio)
  • Standards-based modern authentication (Stytch)
  • Support for emerging use cases (AI agents)

But the real lesson? Build on standards, not proprietary platforms.

Features come and go. Vendors consolidate and change. But OpenID Connect, OAuth 2.0, and JWT standards persist.

Choose platforms that respect these standards. Your future self will thank you when migration is configuration changes, not code rewrites.


Read the full article with additional platform comparisons and enterprise implementation strategies: https://guptadeepak.com/


About the Author

I'm Deepak Gupta, a serial entrepreneur who built a CIAM platform from scratch to $8M ARR through product-led growth. Currently building AI GTM Engineer for Cybersecurity. I write about practical implementations of authentication, AI, and cybersecurity.
