Theo Ezell (webMethodMan)
Rise of the Agentic "Strangler Fig" Strategy

TL;DR: Stop trying to rewrite your 15-year-old monolith from scratch. It won't work. Instead, use the Model Context Protocol (MCP) to wrap your legacy systems in AI Agents. It’s the "Strangler Fig" pattern on steroids. 🚀


Let’s be real for a second. The "Big Bang" rewrite is a lie we tell ourselves to sleep at night. 🛌

We’ve all been there. You stare at that "distributed monolith"—the calcified ball of mud running the core business—and you think, "If we just pause feature dev for 18 months, we can rewrite this in Rust/Go/Next.js and it will be perfect."

Spoiler Alert: It won't be.

The problem is the Code-to-Context Gap. Your legacy system contains decades of implicit business knowledge—edge cases, regulatory hacks, and weird bug fixes—that exist only in the code, not in Jira or Confluence. When you rewrite from scratch, you lose that context.

The smart alternative has always been Martin Fowler’s Strangler Fig pattern. You wrap the old system, intercept calls, and gradually route them to new microservices. It’s safer, but traditionally? It is painfully slow. 🐢

Building facades, mapping dependencies, and writing glue code is manual, soul-crushing work.

Enter the Agentic Strangler 🤖

Here is the new angle: Don't rewrite the backend yet. Wrap it with Agents.

By deploying AI agents equipped with Model Context Protocol (MCP) tools, you can implement a dynamic, intelligent Strangler Fig pattern that works at the speed of AI.

1. The Universal Adapter (MCP) 🔌
If you haven't looked at the Model Context Protocol (MCP) yet, you need to. Think of it as the "USB-C for AI".

It solves the N×M connectivity problem: instead of building a bespoke wrapper for every (agent, system) pair, you build one MCP server per legacy database or service, and every MCP-capable agent can use it. Unlike a stateless REST API, an MCP interface maintains context. It allows an AI agent to understand "User X" across a session, querying your 1980s mainframe and your 2024 cloud SQL database in the same breath.
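To make that concrete, here is a minimal, dependency-free sketch of the idea. Every name here (`LegacyMainframe`, `CloudSQL`, the tool methods) is invented for illustration; a real implementation would register these as tools with an MCP SDK. The point is the shape: one wrapper holds session context and exposes both backends through normalized tools.

```python
from dataclasses import dataclass, field

class LegacyMainframe:
    """Stand-in for a decades-old backend with its own record format."""
    def lookup_user(self, user_id: str) -> dict:
        return {"USR-ID": user_id, "ACCT-BAL": "00012750"}  # fixed-width relic

class CloudSQL:
    """Stand-in for the modern database."""
    def get_orders(self, user_id: str) -> list:
        return [{"order_id": "A-1", "user": user_id}]

@dataclass
class LegacyContextServer:
    """Wraps both backends behind one contextual interface.

    Unlike a stateless REST call, the session dict lets the server
    remember which "User X" the agent is talking about across tool calls.
    """
    mainframe: LegacyMainframe
    sql: CloudSQL
    session: dict = field(default_factory=dict)

    def set_user(self, user_id: str) -> None:
        self.session["user"] = user_id

    def tool_user_profile(self) -> dict:
        raw = self.mainframe.lookup_user(self.session["user"])
        # Normalize the mainframe's format so the agent never sees it
        return {"user": self.session["user"], "balance_cents": int(raw["ACCT-BAL"])}

    def tool_recent_orders(self) -> list:
        return self.sql.get_orders(self.session["user"])

server = LegacyContextServer(LegacyMainframe(), CloudSQL())
server.set_user("U-42")  # both tools below now share the same user context
```

Build this wrapper once, and every agent that speaks the protocol gets both systems for free: N agents plus M systems, not N×M integrations.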

2. The Agent as the Dynamic Facade 🎭
In the old world, your "facade" was a static API Gateway (like Kong or Apigee) with hardcoded routing rules. In the new world, the Agent is the facade.

An MCP-enabled agent receives a high-level intent:

"Process a refund for User X."

It checks its tools. 🛠️

It sees legacy_inventory_update (Monolith) and refund_service_v2 (New Microservice).

It intelligently chains them together.

You don't need to manually configure routes. The agent discovers the capabilities dynamically. If you deploy a new microservice tomorrow, the agent sees the new tool and starts using it automatically.
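A toy sketch of that discovery loop (tool names and the keyword "planner" are invented for illustration; in production the selection step would be an LLM call, not string matching):

```python
# Tools register themselves with descriptions; the agent discovers them
# at runtime instead of relying on hardcoded gateway routes.
TOOLS = {}

def tool(name, description):
    def register(fn):
        TOOLS[name] = {"description": description, "fn": fn}
        return fn
    return register

@tool("legacy_inventory_update", "Adjust stock levels in the monolith")
def legacy_inventory_update(sku: str) -> str:
    return f"monolith: restocked {sku}"

@tool("refund_service_v2", "Issue a refund via the new microservice")
def refund_service_v2(user: str) -> str:
    return f"refund issued for {user}"

def plan(intent: str) -> list:
    """Toy planner: pick every tool whose description matches the intent."""
    words = set(intent.lower().split())
    return [name for name, t in TOOLS.items()
            if words & set(t["description"].lower().split())]

# Deploy a new service tomorrow and the planner "sees" it automatically,
# with zero routing changes:
@tool("notify_service", "Send a refund confirmation email")
def notify_service(user: str) -> str:
    return f"emailed {user}"

steps = plan("process a refund for user X")
```

The registry is the whole trick: nothing upstream had to change when `notify_service` appeared.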


The Protocol of Doing: How to Build It 🛠️

Enough theory. How do we actually ship this?

Step 1: Design for Intent (Anti-Pattern Alert! 🚨)
Do not map your legacy APIs 1:1 to MCP tools. That is a massive anti-pattern. Legacy APIs are full of junk parameters and weird auth flows that will confuse an LLM.

❌ Bad (1:1 Mapping):

```json
{
  "name": "get_customer_data",
  "description": "Calls GET /api/v1/cust",
  "parameters": {
    "id": "string",
    "include_orders": "boolean",
    "sort": "string",
    "legacy_flag_23": "boolean"
  }
}
```

✅ Good (Intent-Based):

```python
@mcp.tool()
def get_customer_dashboard(email: str) -> str:
    """
    Retrieves a holistic view of the customer, aggregating
    data from the legacy CRM and the new Order Service.
    """
    # The complexity is hidden here!
    legacy_data = legacy_crm.lookup(email)
    new_data = order_service.get_latest(email)
    return format_dashboard(legacy_data, new_data)
```

Encapsulate the complexity inside the MCP server. Give the agent high-level "Business Intent" tools.

Step 2: Port Isolation via Proxying 🛡️
Don't expose your messy legacy ports directly. Use an MCP Proxy pattern (similar to what Microsoft is doing with Aspire).

  • The Proxy sits between the Agent and your infrastructure.
  • It handles dynamic port allocation.
  • The Agent sees a clean, consistent interface via stdio.
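A stripped-down sketch of the routing half of that proxy (this is an illustration of the pattern, not Aspire's actual implementation; the backend names and ports are invented):

```python
import json

# The agent speaks one clean JSON-per-line dialect; the proxy maps tool
# names to whatever host/port the backend happens to be on today.
BACKENDS = {
    # In real life these ports are allocated dynamically at startup.
    "legacy_crm":    {"host": "127.0.0.1", "port": 7311},
    "order_service": {"host": "127.0.0.1", "port": 9042},
}

def handle_line(line: str) -> str:
    """One request in, one response out -- the agent never sees a port."""
    req = json.loads(line)
    backend = BACKENDS.get(req["tool"])
    if backend is None:
        return json.dumps({"error": f"unknown tool {req['tool']!r}"})
    # A real proxy would open a socket to backend["host"]:backend["port"]
    # and forward the payload; here we just echo the routing decision.
    return json.dumps({"routed_to": backend["port"], "tool": req["tool"]})
```

Because the agent only ever sees `handle_line`'s stdio dialect, you can move, re-port, or replace a backend without touching a single agent.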

Step 3: The "Shadow Write" Pattern 👻
How do you migrate data without breaking production? Use the agent for Shadow Writes.

  • Agent receives a "Create Order" request.
  • Agent writes to the New Microservice database.
  • In parallel, it writes to the Legacy Monolith via the MCP wrapper.
  • It compares the results: Match? ✅ Great. Mismatch? ❌ Log the diff for engineers to fix.

This lets you validate the new system with live production traffic without risking data integrity.
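A minimal sketch of that flow, with the two "databases" as dicts (all names invented; real code would call the new service and the MCP-wrapped monolith, then diff their persisted records). The deliberately different rounding below is the kind of drift shadow writes exist to catch:

```python
new_db, legacy_db, diffs = {}, {}, []

def create_order_new(order):
    # New microservice rounds amounts to cents
    rec = {"id": order["id"], "amount_cents": round(order["amount"] * 100)}
    new_db[order["id"]] = rec
    return rec

def create_order_legacy(order):
    # Legacy monolith truncates floats -- a classic source of drift
    rec = {"id": order["id"], "amount_cents": int(order["amount"] * 100)}
    legacy_db[order["id"]] = rec
    return rec

def shadow_write(order):
    """Write to both systems, compare, and log any mismatch for engineers."""
    a = create_order_new(order)
    b = create_order_legacy(order)
    if a != b:
        diffs.append({"id": order["id"], "new": a, "legacy": b})
    return a  # production still serves from the system of record

shadow_write({"id": "o1", "amount": 19.99})  # 19.99 * 100 != exactly 1999.0
```

The `diffs` log is the payoff: every mismatch is a real, production-traffic bug report against the new system, found before any cutover.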


Real World Wins 🏆

Insurance: Companies like Sure are seeing a 95% faster quote-to-bind time. Their agents don't just chat; they use MCP tools to autonomously file claims in the legacy backend.

Supply Chain: Boomi is using this to unify fragmented WMS and ERP systems. Agents act as the glue, checking inventory across siloed systems and triggering restocking workflows that no single legacy app could handle alone.

The Verdict

The Agentic Strangler decouples your Technical Debt from your Business Value.

In a traditional migration, you have to pay down the debt (rewrite the code) before you get the value (new features). With MCP Agents, you get the value first. You deploy the agents, wrap the legacy data, and start shipping modern features immediately.

Be an integrator. Wrap the monolith, strangle the legacy, and let the agents do the work.


📚 Further Reading
Migrating legacy services to a modern developer portal: A technical guide to Backstage integration

Why MCP Shouldn’t Wrap an API One-to-One

Scaling AI Agents with Aspire: The Missing Isolation Layer for Parallel Development

The Protocol of Doing


Got a monolith horror story? Drop it in the comments. Let's commiserate. 👇
