Daniel Ley

The end of the mono channel — the web has two faces now

Do websites need to be rethought from now on?

Search engines read differently.
And soon they won’t read at all.

What used to be the crawler is now the Large Language Model (LLM). It no longer reads pages; it structures data, understands context, and cites brands that communicate with semantic clarity.

This changes everything.

Not just how content is written, but how websites are built, structured, and delivered.

Welcome to the age of Generative Engine Optimization (GEO), and with it, the era of MCP-readiness.

The end of the mono-channel web?

For decades, websites have served one main audience: people. Yes, there were always robots and crawlers in the background — but they weren’t the focus. Users visited sites through browsers, clicked through menus, and consumed content visually.

That era is ending.

Today, there are two equally important audiences:

  • Humans, who interact through visual interfaces
  • Machines, which process content through APIs, schemas, and language models

The web is splitting into two parallel realms:

  • The Human Web — everything users see, click, and experience
  • The Machine Web — everything machines can understand, connect, and reuse

Brands that continue to design only for the first will lose relevance in the second — and, ultimately, fade from the conversation altogether.

But beware: it’s not that simple. The website isn’t dead — it remains a vital touchpoint. It’s just one channel among many in an increasingly multi-layered web.

GEO — Optimization for Machines

Generative Engine Optimization (GEO) is about structuring content so it’s not just readable — it’s interpretable.

Machines don’t perceive “layouts” or emotions. They recognize patterns, relationships, and semantic depth. To communicate effectively with them, brands must embrace three core principles:

  1. Tokens over words — Large Language Models translate language into numerical patterns. Only well-structured text is truly understood.
  2. Semantics over style — Clean HTML hierarchies, consistent heading structures, and properly modeled data are non-negotiable.
  3. Citability over ranking — In the world of AI, visibility isn’t about page one anymore; it’s about being recognized — and cited — as a trusted source.

GEO is not just SEO with an AI twist. It's the foundation that determines whether your brand still appears at all when machines start to synthesize content.
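To make principles 2 and 3 a bit more tangible, here is a minimal, purely illustrative sketch in TypeScript: it emits schema.org structured data alongside a page, so the same content is both readable for humans and citable for machines. The article fields and values are hypothetical examples, not a prescription.

```typescript
// Illustrative sketch: emitting schema.org JSON-LD alongside the visible page,
// so the same content stays readable for humans and citable for machines.
// The article object below is invented example data.

interface ArticleContent {
  headline: string;
  author: string;
  datePublished: string; // ISO 8601 date
  summary: string;
}

function toJsonLd(article: ArticleContent): string {
  // schema.org is one widely used vocabulary for machine-interpretable content
  return JSON.stringify(
    {
      "@context": "https://schema.org",
      "@type": "Article",
      headline: article.headline,
      author: { "@type": "Person", name: article.author },
      datePublished: article.datePublished,
      description: article.summary,
    },
    null,
    2
  );
}

const article: ArticleContent = {
  headline: "The end of the mono channel",
  author: "Daniel Ley",
  datePublished: "2025-01-01",
  summary: "Websites now serve humans and machines as two distinct audiences.",
};

// Embedded as <script type="application/ld+json"> in the page head, this block
// stays invisible to users but is explicit and quotable for machines.
console.log(toJsonLd(article));
```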

MCP-Readiness — The Communication Layer for Machines

While the frontend remains the space where people interact, a second, parallel layer is emerging — one built not for users, but for machines. This is where a Model Context Protocol (MCP) comes into play: a structural interface through which machines can communicate directly with brands and organizations — not visually, but semantically.

What MCP Does

  • Delivers content via structured APIs, JSON, and semantic models
  • Establishes a standardized communication layer between content systems and AI systems
  • Ensures that brand messages remain understandable, even without a visual interface

In essence, MCP-readiness is becoming a new technical and strategic discipline. Alongside the traditional website, brands must begin publishing content that is AI-compatible and machine-readable.
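Because no shared MCP standard exists yet (more on that below), the following TypeScript sketch is nothing more than an assumption about what such a machine-facing payload could look like; every field name is invented for illustration.

```typescript
// Purely illustrative sketch of a machine-facing content payload.
// No established MCP standard exists yet; the shape below is an assumption.

interface MachineContent {
  id: string;
  type: "product" | "service" | "article";
  attributes: Record<string, string | number | boolean>;
  relations: { rel: string; targetId: string }[]; // semantic links between entities
  source: { brand: string; lastUpdated: string }; // provenance, so AI systems can cite it
}

// A hypothetical handler that returns the same content the website renders,
// but as structured JSON instead of HTML.
function getMachineContent(id: string): MachineContent {
  return {
    id,
    type: "article",
    attributes: { headline: "Fiber availability check", language: "en" },
    relations: [{ rel: "offeredBy", targetId: "brand-001" }],
    source: { brand: "Example Corp", lastUpdated: "2025-01-01" },
  };
}

console.log(JSON.stringify(getMachineContent("content-42"), null, 2));
```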

MCP-Readiness Is Still in Its Infancy

We’re still at the early stages of this shift. Most existing MCPs are experimental prototypes — research initiatives, proofs of concept, or early integrations. There are few real-world use cases, and no shared standards yet.

There are no fixed protocols, no common models, and no established semantics for how machine-consumable content should be structured or interpreted.

Still, the direction is clear: Brands that structure their systems modularly, semantically, and flexibly today are building the foundation to adapt quickly when these standards do emerge.

That’s why MCP-readiness is less a technology than a mindset — a strategic way of preparing for the next evolution of the web.

What Might This Look Like in Practice?

The following examples illustrate how MCPs could already be applied — even if only conceptually for now. Because MCPs rarely exist today as clearly defined products or platforms, these examples serve to clarify the underlying idea:

It’s about structuring content and services so they’re understandable, accessible, and reusable — not only for humans, but also for machines.

The goal, therefore, is not to build your own “MCP” right away,
but to prepare your systems and content so that this step becomes technically feasible and strategically meaningful in the near future.

Three Illustrative Examples of MCPs

1. MCP for Product Data

Example: A tool manufacturer provides its product information — such as dimensions, materials, and certifications — through an API.
This enables AI systems, procurement platforms, and B2B search engines to automatically analyze and interpret the data.

What happens:

  • The webshop remains the channel for humans
  • The API becomes the MCP channel for machines
  • AI models and marketplaces can access and process the data directly

Core idea: Structured product data becomes the semantic interface between brand and machine.
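A hedged sketch of what such a product-data channel might look like, in TypeScript. The product, its fields, and the lookup function are invented for illustration; a real implementation would sit behind an HTTP API.

```typescript
// Hypothetical sketch of a product-data endpoint: the same information shown in the
// webshop, exposed as structured JSON for AI systems, procurement platforms, and
// B2B search engines. All names and values are invented.

interface Product {
  sku: string;
  name: string;
  dimensionsMm: { length: number; width: number; height: number };
  material: string;
  certifications: string[]; // e.g. safety or quality standards
}

const catalog: Product[] = [
  {
    sku: "DRL-500",
    name: "Cordless drill 500",
    dimensionsMm: { length: 240, width: 80, height: 210 },
    material: "glass-fibre reinforced polyamide",
    certifications: ["CE"],
  },
];

// The machine channel can be as simple as a read-only lookup that always returns
// the same structured shape, regardless of which system asks.
function getProduct(sku: string): Product | undefined {
  return catalog.find((p) => p.sku === sku);
}

console.log(JSON.stringify(getProduct("DRL-500"), null, 2));
```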

2. MCP for Complex Services

Example: An energy provider offers a fiber-optic availability check. In addition to the website where users can manually enter their address, the same service is exposed via a structured MCP interface.

This allows a voice assistant, partner portal, or AI application to check availability directly — without ever visiting the website.

What happens:

  • The website remains the channel for humans
  • The MCP provides the same service as an API
  • Machines — assistants, chatbots, or LLMs — can access the service autonomously

Core idea: Complex services become machine-readable, making them easily integrable into other systems or AI-driven assistants.
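Again as a purely hypothetical sketch: the availability check modeled as a machine-callable function. The request and response shapes, and the coverage data, are invented; a real service would query the provider's backend systems.

```typescript
// Hypothetical sketch: the fiber-availability check as a machine-callable service.
// A voice assistant or LLM agent would call this function (or an HTTP endpoint
// wrapping it) instead of filling in the address form on the website.

interface AvailabilityRequest {
  street: string;
  houseNumber: string;
  postalCode: string;
  city: string;
}

interface AvailabilityResult {
  available: boolean;
  maxBandwidthMbit?: number;
  note?: string;
}

// Invented coverage data for illustration only.
const coveredPostalCodes = new Set(["10115", "20095", "50667"]);

function checkFiberAvailability(req: AvailabilityRequest): AvailabilityResult {
  if (coveredPostalCodes.has(req.postalCode)) {
    return { available: true, maxBandwidthMbit: 1000 };
  }
  return { available: false, note: "No fiber coverage at this address yet." };
}

console.log(
  checkFiberAvailability({
    street: "Examplestraße",
    houseNumber: "1",
    postalCode: "10115",
    city: "Berlin",
  })
);
```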

3. Corporate Data Hub

Example: A company consolidates its CMS, DAM, PIM, and CRM data into a single semantic API layer. This layer becomes the MCP interface for both internal and external AI systems.

What happens:

  • All data is maintained in one central source
  • Machines access it through a standardized interface
  • AI assistants, supplier portals, and research platforms all rely on the same dataset

Core idea: The MCP acts as the company’s central, machine-readable source of truth — a unified interface for AIs, APIs, and autonomous agents.
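A conceptual sketch of such a hub in TypeScript. The source-system records and the merge logic are invented placeholders; the point is simply that every machine channel reads from one unified entity.

```typescript
// Hypothetical sketch of a unified semantic layer over several source systems.
// Record shapes and merge logic are invented for illustration only.

interface CmsRecord { slug: string; title: string; body: string }
interface PimRecord { sku: string; name: string; priceEur: number }
interface DamRecord { assetId: string; url: string; altText: string }

// The hub merges records from different systems into one entity
// that every machine channel reads from.
interface UnifiedEntity {
  id: string;
  kind: "product";
  content: CmsRecord;
  productData: PimRecord;
  media: DamRecord[];
}

function buildUnifiedEntity(cms: CmsRecord, pim: PimRecord, dam: DamRecord[]): UnifiedEntity {
  return { id: pim.sku, kind: "product", content: cms, productData: pim, media: dam };
}

const entity = buildUnifiedEntity(
  { slug: "cordless-drill-500", title: "Cordless drill 500", body: "Compact 18V drill…" },
  { sku: "DRL-500", name: "Cordless drill 500", priceEur: 129 },
  [{ assetId: "img-1", url: "https://example.com/drl-500.jpg", altText: "Cordless drill 500" }]
);

console.log(JSON.stringify(entity, null, 2));
```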

From Hard-Coded Content to Building Adaptive Experiences with AI

Marcelo Lewin describes a vivid example in his article From Hard-Coded Content to Building Adaptive Experiences with AI.

He shows how a single CMS entry could, in the future, contain two layers:
a deterministic one for fixed content and a non-deterministic one with contextual instructions for AI systems.

In his example about setting up a smart thermostat, the instruction text for humans is supplemented with additional, naturally worded notes that an LLM can use to adapt the content individually, for example according to the device, location, or user behavior.

This dual-layer logic illustrates where MCP-readiness leads: content is not only published, but also described in a machine-understandable way so that it can be reused in a context-aware manner.
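A minimal sketch of how such a dual-layer entry could be modeled. The field names are assumptions made for this illustration, not Lewin's actual schema and not a finished content model.

```typescript
// Minimal sketch of a dual-layer CMS entry, loosely following the idea described above.
// Field names are assumptions for illustration, not Lewin's actual schema.

interface DualLayerEntry {
  // Deterministic layer: fixed, editor-approved content, always shown as written.
  content: {
    title: string;
    steps: string[];
  };
  // Non-deterministic layer: natural-language guidance an LLM may use to adapt the content.
  aiGuidance: {
    adaptBy: string[];    // dimensions the model may personalize on
    instructions: string; // plainly worded hints for the model
  };
}

const thermostatGuide: DualLayerEntry = {
  content: {
    title: "Setting up a smart thermostat",
    steps: [
      "Mount the thermostat on the wall bracket.",
      "Connect it to your home Wi-Fi.",
      "Set your preferred temperature schedule.",
    ],
  },
  aiGuidance: {
    adaptBy: ["device model", "location", "user behavior"],
    instructions:
      "If the user's device model is known, reference its exact menu names. " +
      "Adjust scheduling advice to the local climate when a location is available.",
  },
};

console.log(JSON.stringify(thermostatGuide, null, 2));
```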

The Frontend Remains — but as One of Several Channels

The frontend will remain the most visible channel for now — the familiar interface between brand and user. But from a technical perspective, it’s already just one of several output channels drawing from the same structured content.

Instead of viewing the web as two distinct layers — a graphical interface for humans and a machine layer for systems — we now design from a shared structural core. This core semantically defines content and components, independent of where or how they’re displayed.

The Paradigm Shift

The origin of a website is no longer the frontend itself, but the structured representation of its content — the layer that both humans and machines can interpret.

From this single core, every channel can be derived:

  • the visual frontend for human users
  • a potential MCP layer for machine interfaces
  • and other contextual outputs such as chatbots, voice assistants, documentation, or print.

This redefines frontend design. It’s no longer about building isolated surfaces, but about translating a semantic foundation into an experience.

Consistency no longer comes from synchronizing multiple systems — it emerges naturally from a shared structural source.
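As a small illustration of this shift, the following sketch derives two channels from one structural core: an HTML rendering for humans and a JSON rendering for machines. The content model and both renderers are invented for the example.

```typescript
// Illustrative sketch: one structural core, several derived channels.
// The content model and the renderers are invented for this example.

interface TeaserBlock {
  kind: "teaser";
  heading: string;
  text: string;
  linkUrl: string;
}

// Channel 1: the visual frontend (reduced here to an HTML string).
function renderHtml(block: TeaserBlock): string {
  return `<section><h2>${block.heading}</h2><p>${block.text}</p><a href="${block.linkUrl}">Read more</a></section>`;
}

// Channel 2: a machine channel — the same block as structured JSON.
function renderMachine(block: TeaserBlock): string {
  return JSON.stringify(block);
}

const teaser: TeaserBlock = {
  kind: "teaser",
  heading: "Fiber for your home",
  text: "Check availability at your address.",
  linkUrl: "https://example.com/fiber-check",
};

console.log(renderHtml(teaser));    // what people see
console.log(renderMachine(teaser)); // what machines consume
```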

From Structural Core to Finished Page — A Prototype

Our prototype brings this concept to life. It uses a Large Language Model (LLM) to generate an entire page layout directly from the content stored in the CMS — not by relying on static templates, but by understanding the structural descriptions of each component.

Every component within the CMS is defined through a JSON schema. This means the model knows which data fields exist, how they relate to one another, and in which contexts a component should be used.
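For illustration, this is roughly what such a schema-based component description could look like. The component and its fields are simplified placeholders, not the actual definitions used in the prototype.

```typescript
// Illustrative sketch of a CMS component described with JSON Schema, so that an LLM
// knows which fields exist, their types, and when the component should be used.
// The component and its fields are simplified placeholders.

const teaserComponentSchema = {
  $schema: "https://json-schema.org/draft/2020-12/schema",
  title: "Teaser",
  description: "Short promotional block linking to a detail page. Use at most once per page section.",
  type: "object",
  properties: {
    heading: { type: "string", maxLength: 80 },
    text: { type: "string", description: "One or two sentences, no markup." },
    linkUrl: { type: "string", format: "uri" },
  },
  required: ["heading", "linkUrl"],
  additionalProperties: false,
} as const;

// Handing this schema to the model (for example as part of a prompt) tells it which
// fields exist and how they relate, before any layout is generated.
console.log(JSON.stringify(teaserComponentSchema, null, 2));
```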

The LLM leverages this structure to:

  • select the right modules,
  • assemble content contextually,
  • and generate a complete page draft — which is then written back into the CMS for editors to review and refine.

The result is a page that:

  • is built from semantic structure,
  • is rendered consistently in the frontend,
  • and could, in theory, be delivered through any other channel as well.

In this model, the structural core becomes the single source of truth. The frontend and any future MCP or machine channels are simply different representations of that same foundation.
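To sketch the overall flow conceptually: the snippet below uses placeholder interfaces instead of real SDK calls. `LlmClient` and `CmsClient` are hypothetical names, and the actual prototype may be wired quite differently.

```typescript
// Conceptual sketch of the generation loop, with placeholder interfaces instead of
// real SDK calls. `LlmClient` and `CmsClient` are hypothetical.

interface LlmClient {
  complete(prompt: string): Promise<string>; // returns the model's raw text output
}

interface CmsClient {
  getComponentSchemas(): Promise<object[]>;              // JSON Schemas of all components
  saveDraft(page: { blocks: object[] }): Promise<void>;  // writes the draft back for editors
}

async function generatePageDraft(llm: LlmClient, cms: CmsClient, brief: string): Promise<void> {
  const schemas = await cms.getComponentSchemas();

  // The schemas tell the model which modules exist and how they are structured;
  // the brief describes what the page should communicate.
  const prompt =
    `You assemble pages from these components (JSON Schemas):\n` +
    `${JSON.stringify(schemas)}\n\n` +
    `Brief: ${brief}\n` +
    `Return a JSON object of the form {"blocks": [...]}, using only valid components.`;

  const raw = await llm.complete(prompt);
  const draft = JSON.parse(raw) as { blocks: object[] };

  // The draft is written back into the CMS, where editors review and refine it.
  await cms.saveDraft(draft);
}
```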

🎥 Video: Our prototype in action on YouTube

Headless as the Key — Why Structure Connects Everything

For a multi-channel model to work, a CMS must treat content not just as something visual, but as something structural.

This is where systems like Storyblok come into play. As a headless CMS, it completely separates content management from presentation and delivers content in a modular, component-based form.

This separation creates the foundation for serving different output channels — today the frontend, tomorrow potentially an MCP layer. Such a channel isn’t automatically provided by the CMS; it must be consciously designed. It’s not a visual output — it’s a semantic channel, built for machines, not humans.

Right now, this layer can’t truly be visualized — because its output will be interpreted and personalized dynamically by AI systems.

However, organizations that already structure their content stacks around the three GEO principles (tokens over words, semantics over style, and citability over ranking) are well prepared to keep pace with what's coming.

Our Frontend Package plays a crucial role in this. It was designed from the ground up with a structural mindset — each component is explicitly defined, both in content and in code. This ensures that CMS and frontend already speak the same language, forming a solid foundation on which future machine-readable channels can easily be built.

The Result

  • Content can be distributed consistently and structurally across multiple channels.
  • New layouts or landing pages can be assembled quickly and modularly.
  • Systems accessing semantic data — whether AI models or future MCPs — encounter clearly defined components.

Headless is therefore more than an architectural choice. It’s the prerequisite for a web that serves humans and machines alike from one shared structural core.

Structure becomes the common currency.
Headless delivers it, the frontend expresses it,
and AI systems understand it.

Conclusion: The Future of the Web Is Dual-Channel

The web is evolving into two interconnected worlds:

  • The Human Web — visual, emotional, and interactive experiences for people
  • The Machine Web — structured, semantic, and citable data for AI systems

Brands that invest today in GEO, Headless, and MCP-readiness secure their presence in both. They will be seen, understood, and quoted — by humans and machines alike.

Brand communication doesn’t end at the screen.
It begins where machines start to read.

What Companies Should Do Now

  • Structure content consistently and semantically
  • Prioritize headless architectures
  • Design frontends and content models as one cohesive system

Because MCP-readiness isn’t a project milestone — it’s a mindset for how digital brands will be built in the future.

The question is no longer whether this shift will happen — but how fast we’ll adapt to it. How is your organization preparing for a world where machines are part of your audience?
