DEV Community

guanjiawei

Posted on • Originally published at guanjiawei.ai

In the Age of Agents, Who Is Your Infrastructure Built For?

MiniMax's market cap has surpassed Baidu's. I think this was inevitable.

Baidu did something fatal to the entire industry: it blocked the flow of knowledge. From both an individual and industry perspective, this company is a massive obstacle.

What Search Engines Should Do

Search engines don't produce knowledge. People write things online; search engines send crawlers to collect this content, aggregate it, and make it easy for others to find. It's just a pipe.

In the PC era, this pipe worked well enough. People opened browsers, typed keywords, and viewed web pages. Baidu did indeed monopolize Chinese search during that phase.

The mobile internet era changed everything.

Baidu Fell Behind in the Mobile Era

In the mobile internet era, users no longer just opened web pages. Countless apps needed information and search capabilities, but these apps weren't humans viewing web pages—they were programs calling APIs. Essentially, the paradigm shifted from human-centric to device-centric and app-centric. What you needed were APIs that allowed machines to easily access your services.

Look at what Google did. It is also a search engine, and it also bears the cost of aggregating knowledge. Yet it offers a free quota and charges by usage beyond that, so any developer can integrate search capabilities into their app on demand.

What about Baidu? It doesn't even have a decent API.

How ridiculous is this? I previously worked on a small news-bot project and wanted Chinese-language search. After trying various options, I found that Google was more convenient for searching Chinese content than Baidu, and even Russia's Yandex surfaced Chinese information better than Baidu did. The hardest way to get Chinese information while physically in China turned out to be using China's largest search engine. That still strikes me as absurd.

You might ask: if there's no open API, can't you at least simulate a human browsing the web pages? That doesn't work either. The anti-scraping measures are strict; you get blocked after two or three attempts. Eventually every developer makes the same choice: forget it, use Bing, use Google, use anything, just don't touch Baidu.

Precisely because Baidu isn't open, Sogou, 360, and other second-tier search services have managed to survive. Compare their prices to Google's API and these alternatives are actually more expensive. The ecosystem has been distorted.

Everyone complains about Baidu's pay-per-click ads, but the ones who really suffer aren't end users; they're developers. You can tolerate ads, but having no API blocks your path completely.

The Age of Agents Is Here, and the Path Gets Narrower

Now that the Age of Agents has arrived, this problem will only get worse.

Our team already conducts daily work with Agents at the core. We write code with Claude Code, search for information using various research agents, and use OpenClaw to operate computers. Fewer and fewer people are directly operating apps or opening web pages.

People will get lazier. When you want to know something, you won't search for it yourself—you'll tell an Agent: "Help me look this up." Intelligence is getting cheaper, and you'll take it for granted. Whether on your phone or computer, your primary interaction with it will be speaking or typing, letting it do the work for you.

So how will traffic composition change? I'll roughly estimate: previously it might have been 50% human traffic, 50% machine traffic. Going forward it will become 50% machine, 40% Agent, 10% human.

In this environment, if your infrastructure isn't Agent-friendly—no APIs, no MCP, no interfaces that help Agents understand and use it—where will your traffic come from?

Baidu wasn't open in the machine era. In the Agent era, it still offers no MCP server or anything else that makes Agent calls easy. Machines can't access it, and Agents can't access it. So who will use it?
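To make "an interface that Agents can understand and use" concrete, here is a minimal sketch of a tool definition in the shape the Model Context Protocol (MCP) uses: a name, a description, and a JSON Schema for the input. The `baidu_search` tool itself is hypothetical; nothing like it actually exists today, which is the point of the article.

```python
# Hypothetical MCP-style tool definition. An agent can only plan a call
# against a service if the service describes itself in a shape like this.
search_tool = {
    "name": "baidu_search",  # made-up name for illustration
    "description": "Search the Chinese web and return ranked results.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "query": {"type": "string", "description": "Search terms."},
            "limit": {"type": "integer", "minimum": 1, "maximum": 50},
        },
        "required": ["query"],
    },
}

def is_agent_callable(tool: dict) -> bool:
    """An agent can discover and invoke a tool only if all three parts exist."""
    return all(k in tool for k in ("name", "description", "inputSchema"))

print(is_agent_callable(search_tool))  # True
print(is_agent_callable({"name": "pretty_homepage"}))  # False
```

A human-facing web page carries the same information implicitly, but an Agent has to guess at it; a schema like this removes the guessing.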

Business Models Need Rethinking

I saw something quite interesting. There was an open source project whose business model was embedding soft ads in the code to sell paid upgrades. Then AI coding agents became popular, the project's traffic exploded, and revenue dropped to zero. Why? When Agents fetch the code, they simply filter out the ads; there's no reason to pass that promotional information along to their owners.

The business model simply stopped working.

But you can't shut the door over this; if you close the door, you lose even the traffic. What to do? Set thresholds: give a free quota, charge when usage exceeds it, and switch to volume pricing when usage gets extreme. It's not complicated.
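The threshold model above really is just a few lines of arithmetic. Here is a minimal sketch; the quota, prices, and tier boundary are all made-up numbers, not anyone's actual pricing.

```python
def monthly_bill(calls: int,
                 free_quota: int = 1_000,
                 unit_price: float = 0.005,
                 bulk_threshold: int = 100_000,
                 bulk_price: float = 0.003) -> float:
    """Free up to a quota, per-call pricing beyond it, and a cheaper bulk
    rate once usage gets extreme. All numbers are illustrative."""
    if calls <= free_quota:
        return 0.0
    if calls <= bulk_threshold:
        return (calls - free_quota) * unit_price
    # Calls in the normal tier, then the remainder at the bulk rate.
    normal_tier = (bulk_threshold - free_quota) * unit_price
    bulk_tier = (calls - bulk_threshold) * bulk_price
    return normal_tier + bulk_tier

print(monthly_bill(500))      # 0.0 -- inside the free quota
print(monthly_bill(10_000))   # 9,000 billable calls at the normal rate
```

Hobbyists and small Agents stay free and keep sending you traffic; heavy machine usage pays its way.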

At the end of the day, Agents will act as proxies for humans, increasingly participating in various activities in the digital world. They need new infrastructure.

Separate Cars from Pedestrians, Separate Humans from Machines Too

I think a better architecture is to implement "human-machine separation." Just like separating cars from pedestrians, these three types of traffic—humans, traditional machines, and AI Agents—should actually be handled separately, with their own respective channels.

There are many benefits to separation. Efficiency is one aspect, but security, auditability, and monitoring are also easier to implement. Interfaces for humans focus on experience; interfaces for Agents focus on structure and callability. Each goes their own way.
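One minimal way to picture "each goes their own way" is a dispatcher that routes the same query to a channel suited to the caller. The three client labels and response shapes here are invented for illustration, not a real protocol.

```python
def respond(client: str, query: str) -> dict:
    """Route one query to a channel suited to the caller.
    Client labels ("human"/"machine"/"agent") are illustrative."""
    if client == "human":
        # Humans get a rendered page optimized for experience.
        return {"channel": "html", "body": f"<h1>Results for {query}</h1>"}
    if client == "machine":
        # Traditional programs get a plain JSON API.
        return {"channel": "json", "query": query, "results": []}
    if client == "agent":
        # Agents get structured results plus the schema needed to call again.
        return {"channel": "tool", "results": [], "schema": {"query": "string"}}
    raise ValueError(f"unknown client type: {client}")

print(respond("agent", "minimax market cap")["channel"])  # tool
```

Separate channels also make the other benefits mechanical: you can rate-limit, audit, and monitor each lane on its own terms instead of treating an Agent like a suspicious human.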

There's huge room for growth here.

Start Thinking Now

If you want to start a business, or participate in this round of infrastructure building, think about one question: how do you design infrastructure for AI Agents? Such things are currently extremely scarce, but demand is growing fast.

There are already some projects overseas working on this, such as various skills and tools to help Agents do research better. I've recently been using something called MiroThinker, recommended by a colleague—essentially it's a research agent. You give it an idea, and it browses large numbers of web pages, collects information, and summarizes viewpoints on its own. The results are often better than some paid products. And it automatically adjusts the depth of research based on your question's complexity, deciding how long to spend searching.

Most of my searches now come from Claude Code and such research agents. Probably only 5% of the time—when it's so simple that a casual search suffices—do I use traditional search engines. Baidu? I barely use it anymore.

Think Value First, Competition Second

Before thinking about competition, think about value. Are you actually creating it?

Baidu does search, which should have helped humanity better aggregate and share knowledge. But in the mobile era it blocked interfaces, and in the Agent era it still hasn't prepared for new usage patterns. This isn't called creating value—this is called destroying value.

If you're destroying value, the market will price you with increasing caution. This isn't a competitiveness issue; it's a direction issue. Who is your infrastructure actually built for? Figure that out before talking about anything else.


Originally published at https://guanjiawei.ai/en/blog/agent-era-infra
