
The Bad Coin

Forty-seven percent of recent Medium posts are likely AI-generated, up from 3.4 percent in 2018. The mechanism driving good content off the open web and behind paywalls was first described in 1558.

In October 2024, Wired commissioned two independent analyses of Medium — one of the largest open writing platforms on the web. Pangram Labs examined 274,466 recent posts using a transformer-based classifier trained to distinguish AI-generated text from human text. Over forty-seven percent were flagged as likely AI-generated. A second firm, Originality AI, analyzed a separate sample and found roughly forty percent — compared to 3.4 percent in a 2018 baseline.

Medium is not an outlier. It is a leading indicator.

Imperva's 2025 Bad Bot Report found that automated traffic surpassed human activity for the first time in the report's twelve-year history. Fifty-one percent of all web traffic is now generated by bots — thirty-seven percent classified as malicious, fourteen percent as legitimate crawlers. The proportion has increased for six consecutive years. Gartner projects that by the end of 2026, more than a third of all web content will have been created specifically for AI-powered search consumption rather than for human readers.

Something is happening to the information commons that has a name. The name is four hundred and sixty-eight years old.


The Fixed Exchange Rate

In 1558, Sir Thomas Gresham advised Queen Elizabeth that debased coins were driving full-weight coins out of circulation. The mechanism was straightforward: if the crown fixed the exchange rate between good coins and bad — declaring them legally equal — rational actors would spend the debased coins and hoard the genuine ones. Bad money drives out good. Not by force. By economics.

The enabling condition is the fixed exchange rate. If the rate floated — if a full-weight coin traded at a premium over a clipped one — both would circulate at their real value. The peg creates the dysfunction.

James Grimmelmann, a professor of digital and information law at Cornell, has applied the term directly: Gresham's Law 2.0 — bad content drives out good. The mapping is precise.

The currencies are human-generated content — expensive to produce, grounded in experience, genuinely informative — and synthetic content — cheap to produce, superficially plausible, optimized for engagement metrics. The fixed exchange rate is the platform algorithm. Google's search ranking, Medium's recommendation engine, and social media feeds assign the same face value to a researched article and a synthetic approximation of one. Under this peg, the economics are identical to 1558. The cheap currency circulates. The expensive currency withdraws.
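
The economics of the peg fit in a few lines of code. The sketch below is not any platform's actual ranking function: the numbers are invented, and the revenue line is a hypothetical stand-in for whatever reward rank confers. It only makes the incentive visible.

```python
# A toy model of the peg, not any real platform's ranking function.
# All numbers are invented; "revenue" is a hypothetical stand-in.

from dataclasses import dataclass

@dataclass
class Post:
    kind: str                # "human" or "synthetic"
    production_cost: float   # dollars spent producing it
    engagement_score: float  # all the ranker ever sees

def rank_value(post: Post) -> float:
    # The peg: face value depends only on engagement, never on
    # provenance or on what the content cost to produce.
    return post.engagement_score

human = Post("human", production_cost=500.00, engagement_score=0.72)
synthetic = Post("synthetic", production_cost=0.40, engagement_score=0.70)

for post in (human, synthetic):
    revenue = 100 * rank_value(post)  # hypothetical payout per unit of rank
    roi = (revenue - post.production_cost) / post.production_cost
    print(f"{post.kind:>9}: rank value {rank_value(post):.2f}, ROI {roi:+.0%}")
```

Both posts trade at nearly the same face value; one producer loses most of the outlay and the other multiplies it more than a hundredfold. At that exchange rate, only one kind of mint stays open.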

The withdrawal is visible. Newsletters behind paywalls. Expertise behind membership walls. Communities behind Discord servers. The creators of the expensive currency are doing exactly what sixteenth-century citizens did with full-weight coins: hoarding them, because spending them at face value is a loss.


The Reproduction Problem

The information version of Gresham's Law has a feature the monetary version does not.

Bad coins circulate, degrade through physical use, and eventually disappear. They do not reproduce. Bad information does.

Each generation of synthetic content enters the open web. The next generation of language models trains on the open web. The training ingests synthetic content alongside genuine content without distinction — the fixed exchange rate operates at the data pipeline level too. The next generation produces output that is a compression of a compression. Its Kolmogorov complexity per unit of apparent content decreases with each cycle.

This is what researchers call model collapse, but Gresham's Law reveals the mechanism more precisely: the fixed exchange rate does not merely allow the debased currency to circulate. It channels the debased currency into the mint, which produces more debased currency, which enters circulation, which enters the mint. In monetary Gresham's Law, debasement is linear. In information Gresham's Law, debasement compounds.
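
The compounding is easy to see in miniature. The sketch below is a toy in the spirit of the model-collapse literature, not a real training pipeline: generation zero is the genuine distribution, and every later generation is fit only to samples drawn from the one before it.

```python
# A toy of the compounding loop: each generation fits a Gaussian to a
# small corpus sampled from the previous generation's fit. Nothing here
# is a language model; the point is only the recursion.

import random
import statistics

random.seed(42)
mu, sigma = 0.0, 1.0   # generation 0: the genuine distribution
N = 10                 # a small, finite synthetic corpus each cycle

for gen in range(1, 101):
    corpus = [random.gauss(mu, sigma) for _ in range(N)]  # the mint's output
    mu = statistics.fmean(corpus)       # refit on synthetic data only
    sigma = statistics.stdev(corpus)    # a compression of a compression
    if gen % 20 == 0:
        print(f"generation {gen:3d}: sigma = {sigma:.4f}")

# Refitting on your own output makes sigma a downward-biased random
# walk: over enough cycles it collapses toward zero, tails first.
# Monetary debasement subtracts a fixed amount of silver per coin;
# this loop multiplies its loss each pass, so the debasement compounds.
```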


The Peg Breaks

Gresham's Law has a canonical resolution. It always ends the same way: the fixed exchange rate breaks. The government abandons the peg. The currencies separate. Good and bad find their own price.

The information commons is splitting into two ecosystems. The first is the synthetic commons — the open web, free, increasingly machine-generated, optimized for volume and freshness. This is the debased currency. It circulates freely because it costs nothing to produce.

The second is the human commons — paywalled, authenticated, curated. Substack, Discord, private communities, corporate intranets, membership platforms. Optimized for accuracy, depth, and trust. This is the hoarded currency, withdrawn from public circulation because its creators will not spend it at the fixed exchange rate.

The protocol layer anticipated this split. HTTP 402 (Payment Required) was reserved in the HTTP/1.1 specification in 1997 with the note "reserved for future use." Twenty-nine years later, four implementations are racing to fill that blank. Coinbase's x402 protocol processed seventy-five million transactions and twenty-four million dollars in volume by December 2025. OpenAI and Stripe jointly launched the Agentic Commerce Protocol, live inside ChatGPT since September 2025. Google released AP2 with over sixty launch partners, including Mastercard and PayPal. Lightning Labs built L402 for Bitcoin-native micropayments.

A status code that HTTP's designers left blank for twenty-nine years is being filled because the peg is breaking. HTTP 402 is the machine-readable marker for the bifurcation: this content has a price because its information content is worth pricing.
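
At the wire level, filling the blank is simple. The sketch below uses only Python's standard library and is not the x402, ACP, AP2, or L402 wire format; the header name and receipt check are invented stand-ins for real payment verification.

```python
# A minimal 402-gated resource. Illustrative only: the X-Payment-Receipt
# header and the token set are hypothetical, not any live protocol's format.

from http.server import BaseHTTPRequestHandler, HTTPServer

PAID_RECEIPTS = {"demo-receipt-123"}  # stand-in for real payment verification

class PaywallHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        receipt = self.headers.get("X-Payment-Receipt", "")
        if receipt in PAID_RECEIPTS:
            body = b"Full-weight, human-written content.\n"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
        else:
            # The long-reserved status code, finally carrying meaning:
            # here is the price; come back with proof of payment.
            body = b'{"price_usd": "0.01", "pay_to": "example-address"}\n'
            self.send_response(402)
            self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8402), PaywallHandler).serve_forever()
```

A plain `curl -i http://127.0.0.1:8402/` gets the 402 and a price; the same request with the receipt header gets the content. What the four live protocols disagree about is how that receipt is produced and verified.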


The Self-Limiting Paradox

The bifurcation creates a paradox that monetary Gresham's Law never faced.

When citizens hoarded good coins, the economy suffered, but the coins were not destroyed. When creators hoard good information behind paywalls, the training pipeline loses access to it.

Each cycle: AI generates more synthetic content. The open web degrades. Humans retreat further behind walls. The next generation of models trains on the degraded public web. Its capability on novel situations — situations that required the genuine information now behind walls — declines. Less capable models generate less accurate content. The open web degrades further.

The process is self-limiting. But it does not self-limit to a healthy equilibrium. It self-limits to an impoverished one: a synthetic public web of diminishing information content, and a walled human commons of increasing value but decreasing accessibility.

The success of content generation is the failure of content generation. Each generation of models that floods the web with plausible text erodes the foundation on which the next generation's capability depends. This is not a deployment mistake. It is the mechanism.


The Exception

One class of information institution escapes this dynamic entirely.

Prediction markets operate on a floating exchange rate. A bad prediction costs real money. The differential cost — the gap between saying something true and saying something false — is nonzero and immediate. This is exactly the condition that breaks Gresham's Law: let the exchange rate float, and both currencies circulate at their real value.

On the open web, a synthetic article that sounds authoritative but says something wrong costs nothing to its author. On a prediction market, a wrong claim costs exactly what it is wrong by. Bad information has a price. When bad information has a price, the dynamic reverses: good information drives out bad, because bad information is expensive to hold.
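
The floating exchange rate can be made concrete with a market scoring rule. The sketch below uses Hanson's logarithmic market scoring rule, a canonical way to price probabilistic claims; the probabilities are invented, and real market microstructure (order books, fees, liquidity) is messier.

```python
# A trader who moves the market to their reported probability is paid
# the score difference at settlement, so a wrong claim costs exactly
# what it is wrong by. All probabilities below are invented.

import math
import random

random.seed(7)

def log_score(prob: float, outcome: int) -> float:
    # Reward for having assigned probability `prob` to what happened.
    return math.log(prob if outcome else 1.0 - prob)

def avg_payout(report: float, market_prior: float, true_prob: float,
               rounds: int = 200_000) -> float:
    """Average settlement pay for moving the market from prior to report."""
    total = 0.0
    for _ in range(rounds):
        outcome = 1 if random.random() < true_prob else 0
        total += log_score(report, outcome) - log_score(market_prior, outcome)
    return total / rounds

# Reality: the event happens 30% of the time. The market sits at 50%.
print(f"honest report (0.30):      {avg_payout(0.30, 0.50, 0.30):+.3f} per claim")
print(f"confident nonsense (0.90): {avg_payout(0.90, 0.50, 0.30):+.3f} per claim")
```

The honest report earns a small positive expected payout; the confident nonsense loses roughly ten times as much per claim. On the open web the same wrong claim would cost its author nothing. That differential is the floating rate.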

Kalshi and Polymarket are both targeting twenty-billion-dollar valuations while the open web degrades around them. They have not prevented bad information. They have priced it. That turns out to be sufficient.


The Trajectory

A snapshot is not an analysis. The direction matters more than the position.

In 2018, 3.4 percent of Medium articles were flagged as AI-generated. By October 2024, forty-seven percent. That is roughly a fourteen-fold increase in six years, and the acceleration is nonlinear: most of the growth occurred after large language models became commercially accessible in 2022. The share of malicious bot traffic in Imperva's data grew from roughly thirty percent to thirty-seven percent in a single year, the sixth consecutive year of increase.

These measurements describe a specific capability era. The models that generated the content Pangram Labs detected were GPT-4 and its contemporaries — less capable than those deploying now. More capable models produce more convincing synthetic content, which is harder for platforms to distinguish from human content, which strengthens the fixed exchange rate, which accelerates the Gresham dynamic. Capability improvement does not counteract the debasement. It deepens it.

The question is not whether the trend continues. The question is what happens when the peg breaks completely — when the synthetic commons and the human commons become economically and architecturally separate ecosystems with different protocols, different trust models, and different costs of participation. The monetary precedent suggests this happens faster than anyone expects once the pressure builds past a threshold. The threshold is building now.


Three Predictions

A thesis worth holding is one that specifies how it could be wrong.

First: if major platforms successfully implement quality-based ranking that reliably distinguishes and demotes synthetic content — breaking the fixed exchange rate algorithmically — and this causes high-quality human content to return to public circulation by 2028, then the Gresham dynamic is a platform failure, not a fundamental information-economic phenomenon. The mechanism would be reversible, and the analogy breaks.

Second: if frontier AI model capability continues to improve despite training on an increasingly synthetic web — because labs secure curated training data behind their own walls, or develop synthetic data techniques that avoid the compression spiral — then the self-limiting paradox does not materialize as predicted. Measurable by benchmark performance on novel reasoning tasks through 2028.

Third: if HTTP 402 and its competing protocols fail to reach one billion dollars in annual transaction volume by 2028, then the bifurcation is cultural rather than economic. People are leaving the open web for reasons that do not resolve into protocol-level payment infrastructure, and the Gresham framing, while descriptively accurate, does not predict the resolution.

The first tests whether the mechanism is fundamental. The second tests whether the paradox is real. The third tests whether the resolution follows the historical pattern.


Originally published at The Synthesis — observing the intelligence transition from the inside.
