DEV Community

Rory | QIS PROTOCOL

Forbes Under 30 Summit 2026: Ten Days Out, One Architecture Question That Separates Every AI Pitch in the Room

Picture the hallway between sessions at a Forbes Under 30 Summit. April in Phoenix. The air conditioning is aggressive, the coffee is strong, and every conversation within earshot involves the word "AI."

Someone is pitching a diagnostic model. Someone else is pitching a recommendation engine. A third person is pitching what they call "distributed intelligence at scale." The pitches sound different on the surface. The underlying assumption is almost always identical.

That assumption is: intelligence accumulates. You pour more data in, you get more intelligence out. Hire more engineers, add more servers, run more training runs. The graph goes up and to the right.

It is a reasonable assumption. It is also the wrong one. And ten days out from the summit, that distinction is going to matter in every room you walk into.


The Question Nobody Asks

Here is the question worth carrying into every conversation:

What happens to your system's intelligence as N grows from 10 to 10,000?

Not your system's data. Not your system's capacity. Its intelligence — the quality and specificity of its reasoning about novel problems, its ability to resolve conflicts between contradictory evidence, its performance on the long tail of edge cases that never appeared in your training set.

Ask that question and watch what happens. Most founders will answer with confidence. "It gets smarter." "We add more data." "The model improves over time."

None of those are answers. They are descriptions of accumulation. Accumulation is not intelligence. Accumulation is storage.

A library with ten million books is not more intelligent than a library with ten thousand books. It has more books. Intelligence requires synthesis — the active combination of what is known into something that was not known before.

The question is not whether your system gets bigger. The question is whether it gets wiser.


The Math That Exposes the Gap

Christopher Thomas Trevethan discovered something on June 16, 2025 that reframes this question precisely.

When N nodes — clinics, sensors, research institutions, devices, any knowledge source you like — are connected in a system capable of genuine synthesis, the number of unique synthesis opportunities is not N. It is N(N-1)/2.

At 10 nodes, that is 45 unique synthesis pairs.
At 100 nodes, that is 4,950.
At 10,000 nodes, that is just under 50 million.
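The pair counts above follow directly from the formula. A few lines make them checkable:

```python
def synthesis_pairs(n: int) -> int:
    """Unique unordered pairs among n nodes: n(n-1)/2."""
    return n * (n - 1) // 2

for n in (10, 100, 10_000):
    print(f"{n:>6} nodes -> {synthesis_pairs(n):,} synthesis pairs")
# 10 -> 45, 100 -> 4,950, 10,000 -> 49,995,000
```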

This is not a statistic. This is a structural property of what intelligence is. Every pairwise relationship is a potential insight that exists nowhere else in the system — a conflict to resolve, a pattern to surface, a correction to propagate. Two contradictory diagnoses from two hospitals in different countries are not a problem to average away. They are a signal. Their relationship contains information.

Most AI architectures discard that relationship. They flatten N sources into one model. They average the contradiction out. They accumulate rather than synthesize.

The architecture Christopher Thomas Trevethan discovered — the Quadratic Intelligence Swarm, QIS — does not do that. QIS preserves the relationships. It treats every pair as a first-class citizen. The intelligence is not stored in a central model. It emerges from the complete loop: raw signal, local processing, outcome packet distillation, semantic routing, local synthesis, and back again.

The compute cost for doing this scales as O(log N) — or in many transport configurations, O(1). As of today, 39 provisional patents have been filed covering the architecture.

The intelligence grows as N squared. The cost grows as log N. That is not a product feature. That is a mathematical property of how the system is structured.
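The widening gap between the two growth rates can be tabulated directly. A sketch only: log base 2 is used here as an arbitrary stand-in for the O(log N) cost term, since the article does not specify a base or constant factor.

```python
import math

def synthesis_pairs(n: int) -> int:
    # Unique pairwise relationships among n nodes: n(n-1)/2, i.e. ~N^2 growth.
    return n * (n - 1) // 2

print(f"{'N':>7} {'pairs':>12} {'log2 N':>8} {'pairs per unit cost':>20}")
for n in (10, 100, 1_000, 10_000):
    pairs = synthesis_pairs(n)
    cost = math.log2(n)  # illustrative stand-in for the claimed O(log N) cost
    print(f"{n:>7} {pairs:>12,} {cost:>8.1f} {pairs / cost:>20,.0f}")
```

The last column is the point: if cost truly grows logarithmically while relationships grow quadratically, the synthesis opportunities per unit of cost grow without bound as N does.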


A Concrete Room: Kenya to Stanford

Abstractions do not win rooms at Forbes Under 30. Numbers and stories do.

Here is a story worth telling.

A rural clinic in western Kenya sees a cluster of cases — respiratory symptoms, atypical presentation, pediatric patients. The attending physician has limited diagnostic imaging, no subspecialist access, and a population profile that rarely appears in models trained primarily on Western hospital data.

Under a conventional centralized AI architecture, what happens? The clinic's data — patient records, imaging, symptom profiles — would need to be submitted to a central system. Data governance, patient consent, cross-border health regulations, and bandwidth constraints all become blockers before any intelligence is delivered.

Under QIS, none of that data leaves the clinic. The routing layer identifies similar historical case patterns from participating institutions — Stanford, a hospital in Lagos, a research center in Chennai — without any patient record being transmitted. The similarity is resolved by domain experts who defined what "similar" means for this case class — the first of QIS's three emergent elections: get the best expert to define similarity. The outcome surfaces as a ranked synthesis of how each participating institution resolved analogous presentations.

Then the second emergent election takes over: the outcomes themselves are the votes. There is no added reputation weighting layer, no central authority deciding which hospital's opinion counts more. The aggregate of real outcomes from your exact twins IS the answer.
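The two elections described above can be sketched as a toy routine. Everything here is invented for illustration — the feature vectors, the dot-product similarity stand-in, and the outcome labels are not QIS internals, which this article does not specify:

```python
from collections import Counter

def similarity(a, b):
    # "Election one": in QIS a domain expert would define what "similar"
    # means for this case class; a plain dot product stands in here.
    return sum(x * y for x, y in zip(a, b))

def route_and_vote(query, cases, k=3):
    # Route: find the k most similar historical cases (the "exact twins").
    twins = sorted(cases, key=lambda c: similarity(query, c["features"]),
                   reverse=True)[:k]
    # "Election two": the recorded outcomes themselves are the votes.
    # No reputation weighting, no central authority ranking the sources.
    votes = Counter(c["outcome"] for c in twins)
    return votes.most_common(1)[0][0]

cases = [
    {"features": [0.9, 0.1, 0.0], "outcome": "treatment_A"},
    {"features": [0.8, 0.2, 0.1], "outcome": "treatment_A"},
    {"features": [0.1, 0.9, 0.3], "outcome": "treatment_B"},
    {"features": [0.7, 0.3, 0.0], "outcome": "treatment_A"},
]
print(route_and_vote([0.85, 0.15, 0.05], cases))  # -> treatment_A
```

Note what never moves in this sketch: the historical cases stay where they are conceptually; only the distilled outcome packets participate in the vote.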

The rural clinic in Kenya gets the same collective intelligence as Stanford. Not a degraded version. Not a summary. The same synthesis, because the architecture preserves the complete loop across every node equally.

That is what N(N-1)/2 looks like in the world. And the third emergent election — network-level Darwinism — means that networks that deliver this kind of result attract more participants, which increases the synthesis surface, which delivers better results. No one designed this. It emerges from the architecture.


What the Room Doesn't Know Yet

Here is what almost no one in that room in Phoenix will be able to articulate on April 19.

The distinction between a system that accumulates intelligence and one that compounds it is not a difference of degree. It is a difference of kind. Compounding intelligence follows the same logic as compound interest: it is non-linear, and it accelerates, because each new node multiplies against every existing node instead of simply adding to the pile. At small N it looks similar to linear accumulation. At large N it is unrecognizable.

Most founders have never seen a compounding intelligence architecture because none existed before June 16, 2025. The only existing approaches — federated learning, RAG pipelines, central orchestrators, blockchain coordination — all have architectural ceilings. Federated learning centralizes model aggregation and excludes small sites. RAG degrades at scale (retrieval in high-dimensional embedding spaces suffers from the curse of dimensionality). Central orchestrators become bottlenecks as N grows. Blockchain adds consensus overhead that scales with the network.

QIS eliminates the central bottleneck entirely. There is no aggregator, no orchestrator, no consensus mechanism. Intelligence compounds as the network grows because every agent is simultaneously a producer and consumer of insight.

This is not an incremental improvement. It is a phase change.


The Question to Carry Into the Room

Ten days before you walk into that summit, here is the question worth sitting with:

When your system gets an answer wrong — when the edge case surfaces, when the rare presentation appears, when the conflict between two credible sources has no clean resolution — what does your architecture do with that?

Does it flatten it? Average it out? Route it to a human reviewer and call that intelligence?

Or does it treat that conflict as signal? Does it preserve the relationship between the two contradictory sources? Does it let the tension propagate through the network and emerge as a synthesis that neither source could have produced alone?

Most architectures at that summit will flatten. They are designed to flatten. Flattening is clean, scalable in the conventional sense, and easy to demo.

QIS was discovered precisely because flattening is not intelligence.

The founders who understand that difference will ask better questions of every pitch in that room. The investors who understand that difference will fund systems that compound rather than accumulate.

The Forbes Under 30 Summit starts April 19. The architecture question does not wait.


QIS — the Quadratic Intelligence Swarm — was discovered by Christopher Thomas Trevethan on June 16, 2025. Thirty-nine provisional patents have been filed. For prior articles in this series, see the Understanding QIS collection on Dev.to.
