The most interesting thing about how TanStack Start integrates React Server Components is not what it does with them. It is the shape of the API.
You do not write a Server Component. You do not opt into a model. You opt into a function — a programmatic primitive that takes a component, runs it through the RSC renderer, and hands you back a payload you can store, transport, and rehydrate later. RSC is not the substrate of the application. It is a tool you reach for at specific call sites, when you want a specific thing.
That framing is worth pausing on, because it tells you something about the design intent that the marketing copy does not.
What the API is really for
The use case being served is narrow and clear: caching values that the rest of the JavaScript ecosystem cannot cache. A normal cache stores JSON. JSON cannot represent a React element. JSON cannot pass through a value that resolves later. JSON cannot carry the richer plain types — Map, Set, Date, typed arrays — that show up in a real component tree. RSC's wire format can. It was designed to. That is what the protocol is.
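The losses are easy to demonstrate outside any framework. This is plain TypeScript, nothing RSC-specific: a Map, a Set, and a Date pushed through a JSON round trip, the way any ordinary cache would handle them.

```typescript
// What a JSON-backed cache does to a value with richer plain types.
const value = {
  tags: new Map([["react", 12]]),       // Map has no JSON representation
  seen: new Set(["a", "b"]),            // neither does Set
  at: new Date("2026-01-01T00:00:00Z"), // Date survives only as a string
};

const wire = JSON.stringify(value);
const roundTripped = JSON.parse(wire);

// Map and Set collapse to "{}", and the Date comes back as a plain string.
console.log(wire);
// {"tags":{},"seen":{},"at":"2026-01-01T00:00:00.000Z"}
```

Nothing on the other side of `JSON.parse` can recover what was lost; the information is gone at serialization time, which is exactly the gap the RSC wire format closes.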
So if you take the RSC renderer, strip it of everything that makes it a composition model, and expose just the serializer, you get a primitive that turns "a tree containing UI and non-JSON props" into "a string you can put in Redis." That is a real capability. It is also, on reflection, a very small one.
This is the capability TanStack Start has surfaced. Not Server Components. A serializer for component-shaped values, exposed as a programmatic API.
If the goal is caching, name it caching
"use cache" is the design that already exists for this exact problem. It is a directive. It tells the runtime: this output is cacheable, key it on these inputs, store it for this long. The runtime handles serialization, deserialization, key derivation, invalidation, and storage. The developer writes a function and adds one line.
Concretely: imagine a post card — a tree containing a server-rendered article and a client-rendered set of actions — produced once and reused across navigations. Under a directive:
```tsx
async function PostCard({ postId }: { postId: string }) {
  "use cache"
  const post = await db.posts.findById(postId)
  return (
    <article>
      <h1>{post.title}</h1>
      <p>{post.body}</p>
      <PostActions postId={post.id} authorId={post.authorId} />
    </article>
  )
}
```
The runtime knows what the cache key is (the inputs the function closes over), where the value lives, when it expires, and how to serialize it (the same wire format the rest of the system already speaks). One line of metadata and a function body. The developer writes a component.
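That division of labor can be sketched in a few lines. Everything below is illustrative — the names are invented, and a real runtime keys on more than JSON-encodable arguments — but the shape is the point: key derivation and storage live behind the boundary, and the developer supplies only the function.

```typescript
// Hypothetical sketch of the directive's contract: the runtime derives the
// cache key from the function's inputs and owns the storage; the developer
// writes only the body. All names here are invented for illustration.
function cacheByInputs<A extends unknown[], R>(
  fn: (...args: A) => R,
  store: Map<string, R> = new Map(),
) {
  return (...args: A): R => {
    // Key derivation: a stable encoding of the inputs the function depends on.
    const key = JSON.stringify(args);
    if (!store.has(key)) store.set(key, fn(...args));
    return store.get(key) as R;
  };
}

// The developer-facing surface is one wrapped function.
let renders = 0;
const renderCard = cacheByInputs((postId: string) => {
  renders += 1;
  return `<article>post ${postId}</article>`;
});
```

Calling `renderCard("1")` twice runs the body once; the second call is served from the store. The directive version of this is the same mechanism with the wrapper moved into the runtime.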
Under TanStack Start's API, the same tree is produced by driving the protocol explicitly:
```tsx
import { z } from "zod"
import { createFileRoute } from "@tanstack/react-router"
import { createServerFn } from "@tanstack/react-start"
import { createCompositeComponent } from "@tanstack/react-start/rsc"

const getPostCard = createServerFn()
  .validator(z.object({ postId: z.string() }))
  .handler(async ({ data }) => {
    const post = await db.posts.findById(data.postId)
    const src = await createCompositeComponent(
      (props: {
        renderActions?: (d: { postId: string; authorId: string }) => React.ReactNode
      }) => (
        <article>
          <h1>{post.title}</h1>
          <p>{post.body}</p>
          <footer>{props.renderActions?.({ postId: post.id, authorId: post.authorId })}</footer>
        </article>
      ),
    )
    return { src }
  })

export const Route = createFileRoute("/posts/$postId")({
  loader: ({ params }) => getPostCard({ data: { postId: params.postId } }),
  component: PostPage,
})

function PostPage() {
  const { src } = Route.useLoaderData()
  return (
    <CompositeComponent
      src={src}
      renderActions={({ postId, authorId }) => (
        <PostActions postId={postId} authorId={authorId} />
      )}
    />
  )
}
```
Six things have to agree across this code: the server function that produces the payload, the validator that gates its inputs, the createCompositeComponent call that names the seam, the loader entry on the route, the useLoaderData site that retrieves the payload, and the <CompositeComponent> site that rehydrates it — including the render-prop signature that has to match on both sides. Caching is implicit in the router's staleTime and loaderDeps. Invalidation is the router's responsibility and shaped by the route boundary, not the value boundary.
None of these decisions have a defensible non-canonical answer for the simple case. Every team using this will write the same wrapper around it, in slightly different shapes, and the bugs will all live in the relationships between the serialization site, the render-prop signature, and the rehydration site.
The case where this matters most is the one the framework does not address at all: caching a component-shaped value at request scope. The docs recommend React.cache for the request-scoped case, but React.cache deduplicates plain function results — it does not memoize trees that contain client component references and closures over server-only data. There is no primitive shaped for that. The same computation can run three times in one request because three parts of the tree need it, and the framework's answer is the route-level cache, which is route-scoped, or React.cache, which does not apply.
A directive-shaped API hides those decisions because most of them only have one correct answer. A library-shaped API exposes them because the framework refuses to commit to one.
Boundary wrappers, protocol wrappers
There is a more general problem under this. RSC, as a protocol, is a serialization format. RSC, as a model, is a composition story between two environments. When a framework lifts the protocol out of the model and exposes it as a programmable primitive, what it is really saying is: "the protocol is interesting to us, the model is not."
It would be easy to read this as a complaint about wrappers in general. It is not. A wrapper-shaped API is often the right answer, and in TanStack Start specifically it is the dominant idiom. Routes are wrappers. Loaders are wrappers. Server functions are wrappers. The whole framework is built around the pattern of pass your thing through this constructor, get back a richer thing the runtime now understands. A wrapper for RSC caching slots into that idiom cleanly.
The argument is about what a wrapper carries across the boundary it creates, because there are two kinds and they look almost identical until you start using them.
A boundary wrapper says: this is where the model changes shape. It takes a value, marks a seam the runtime will operate on, and hands the developer a richer reference back. The developer never touches the protocol underneath; they only see the seam, named and located. The wrapper carries the model with it — calling it is participating in the model.
A protocol wrapper says: here is the renderer, here is the payload, here is the deserializer. It hands the developer the wire format and asks them to drive it. The seam is gone. What is left is a set of operations on bytes that the developer has to compose themselves.
TanStack Start's other wrappers — the route, the loader, the server function — are boundary wrappers. The framework knows what the wrapped thing is, the runtime knows what to do with it, and the developer writes their code in the vocabulary of the model rather than the vocabulary of the protocol underneath. That is what makes those APIs feel coherent next to each other.
The RSC integration is the exception. The outer shape is the same — a function you call, a value you hand it, a value you get back — but there is no model on the other side. The developer is handed the renderer instead of being handed a richer reference.
The defensible version of the choice
The strongest case for what TanStack has shipped goes something like this. The framework's audience is router-driven and SPA-adjacent. Committing to the full RSC composition model would constrain the rest of the architecture in ways that conflict with how the framework already works. A library-shaped API leaves room for teams whose caching needs really do diverge. A directive forces a single shape on every caller, and that shape — "use cache" as it ships today — is itself not a settled design. Its key inference, its invalidation story, and its "what is safe to close over" semantics have known warts. Why commit to it?
The first half of that case is real. The second half is the one to push back on.
A directive does not have to mean "use cache" exactly as it ships today. The directive is a shape, not a specification. The shape is: a marker at the boundary, a runtime that infers the keying and storage, a developer who writes a function. The runtime gets to choose how aggressive the inference is, what scope the directive defaults to, and what the escape hatches look like. A request-scoped variant is easier than the application-cache variant "use cache" currently targets, not harder, because almost every decision has one canonical answer at the request scope.
The "we left it open because needs diverge" defense applies to primitives where they really do diverge. Application caches diverge — TTLs, tags, storage. Request-scoped caches do not. The needs are: dedupe within a render, survive into hydration, evaporate at request end. That is the specification. There is no second team whose answer to those questions is meaningfully different.
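That specification is small enough to sketch. What follows is an illustration of the contract, not anyone's shipped API — it uses Node's AsyncLocalStorage to give each request a cache that dedupes within the request and evaporates when the request ends; the "survive into hydration" half would sit on top of this, and all names are invented.

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// One cache per request; it becomes unreachable (and collectable) at request end.
const requestCache = new AsyncLocalStorage<Map<string, unknown>>();

function withRequestScope<T>(fn: () => T): T {
  // Every request starts from an empty cache: no cross-request leakage.
  return requestCache.run(new Map(), fn);
}

// Dedupe a computation within a single request, keyed on its argument.
function requestMemo<R>(name: string, fn: (arg: string) => R) {
  return (arg: string): R => {
    const cache = requestCache.getStore();
    if (!cache) return fn(arg); // outside any request: just compute
    const key = `${name}:${arg}`;
    if (!cache.has(key)) cache.set(key, fn(arg));
    return cache.get(key) as R;
  };
}
```

Within one `withRequestScope`, repeated calls with the same argument run the computation once; the next request starts cold. Every decision in this sketch has exactly one defensible answer, which is the argument for hiding it behind a directive.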
The honest version of the architectural choice is: "we don't want to commit to the model yet, and we don't want to ship a caching directive without committing to the model, so we shipped the renderer as a library and let users assemble what they need." That is a coherent position. It is just not the position the docs reflect.
What is actually being shipped
To make this caching primitive work, the framework has to bundle the RSC renderer, the RSC serializer, the matching deserializer, the streaming format reader, and enough of the React internals to drive all of it. That is the entire RSC machinery, intact — including the parts the framework will not let the developer use.
The framework does not have a "use client" boundary at all. Interactivity is injected through slots on <CompositeComponent> — children, render props, component props — not through client component references resolved at hydration. The reference-resolution machinery RSC was designed around, the matched identifiers across server and client builds, the client manifest, the hydration-time dispatch, all of it is bundled but inert. The framework has actively engineered an alternative path (the slot pattern) that routes around the very mechanism RSC exists to provide. The user experience of writing a TanStack Start application is the user experience of writing a router-driven app that occasionally summons the RSC serializer for caching.
So the framework pays the full cost of the protocol — every byte of the serializer, every transitive dependency, every line of React internals it has to remain compatible with — and uses it to power a feature that, on its own, could be expressed as a directive over a much smaller mechanism. The bundle includes the engine of a model the framework has chosen not to adopt.
There is a third option the design did not take: a fit-for-purpose serializer. The capability TanStack Start actually needs — turning a rendered component tree, slot placeholders and non-JSON values included, into bytes and back — is a small one. It is not the RSC protocol. RSC is that capability plus a streaming format with suspense boundaries, plus server reference dispatch, plus the entire client-resolution layer the framework has already chosen not to use. A protocol designed for the narrower job would be smaller, evolvable on the framework's own cadence, and decoupled from React's internals. That a project willing to ship its own router, its own loader system, its own server-function wrapper, and its own composite-component primitive nevertheless reached for the entire RSC implementation tells you the choice was not driven by what the feature needs.
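For scale, the narrower job fits in a page. The sketch below is not a proposal for what such a protocol should look like — it is only JSON extended with tagged encodings for the types listed earlier, to show how small the surface is next to what RSC carries (no streaming, no reference resolution, no React coupling). Element trees and slot placeholders would need tags of their own.

```typescript
// A tagged-JSON sketch of a fit-for-purpose serializer: Map, Set, and Date
// round-trip; everything else is plain JSON. Illustration only.
function encode(value: unknown): string {
  return JSON.stringify(value, function (this: Record<string, unknown>, key, val) {
    const raw = this[key]; // `val` is post-toJSON, so read the raw value for Date
    if (raw instanceof Date) return { $t: "Date", v: raw.toISOString() };
    if (raw instanceof Map) return { $t: "Map", v: [...raw] };
    if (raw instanceof Set) return { $t: "Set", v: [...raw] };
    return val;
  });
}

function decode(text: string): unknown {
  // The reviver runs bottom-up, so nested tagged values revive before their parents.
  return JSON.parse(text, (_key, val) => {
    if (val && typeof val === "object" && !Array.isArray(val) && "$t" in val) {
      if (val.$t === "Date") return new Date(val.v);
      if (val.$t === "Map") return new Map(val.v);
      if (val.$t === "Set") return new Set(val.v);
    }
    return val;
  });
}
```

A format like this is owned by the framework, versioned on its cadence, and indifferent to React's internals — which is the property the argument above is pointing at.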
The agent reading the code
There is one more lens worth applying, because in 2026 it is the lens that shapes a growing share of code: the AI coding agent.
The relevant claim is narrower than "agents will get directives right and wrappers wrong." It is structural: a directive compresses the model into the syntax; a protocol wrapper requires the model to be reconstructed at every call site, and that reconstruction is where coordination bugs compound.
An agent producing a directive-based caching feature emits a marker, a function body, and the inputs the developer already named. The keying, scope, invalidation, and lifetime are the runtime's contract — the agent does not produce them, so it cannot produce them wrong.
An agent producing a protocol-wrapper caching feature has to emit a serialization call, a storage decision, a key derivation, an invalidation hook, a deserialization call, a rehydration site, and the glue holding all of them in agreement. The shape of each piece is locally reasonable. The bug lives in the relationships — a stale key, a missed invalidation hook, a request-scoped value silently captured in the payload, a payload rehydrated at the wrong boundary. An agent reviewing its own output will not flag these, because each part on its own makes sense.
Less code is not only easier for humans. It is easier for agents — fewer tokens to generate, fewer relationships to track, fewer places to be subtly wrong. Boundary primitives compress an enormous amount of model into a tiny amount of syntax. Protocol APIs decompress the model back into surface area and ask the agent to operate it. Increasingly, an API surface that is hostile to agents is an API surface that is hostile to its own users.
What gets lost
A framework gets to choose the level of abstraction it commits to. If it commits to RSC as a model, the developer writes components and the runtime handles the seams. If it commits to caching as a model, the developer writes directives and the runtime handles serialization. If it commits to neither and exposes the protocol as a library, the developer writes glue.
Glue is the most expensive code in any application. It is the code that does not solve the problem; it only connects the things that do. It also ages worst, because every change to the surrounding ecosystem requires the glue to be rewritten while the parts on either side of it stay the same. The moment a request-scoped value becomes something the developer has to manually serialize, key, store, and revive, the framework has already lost the lifecycle it was supposed to protect.
A framework whose RSC story is a library call is asking every team that uses it to write and maintain that glue forever. The convenience is moved out of the runtime and into a thousand small repositories that all rediscover the same patterns and the same failure modes. The framework gets to ship a thinner runtime; the ecosystem absorbs the cost.
That trade is sometimes worth it — for a primitive that genuinely has no canonical shape. Caching component output is not that primitive. It has a canonical shape. The shape is a directive. The reason it is not the API is not that the shape is wrong. It is that adopting the shape would require committing to the surrounding model, and that commitment was the thing the framework was avoiding from the start.
The smaller point
Strip the protocol away and the design becomes legible. The team wanted a way to cache values that JSON could not represent. They had access to a serializer that could. They exposed the serializer. They called it RSC support.
It is RSC support in the sense that the renderer is in the bundle. It is not RSC support in the sense that the developer is asked to think in the RSC model, write in its idioms, or benefit from its composition story. The renderer is a load-bearing dependency for a feature that is not, itself, the model the renderer was designed to power.
When a framework treats a model as a primitive instead of as the model, it is telling you that it wanted something the model happened to make possible, not the model itself. There is nothing wrong with that as a product decision — but it should be named accurately. The honest version of this feature is a caching directive. The honest version of the framing is "we ship a serializer-backed cache." Everything else is a story told in the vocabulary of a model the framework chose not to adopt.