Atlas Whoff

tRPC v11 + Next.js App Router: End-to-End Type Safety Without the Boilerplate

If you've built a Next.js app with a REST API, you know the pain: you define a route handler, write a fetch call on the client, manually type the response, and pray nothing drifts. tRPC v11 with the Next.js App Router eliminates all of it.

This isn't a toy setup. I've been running tRPC v11 in a production SaaS with 50k+ monthly requests. Here's what actually works.

Why tRPC v11 Changes Things

tRPC v11 ships with first-class support for React Server Components and the App Router. The major shifts:

  • useSuspenseQuery is now the default recommendation over useQuery — cleaner loading states
  • Server-side caller lets you call tRPC procedures directly in Server Components with zero HTTP overhead
  • Streaming support via httpBatchStreamLink — procedures resolve as they complete, not all-or-nothing
  • FormData support — call mutations directly from <form action={}> without a client component
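As a rough sketch of that last point, here is one way the FormData flow can look: a Next.js server action that reads the form fields and forwards them to a tRPC caller. The `createPost` action name and the field names are illustrative, not part of the tRPC API, and this assumes the router and context defined later in this post.

```typescript
// app/posts/actions.ts — hypothetical server action; adapt names to your router
'use server'

import { createCallerFactory } from '@trpc/server'
import { appRouter } from '@/lib/trpc/root'
import { createTRPCContext } from '@/lib/trpc/init'

const createCaller = createCallerFactory(appRouter)

// Used as <form action={createPost}> in a Server Component — no client JS needed
export async function createPost(formData: FormData) {
  const caller = createCaller(await createTRPCContext())
  await caller.posts.create({
    title: String(formData.get('title') ?? ''),
    body: String(formData.get('body') ?? ''),
  })
}
```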

Setup in 10 Minutes

1. Install

npm install @trpc/server@next @trpc/client@next @trpc/react-query@next @tanstack/react-query

2. Create your tRPC instance

// lib/trpc/init.ts
import { initTRPC, TRPCError } from '@trpc/server'
import { cache } from 'react'
import { auth } from '@/lib/auth'

export const createTRPCContext = cache(async () => {
  const session = await auth()
  return { userId: session?.user?.id ?? null }
})

// Context type is the *resolved* return value, not the factory function itself
const t = initTRPC.context<Awaited<ReturnType<typeof createTRPCContext>>>().create()

export const router = t.router
export const publicProcedure = t.procedure
export const protectedProcedure = t.procedure.use(({ ctx, next }) => {
  if (!ctx.userId) throw new TRPCError({ code: 'UNAUTHORIZED' })
  return next({ ctx: { userId: ctx.userId } })
})

3. Define your router

// lib/trpc/routers/posts.ts
import { z } from 'zod'
import { router, protectedProcedure, publicProcedure } from '../init'
import { db } from '@/lib/db'

export const postsRouter = router({
  list: publicProcedure
    .input(z.object({ cursor: z.string().optional(), limit: z.number().default(20) }))
    .query(async ({ input }) => {
      const posts = await db.post.findMany({
        take: input.limit + 1,
        cursor: input.cursor ? { id: input.cursor } : undefined,
        orderBy: { createdAt: 'desc' },
      })
      const hasMore = posts.length > input.limit
      // The extra record's id is the next cursor (Prisma cursors are inclusive),
      // not the last item of the current page — that would repeat it
      const nextCursor = hasMore ? posts[input.limit].id : null
      return { posts: posts.slice(0, input.limit), hasMore, nextCursor }
    }),

  create: protectedProcedure
    .input(z.object({ title: z.string().min(1).max(200), body: z.string().min(1) }))
    .mutation(async ({ ctx, input }) => {
      return db.post.create({ data: { ...input, authorId: ctx.userId } })
    }),
})

export type PostsRouter = typeof postsRouter

4. Wire up the App Router handler

// app/api/trpc/[trpc]/route.ts
import { fetchRequestHandler } from '@trpc/server/adapters/fetch'
import { appRouter } from '@/lib/trpc/root'
import { createTRPCContext } from '@/lib/trpc/init'

const handler = (req: Request) =>
  fetchRequestHandler({
    endpoint: '/api/trpc',
    req,
    router: appRouter,
    createContext: createTRPCContext,
  })

export { handler as GET, handler as POST }

5. Server Component caller (zero HTTP overhead)

// app/posts/page.tsx
import { createCallerFactory } from '@trpc/server'
import { appRouter } from '@/lib/trpc/root'
import { createTRPCContext } from '@/lib/trpc/init'

const createCaller = createCallerFactory(appRouter)

export default async function PostsPage() {
  const ctx = await createTRPCContext()
  const caller = createCaller(ctx)

  // Direct procedure call — no fetch, no serialization overhead
  const { posts } = await caller.posts.list({ limit: 10 })

  return <PostList posts={posts} />
}

6. Client component with streaming

// lib/trpc/client.tsx
'use client'

import { useState } from 'react'
import { QueryClient, QueryClientProvider } from '@tanstack/react-query'
import { createTRPCReact } from '@trpc/react-query'
import { httpBatchStreamLink } from '@trpc/client'
import type { AppRouter } from './root'

export const trpc = createTRPCReact<AppRouter>()

export function TRPCProvider({ children }: { children: React.ReactNode }) {
  const [queryClient] = useState(() => new QueryClient())
  const [trpcClient] = useState(() =>
    trpc.createClient({
      links: [
        httpBatchStreamLink({
          url: '/api/trpc',
          // Responses stream in as each procedure resolves
        }),
      ],
    })
  )
  return (
    <trpc.Provider client={trpcClient} queryClient={queryClient}>
      <QueryClientProvider client={queryClient}>{children}</QueryClientProvider>
    </trpc.Provider>
  )
}

The Pattern That Actually Matters

The Server Component caller is the killer feature most tutorials miss. When you fetch data in a Server Component using the direct caller:

  • No network round-trip
  • No serialization overhead
  • Full TypeScript inference still works
  • Auth context is shared (you called createTRPCContext once)

For mutations that need optimistic updates, use the client. For initial page loads, use the server caller. That split covers 90% of cases cleanly.
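For the client side of that split, an optimistic update looks like the standard TanStack Query pattern through tRPC's `useUtils` helpers. This is a sketch only: it assumes the `trpc` client and `posts` router from this post, and the optimistic item is simplified (a real one needs whatever fields your `list` output type requires).

```typescript
// Hypothetical client component; types of the optimistic item are simplified
'use client'

import { trpc } from '@/lib/trpc/client'

export function NewPostButton() {
  const utils = trpc.useUtils()
  const input = { limit: 10 }

  const create = trpc.posts.create.useMutation({
    onMutate: async (newPost) => {
      // Cancel in-flight fetches so they don't clobber the optimistic state
      await utils.posts.list.cancel(input)
      const previous = utils.posts.list.getData(input)
      utils.posts.list.setData(input, (old) =>
        old ? { ...old, posts: [{ id: 'optimistic', ...newPost }, ...old.posts] } : old
      )
      return { previous }
    },
    // Roll back on failure, then refetch either way
    onError: (_err, _vars, ctx) => utils.posts.list.setData(input, ctx?.previous),
    onSettled: () => utils.posts.list.invalidate(),
  })

  return (
    <button onClick={() => create.mutate({ title: 'Hi', body: 'Hello' })}>
      New post
    </button>
  )
}
```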

Common Mistakes

Using useQuery when useSuspenseQuery is cleaner. With the App Router, you're already in a Suspense boundary — lean into it.
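A minimal sketch of what that looks like, assuming the `trpc` client from earlier (note `useSuspenseQuery` returns a `[data, query]` tuple, so `data` is never `undefined`):

```typescript
'use client'

import { trpc } from '@/lib/trpc/client'

export function PostFeed() {
  // No isLoading branch needed: the surrounding <Suspense> shows the fallback
  const [data] = trpc.posts.list.useSuspenseQuery({ limit: 10 })

  return (
    <ul>
      {data.posts.map((p) => <li key={p.id}>{p.title}</li>)}
    </ul>
  )
}
```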

Recreating the caller factory on every render. The cache() wrapper on createTRPCContext already gives you one context per request, but createCallerFactory(appRouter) should run once at module scope, outside any component or request handler.

Not setting superjson as the transformer. Without it, Date objects become strings across the wire. On the server, pass transformer: superjson to initTRPC's create(); on the client, v11 moved the transformer onto the link, so pass it to httpBatchStreamLink rather than to createClient.
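Concretely, the two fragments to change look roughly like this (assuming the v11 per-link transformer API and the files defined earlier in this post):

```typescript
// lib/trpc/init.ts — server side
import superjson from 'superjson'

const t = initTRPC
  .context<Awaited<ReturnType<typeof createTRPCContext>>>()
  .create({ transformer: superjson })

// lib/trpc/client.tsx — client side: transformer now lives on the link
httpBatchStreamLink({
  url: '/api/trpc',
  transformer: superjson,
})
```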

Performance in Production

With httpBatchStreamLink, multiple useSuspenseQuery calls in the same render batch into one HTTP request but stream responses individually. In practice this means:

  • Sidebar data (fast) renders before main content (slow)
  • No waiting for the slowest procedure before anything renders
  • One network connection, not N parallel fetches
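The shape that produces this behavior is one Suspense boundary per data region, each with its own suspense query. A rough sketch — the `tags.list` procedure here is hypothetical, standing in for any fast sidebar query:

```typescript
'use client'

import { Suspense } from 'react'
import { trpc } from '@/lib/trpc/client'

function Sidebar() {
  const [tags] = trpc.tags.list.useSuspenseQuery() // fast procedure
  return <nav>{tags.map((t) => <a key={t.id}>{t.name}</a>)}</nav>
}

function Feed() {
  const [data] = trpc.posts.list.useSuspenseQuery({ limit: 20 }) // slow procedure
  return <main>{data.posts.map((p) => <article key={p.id}>{p.title}</article>)}</main>
}

// Both queries batch into one HTTP request, but each boundary resolves
// independently as its procedure streams back — the sidebar never waits on the feed
export function Dashboard() {
  return (
    <>
      <Suspense fallback={<p>Loading sidebar…</p>}><Sidebar /></Suspense>
      <Suspense fallback={<p>Loading feed…</p>}><Feed /></Suspense>
    </>
  )
}
```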

For a dashboard with 4-5 different data regions, this dropped our LCP by ~400ms.


Shipping a TypeScript SaaS? The AI SaaS Starter Kit includes tRPC, Drizzle, Stripe, and Claude pre-wired — skip the boilerplate and ship in days.
