Tilak Raj

Next.js + Supabase + OpenAI. The exact stack I use to ship AI SaaS in 30 days

I have shipped 8 production AI SaaS products using this stack. This is not a beginner tutorial. This is the production architecture I use after learning what fails at scale.

Why this stack

Next.js App Router
Server components remove data fetching complexity. API routes stay close to features. TypeScript everywhere. Simple Vercel deployment.

Supabase
PostgreSQL. Auth. Storage. Realtime. Row Level Security. One platform instead of five services.

OpenAI
Reliable API. Structured outputs. Function calling. I also use Claude and other models but OpenAI patterns remain my baseline.

The main reason is familiarity. I know the failure points and scaling limits. That removes decision overhead.

Project structure

my-ai-saas/

app/
 (auth)/
   login/page.tsx
   signup/page.tsx

 (dashboard)/
   layout.tsx
   page.tsx

 [feature]/
   page.tsx
   _components/

 api/
   ai/
     generate/route.ts
     stream/route.ts

   webhooks/
     stripe/route.ts

lib/
 supabase/
   client.ts
   server.ts

 ai/
   client.ts
   prompts/

types/
 database.types.ts
 api.types.ts

supabase/
 migrations/
 seed.sql

Supabase patterns that matter

Row level security from day one

Every table that holds user data gets RLS enabled before the first production row is written.

ALTER TABLE documents ENABLE ROW LEVEL SECURITY;

CREATE POLICY users_select_own_documents
ON documents
FOR SELECT
USING (auth.uid() = user_id);

This prevents future security incidents.
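With the policy in place, application code never filters by user_id by hand. A sketch of what a query looks like, assuming the server client helper in lib/supabase/server.ts (listMyDocuments is a hypothetical name for illustration):

```typescript
// Sketch: with RLS enabled, no manual user_id filter is needed.
import { createClient } from '@/lib/supabase/server'

export async function listMyDocuments() {
  const supabase = await createClient()

  // The SELECT policy restricts rows to auth.uid() = user_id,
  // so this returns only the signed-in user's documents.
  const { data, error } = await supabase.from('documents').select('*')
  if (error) throw error
  return data
}
```

The payoff: forgetting a filter in application code is no longer a data leak, because the database enforces it.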

Server client pattern

Different Supabase clients for server and browser.

import { cookies } from 'next/headers'
import { createServerClient } from '@supabase/ssr'

export async function createClient() {

 const cookieStore = await cookies()

 return createServerClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
  {
   cookies: {
    getAll() {
     return cookieStore.getAll()
    },
    setAll(cookiesToSet) {
     try {
      cookiesToSet.forEach(({ name, value, options }) =>
       cookieStore.set(name, value, options)
      )
     } catch {
      // Called from a Server Component; session refresh
      // is handled in middleware instead.
     }
    }
   }
  }
 )
}
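The browser counterpart is much simpler. A sketch of what lib/supabase/client.ts can look like with @supabase/ssr:

```typescript
// Browser client: reads auth state from cookies set by the server.
import { createBrowserClient } from '@supabase/ssr'

export function createClient() {
  return createBrowserClient(
    process.env.NEXT_PUBLIC_SUPABASE_URL!,
    process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
  )
}
```

Keeping the two in separate files makes it hard to accidentally import the server client into client components.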

OpenAI patterns I always use

Structured outputs

Never parse AI text output manually. Always validate against a schema.

import { z } from 'zod'

const ClaimDataSchema = z.object({
 claimant_name: z.string(),
 policy_number: z.string(),
 date_of_loss: z.string(),
 description: z.string()
})

This removed parsing bugs across products.
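A sketch of how the schema plugs into the OpenAI SDK's structured-output helper (zodResponseFormat ships with the openai package from v4.55; extractPrompt and the model name are placeholders for your own):

```typescript
import OpenAI from 'openai'
import { zodResponseFormat } from 'openai/helpers/zod'

const openai = new OpenAI()

const completion = await openai.beta.chat.completions.parse({
  model: 'gpt-4o-2024-08-06',
  messages: [{ role: 'user', content: extractPrompt }],
  // The SDK validates the response against the zod schema.
  response_format: zodResponseFormat(ClaimDataSchema, 'claim_data'),
})

// parsed is typed and already schema-validated; null on refusal.
const claim = completion.choices[0].message.parsed
```

The model is constrained to the schema at generation time, and the SDK re-validates on the way out, so malformed JSON never reaches your database.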

Streaming responses

If generation takes more than 2 seconds I stream.

Users prefer progressive output instead of waiting.
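A minimal sketch of what app/api/ai/stream/route.ts can look like, forwarding OpenAI deltas through a ReadableStream (model name and request payload shape are assumptions):

```typescript
import OpenAI from 'openai'

const openai = new OpenAI()

export async function POST(req: Request) {
  const { prompt } = await req.json()

  const stream = await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: prompt }],
    stream: true,
  })

  const encoder = new TextEncoder()
  const body = new ReadableStream({
    async start(controller) {
      // Forward each token delta to the client as it arrives.
      for await (const chunk of stream) {
        const text = chunk.choices[0]?.delta?.content ?? ''
        if (text) controller.enqueue(encoder.encode(text))
      }
      controller.close()
    },
  })

  return new Response(body, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  })
}
```

On the client, read the response body with a reader and append chunks to state as they land.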

Authentication flow

Supabase middleware protects dashboard routes.

if (!user && request.nextUrl.pathname.startsWith('/dashboard')) {

 return NextResponse.redirect(
  new URL('/login', request.url)
 )

}

Simple and repeatable across products.

Production checklist before launch

Before every launch I verify:

RLS enabled on every table.
Environment variables secured.
React error boundaries added.
AI output validation enabled.
Rate limiting on AI routes.
Logging for prompts and tokens.
Database indexes on foreign keys.
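For the rate-limiting item, a minimal in-memory sketch of a hypothetical checkRateLimit helper (production would use Redis or Upstash so limits survive restarts and span serverless instances):

```typescript
// Fixed-window limiter: allow `limit` calls per user per window.
const windows = new Map<string, { count: number; resetAt: number }>()

export function checkRateLimit(
  userId: string,
  limit = 20,
  windowMs = 60_000
): boolean {
  const now = Date.now()
  const entry = windows.get(userId)

  // No entry yet, or the window expired: start a fresh window.
  if (!entry || now >= entry.resetAt) {
    windows.set(userId, { count: 1, resetAt: now + windowMs })
    return true
  }

  if (entry.count >= limit) return false
  entry.count += 1
  return true
}
```

Call it at the top of each AI route handler and return a 429 when it comes back false.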

Key lesson

Speed comes from repeatable architecture. Not from chasing new tools.

Using the same stack across products lets me ship faster and with fewer mistakes.

About me

Tilak Raj
Founder and CEO of Brainfy AI
Building vertical AI SaaS across compliance, real estate, agriculture, and aviation.

Website
https://www.tilakraj.info

Projects
https://www.tilakraj.info/projects
