🚀 How I Optimized the Backend of My Feedback SaaS
In this blog, I’ll break down how I built my feedback collection app — but with a focus on backend architecture and performance improvements.
⚠️ Heads up: I won’t be diving into Next.js internals or frontend details. My focus lies in backend systems, data design, and the tradeoffs I made. Let's get into it.
🧰 What Did I Build?
At its core, my app works like Google Forms — but for targeted user feedback.
- You upload a list of emails in a CSV.
- Tag them by category (e.g., Developer, User, Project Manager).
- Create a topic like “Alpha Testing Feedback”.
- Select who should get that feedback request.
- They receive a magic link that opens the form with the topic and context.
- They submit feedback. Done ✅
It’s simple on the surface, but has some interesting backend challenges.
🧠 Schema Design and Structure
Here’s the simplified schema:
📌 Observation: The upper half of the schema is tightly coupled to the workspace, which represents the core business logic.
Things like payments or quota tracking are intentionally separated. This makes the code more modular and easier to scale.
❗ Challenge 1: One Person, Multiple Roles
In real teams, people wear multiple hats — someone might be both a Developer and a Project Manager.
I didn’t want to duplicate email entries across categories — that would lead to:
- Redundant data
- Ghost updates
- Harder data consistency
Instead, I introduced a pivot table to manage many-to-many relations between emails and roles.
That’s when things got tricky — on every new upload, I now had to:
- Check if an email already exists.
- Update its category associations without duplication.
Prisma schema:
model Category {
  id           String                 @id @default(cuid())
  name         String
  workspaceId  String
  workspace    Workspace              @relation(fields: [workspaceId], references: [id], onDelete: Cascade)
  emailEntries CategoryToEmailEntry[] // one category can be associated with multiple pivot entries
  ...                                 // one-to-many: Category -> pivot table
  @@unique([workspaceId, name])
}

model EmailEntry {
  id          String                 @id @default(cuid())
  email       String
  name        String?
  workspaceId String
  workspace   Workspace              @relation(fields: [workspaceId], references: [id], onDelete: Cascade)
  categories  CategoryToEmailEntry[] // likewise, one email can be associated with multiple pivot entries
  ...                                // one-to-many: EmailEntry -> pivot table
  @@unique([workspaceId, email])
}

// Pivot table
model CategoryToEmailEntry {
  categoryId   String // holds only the associations; nothing more complicated needed
  emailEntryId String
  ...
  @@id([categoryId, emailEntryId])
}
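To make the pivot design concrete, here's a plain-TypeScript sketch with toy data (the names and ids are hypothetical, not from the app) showing how one email resolves to multiple roles through the pivot rows:

```typescript
interface Category { id: string; name: string; }
interface EmailEntry { id: string; email: string; }
interface CategoryToEmailEntry { categoryId: string; emailEntryId: string; }

const categories: Category[] = [
  { id: "c1", name: "Developer" },
  { id: "c2", name: "Project Manager" },
];
const entries: EmailEntry[] = [{ id: "e1", email: "sam@example.com" }];

// One pivot row per (category, email) pair; the same email can appear
// under several categories without duplicating the EmailEntry itself.
const pivot: CategoryToEmailEntry[] = [
  { categoryId: "c1", emailEntryId: "e1" },
  { categoryId: "c2", emailEntryId: "e1" },
];

// Resolve an email's categories by walking the pivot rows.
const categoryById = new Map(categories.map((c) => [c.id, c]));
function categoriesForEmail(emailEntryId: string): string[] {
  return pivot
    .filter((p) => p.emailEntryId === emailEntryId)
    .map((p) => categoryById.get(p.categoryId)!.name);
}

console.log(categoriesForEmail("e1"));
```

Deleting either side cascades through the pivot's composite key, so there's never a dangling association.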
🧩 Email Association Logic (CSV Upload)
✅ Problem:
When adding emails to a workspace via CSV upload, each email falls into one of three cases:
- New email: Never used and never associated with any category.
- Already associated: Email is already linked to the given category.
- Used elsewhere: Email exists but is associated with a different category.
🛠️ Solution:
- Case 1: Insert the email into the database and link it to the category via the pivot table.
- Case 2: Skip — no further action needed.
- Case 3: Identify and update the pivot table to associate the email with the new category.
⚙️ Optimizations:
- Prisma's `$transaction` ensures atomic operations (either all succeed or all fail).
- A Map replaces the O(n²) nested filtering with O(1) average-time lookups, so the whole pass is roughly O(n).
- Edge case: repeated uploads or attempts to re-associate the same email are handled gracefully.
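As a standalone sketch (toy data, not the app's real entries), here's the difference between the nested-filter approach and the Map-based lookup:

```typescript
// Deduplicating incoming emails against existing entries.
// The naive approach scans `existing` once per incoming email: O(n * m).
// Building a Map first makes each lookup O(1) on average, so the whole
// pass is O(n + m).

interface Entry {
  id: string;
  email: string;
}

const existing: Entry[] = [
  { id: "1", email: "a@example.com" },
  { id: "2", email: "b@example.com" },
];

const incoming = ["a@example.com", "c@example.com", "c@example.com"];

// Naive: every .some() walks the whole existing array.
const newNaive = incoming.filter(
  (email) => !existing.some((e) => e.email === email)
);

// Optimized: one pass to build the Map, then constant-time lookups.
// The Set also removes duplicates within the upload itself.
const existingMap = new Map(existing.map((e) => [e.email, e]));
const newFast = [...new Set(incoming)].filter((email) => !existingMap.has(email));

console.log(newFast); // emails not yet in the database
```

Note the naive version also lets in-file duplicates through, which is exactly the kind of repeat the edge-case handling needs to guard against.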
Code:
import { NextRequest, NextResponse } from "next/server";
import { z } from "zod";
import { parse } from "papaparse"; // CSV parser (assumed; any header-aware CSV parser works)
import { prisma } from "@/lib/prisma";
import type { EmailEntry } from "@prisma/client";

const emailSchema = z.object({
email: z.string().trim().email("Invalid email provided")
})
type emailRow = z.infer<typeof emailSchema>
interface CsvRow {
email?: string; // CSV might have missing or empty values
}
export async function POST(req: NextRequest) {
try {
const formData = await req.formData();
const workspaceId = formData.get("workspaceId");
const categoryId = formData.get("categoryId");
const csvFile = formData.get("csvFile");
// we take the data as a form
if (!workspaceId || !categoryId) {
return NextResponse.json({ success: false, message: "workspaceId or categoryId is missing" }, { status: 400 });
}
if (!csvFile || !(csvFile instanceof Blob)) {
return NextResponse.json({ success: false, message: "CSV file missing" }, { status: 400 });
}
// verified if files and other things exist
const text = await csvFile.text();
const result = parse(text, {
delimiter: ",",
header: true, // assumes first row is column headers like "email"
skipEmptyLines: true,
});
// parsing the emails csv file
// the core optimisations live inside the Prisma transaction below
if (result.errors.length > 0) {
console.error(" CSV Parse errors:", result.errors);
return NextResponse.json({ success: false, message: "CSV parsing failed" }, { status: 400 });
}
const emailData: CsvRow[] = result.data as CsvRow[];
const validRows: emailRow[] = [];
const invalidRows: { row: CsvRow; error: string }[] = [];
for (const row of emailData) {
const parsed = emailSchema.safeParse({
email: row.email?.trim()
})
if (parsed.success) {
validRows.push(parsed.data)
}
else {
invalidRows.push({ row, error: parsed.error.message })
}
}
const emails = Array.from(new Set(validRows.map((r) => r.email)))
if (!emails.length) {
return NextResponse.json({ success: false, message: "no unique/valid email" }, { status: 400 })
}
// So far:
// - used Zod to validate each email
// - valid rows are kept in the validRows array
// - rows that fail validation go into invalidRows along with the reason
// Now that we have the data, let's look at the optimisations.
const resultTx = await prisma.$transaction(async (tx) => {
const getAllExistingEmails:EmailEntry[] = await tx.emailEntry.findMany({
where:{
email:{in: emails},
workspaceId:workspaceId as string
}
})
// check which of these emails already exist in the database, to prevent duplicates
const existingEmailMap = new Map(
getAllExistingEmails.map((entry)=>[entry.email, entry])
)
// build a Map for constant-time lookups (first optimisation: no nested filtering)
const newEmails = emails.filter((email)=>(!existingEmailMap.has(email)))
// anything not present in the map is a brand-new email
// now add all the new emails to database
const addNewEmails = await Promise.all(
newEmails.map((data)=>(
tx.emailEntry.create({
data:{
email:data,
workspaceId:workspaceId as string
}
})
))
)
// now link new email with category and add it to the pivot table
await tx.categoryToEmailEntry.createMany({
data:addNewEmails.map((data)=>(
{
emailEntryId:data.id,
categoryId:categoryId as string
}
))
})
// Next, handle emails that already exist in the workspace.
// Each is either already linked to the given category (case 2)
// or linked to a different one (case 3); case 3 still needs a new pivot row.
const linkedEntriesOnly = getAllExistingEmails
// first, find pivot rows that already link these emails to the given category (case 2)
const existingPivotLinks = await tx.categoryToEmailEntry.findMany({
where:{emailEntryId:{in: linkedEntriesOnly.map((e)=>e.id)},
categoryId:categoryId as string}
})
const existingPivotLinksMap = new Map(
existingPivotLinks.map((value)=>[value.emailEntryId, value])
)
// existingPivotLinksMap holds the emails already linked to this category (case 2)
// filter those out to get emails not yet linked to the current category (case 3)
const notLinkedData = getAllExistingEmails.filter((emailData)=>(!existingPivotLinksMap.has(emailData.id)))
// finally, link the remaining emails to the current category (they were linked to some other category)
if(notLinkedData.length > 0){
await tx.categoryToEmailEntry.createMany({
data:notLinkedData.map((data)=>({
emailEntryId:data.id,
categoryId:categoryId as string
}))
})
}
return [...addNewEmails, ...notLinkedData]
})
return NextResponse.json({
success: true,
added: resultTx.length,
skipped: invalidRows.length,
invalid: invalidRows, // if you want to show it in frontend
});
} catch (error) {
console.error("error happened while adding emails", error)
return NextResponse.json({ success: false, message: "error while adding emails" }, { status: 500 })
}
}
📩 Invite Email Logic
🎯 Problem:
You want to:
- Avoid inviting the same email twice to a topic.
- Prevent abuse — no spammy repeated invites.
- Limit unnecessary costs — SMTP isn’t free 💸.
- Track email usage — this system uses `emailQuota` tracking, which is explained separately.
🧠 Solution:
- Check if an invite was already sent for that `emailId + topicId`.
- Store the status (`SENT` or `FAILED`) in the DB.
- Update the user's `emailUsage` count only for successful sends.
- Use `Promise.allSettled` to:
  - Send emails in parallel.
  - Ensure that one failed email doesn't stop others — unlike `Promise.all`.
  - Handle batch jobs where partial success is okay.
  - Track both `fulfilled` and `rejected` results — so you can log failures, retry, or inform the user.
import { prisma } from "../../../lib/prisma";
import { NextResponse, NextRequest } from "next/server";
import crypto from "crypto";
import { validateQuota } from "@/lib/emailQuota";
import { sendInviteEmail } from "@/lib/sendInvite";
type InviteResult = {
status: "SENT" | "FAILED";
emailEntryId: string;
token: string;
errorMessage: string | null;
};
// type of response to track success/failure of a promise.
export async function POST(req: NextRequest) {
try {
const formData = await req.formData();
const emailIds = formData.getAll("emailIds") as string[];
const topicId = formData.get("topicId") as string;
const workspaceId = formData.get("workspaceId") as string;
if (!emailIds?.length || !topicId || !workspaceId) {
return NextResponse.json({ success: false, message: "Invalid form data" }, { status: 400 });
}
const quotaError = await validateQuota(workspaceId);
if (quotaError) return quotaError;
// everything up to here is pretty straightforward, I hope
// now we check how many emailIds were already used for invite
const existing = await prisma.invitation.findMany({
where: { emailEntryId: { in: emailIds }, topicId },
select: { emailEntryId: true },
});
// build a Set of already-invited emailEntryIds (the database-generated ids, not the raw addresses)
const alreadyInvited = new Set(existing.map((e) => e.emailEntryId));
// this gives us new ids to invite
const toInvite = emailIds.filter((id) => !alreadyInvited.has(id));
// if nothing is new, report that all invitations were already sent
if (!toInvite.length) {
return NextResponse.json({ success: false, message: "All invites already sent" });
}
// fetch the actual email addresses for the ids that still need an invite
const emailEntries = await prisma.emailEntry.findMany({
where: { id: { in: toInvite } },
});
// fetch the topic the invites are for
const topic = await prisma.topic.findUnique({
where: { id: topicId },
select: { title: true, description: true },
});
if (!topic) {
return NextResponse.json({ success: false, message: "Topic not found" });
}
// Promise.allSettled lets the sends run concurrently and, unlike Promise.all,
// keeps going even when individual sends fail — see the explanation above
const result = await Promise.allSettled(
emailEntries.map((entry)=>{
const token = crypto.randomUUID() // this token will be used to validate invitation in future
const inviteLink = `${process.env.DOMAIN}/invite/?token=${token}`
return sendInviteEmail({
to:entry.email,
inviteLink,
topicDescription:topic.description,
topicTitle:topic.title
}).then<InviteResult>(()=>({ // wrap the outcome as an InviteResult so success/failure can be filtered later
status:'SENT' as const,
emailEntryId:entry.id,
token,
errorMessage:null
}))
.catch<InviteResult>((error)=>({
status:'FAILED' as const,
emailEntryId:entry.id,
token,
errorMessage:error?.message || "Unknown error"
}))
})
)
// partition the settled results: every promise fulfills here, because the
// .then/.catch wrappers convert failures into FAILED values, so we filter
// on our own status field rather than on "rejected"
const successfulInvite = result.filter((r): r is PromiseFulfilledResult<InviteResult> => r.status === "fulfilled")
.map((r)=>r.value)
.filter((r)=>r.status === "SENT")
const failedInvite = result.filter((r): r is PromiseFulfilledResult<InviteResult> => r.status === "fulfilled")
.map((r)=>r.value)
.filter((r)=> r.status === "FAILED")
console.log(failedInvite,"failedInvite")
console.log(successfulInvite, "successfulInvite")
// we could also add an expiryDate here for invite submissions
// if any invites succeeded, record them in the database
if(successfulInvite.length > 0){
await prisma.$transaction(
successfulInvite.map((invite)=>
prisma.invitation.create({
data:{
emailEntryId:invite.emailEntryId,
token: invite.token,
topicId,
inviteStatus:"SENT",
sentAt:new Date(),
error:null
}
})
)
)
}
// record the failed ones as well
if(failedInvite.length > 0){
await prisma.$transaction(
failedInvite.map((invite)=>
prisma.invitation.create({
data:{
emailEntryId:invite.emailEntryId,
topicId,
token: invite.token,
inviteStatus:"FAILED",
error:invite.errorMessage,
}
})
)
)
}
// once everything is done we update users emailUsage count
await prisma.emailUsage.update({
where:{
workspaceId
},
data:{
sentCount:{increment: successfulInvite.length}
}
})
return NextResponse.json({
success: true,
message: "Processed invites",
total: emailEntries.length,
sent:successfulInvite.length,
failed:failedInvite.length
});
} catch (error) {
console.error("Send invite error:", error);
return NextResponse.json({ success: false, message: "Internal Server Error" }, { status: 500 });
}
}
📧 EmailQuota
Code:
import { auth } from "@clerk/nextjs/server";
import { prisma } from "@/lib/prisma";
import { NextResponse } from "next/server";
export async function validateQuota(workspaceId: string) {
const startOfMonth = new Date();
startOfMonth.setDate(1);
startOfMonth.setHours(0, 0, 0, 0);
const { userId } = await auth();
if (!userId) {
return NextResponse.json({ success: false, message: "Please log in before inviting." });
}
// Ensure usage entry exists or is reset for the new month
const existingUsage = await prisma.emailUsage.findUnique({ where: { workspaceId } });
if (!existingUsage) {
await prisma.emailUsage.create({ data: { workspaceId, month: startOfMonth, sentCount: 0 } });
} else if (existingUsage.month.getTime() < startOfMonth.getTime()) {
await prisma.emailUsage.update({
where: { workspaceId },
data: { sentCount: 0, month: startOfMonth },
});
}
const emailUsage = await prisma.emailUsage.findUnique({ where: { workspaceId } });
const subscription = await prisma.activeSubscription.findFirst({
where: { clerkId: userId, isCancelled: false },
});
// Handle free users
if (!subscription && emailUsage && emailUsage.sentCount >= 500) {
return NextResponse.json({
success: false,
message: "Free tier exhausted. Wait until next month or upgrade.",
});
}
// Handle subscribed users with a pricing plan limit
if (subscription && emailUsage) {
const pricing = await prisma.pricing.findUnique({
where: { id: subscription.pricingId },
select: { emailUsageLimit: true },
});
if (
pricing?.emailUsageLimit !== null &&
pricing?.emailUsageLimit !== undefined &&
emailUsage.sentCount >= pricing.emailUsageLimit
) {
return NextResponse.json({
success: false,
message: "You have exhausted your quota for this month based on your plan.",
});
}
}
return null; // All good
}
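The month-reset comparison inside validateQuota can be sketched in isolation (the helper names here are mine, not from the app):

```typescript
// Returns the first instant of the month containing `now`.
function startOfMonth(now: Date): Date {
  const d = new Date(now);
  d.setDate(1);
  d.setHours(0, 0, 0, 0);
  return d;
}

// A stored usage row needs a reset when its recorded month predates
// the current month's start; otherwise the running count stays valid.
function needsReset(storedMonth: Date, now: Date): boolean {
  return storedMonth.getTime() < startOfMonth(now).getTime();
}

const now = new Date("2024-05-15T10:00:00");
console.log(needsReset(new Date("2024-04-01T00:00:00"), now)); // April row, May now
console.log(needsReset(startOfMonth(now), now)); // row already current
```

Storing the month boundary alongside `sentCount` means the reset happens lazily on the next request, with no scheduled job required.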
💳 Payments
There's not much to optimize in the payment flow itself, but a few deliberate design choices made things clean and secure.
First, the frontend only sends a `pricingId` — this maps to the actual price on the backend, so users can't tamper with payment amounts. This separation makes the flow safer: the backend controls the price; the frontend just selects a plan.
We've also prevented subscription stacking — a user can't subscribe to multiple plans at once. This made sense for our use case, where simplicity was more important than supporting plan upgrades/downgrades.
A key design decision here is that payment is tied directly to the `clerkId` (via Clerk) rather than the workspace. This lets users get started immediately — they don't have to create a workspace first to purchase a subscription. Even if they later delete the workspace, their plan benefits stay intact, since they're tied to the user, not the workspace.
We use a transaction to ensure everything (activating a plan, saving the order, creating payment records) happens atomically — no inconsistent state if something fails midway.
I've also added a bit of redundant data by linking the active subscription directly to `clerkId`. Since this is mostly read-heavy data, the small duplication is worth avoiding a traversal from `orders → payments → subscriptions` on every read.
I won't go deep into Razorpay's payment lifecycle here — this blog focuses on the business logic and trade-offs I made, not payment gateway internals.
Order creation code
import { NextRequest, NextResponse } from "next/server";
import Razorpay from "razorpay";
import z from "zod";
import { prisma } from "../../../lib/prisma";
import { auth } from "@clerk/nextjs/server";
import { OrderStatus } from "@prisma/client";
type RazorpayOrderResponse = {
id: string;
entity: "order";
amount: number;
amount_paid: number;
amount_due: number;
currency: string;
receipt: string;
offer_id: string | null;
status: "created" | "attempted" | "paid";
attempts: number;
notes: Record<string, string>;
created_at: number;
};
// const Subscription = z.enum(["MONTHLY", "QUARTERLY", "HALF_YEARLY", "YEARLY"]);
const orderSchema = z.object({
pricingId: z.string(),
clerkId: z.string(),
});
if (!process.env.RAZOR_KEY_ID || !process.env.RAZOR_SECRET_ID) {
throw new Error("Razorpay environment variables are missing");
}
const razorPay = new Razorpay({
key_id: process.env.RAZOR_KEY_ID as string,
key_secret: process.env.RAZOR_SECRET_ID as string,
});
// we take the pricing plan for order creation:
// 1. find the plan the user wants to subscribe to
// 2. create an order for that subscription plan
// 3. if the user already has an active plan, don't let them stack plans
export async function POST(req: NextRequest) {
try {
const body = await req.json();
const parsedData = orderSchema.safeParse(body);
if (!parsedData.success) {
return NextResponse.json(
{
success: false,
message: `validation failed: ${parsedData.error.message}`,
},
{ status: 400 }
);
}
// lets check active subscription
const checkSubscription = await prisma.activeSubscription.findFirst({
where: {
clerkId: parsedData.data.clerkId,
isCancelled: false,
expiryDate: {
gt: new Date(),
},
},
});
if (checkSubscription) {
return NextResponse.json(
{ success: true, message: "You already have an active subscription" },
{ status: 200 }
);
}
// fetch the plan the user wants to pay for
const fetchPricingPlan = await prisma.pricing.findUnique({
where: {
id: parsedData.data.pricingId,
},
});
if (!fetchPricingPlan) {
return NextResponse.json(
{
success: false,
message: "could not find the plan",
},
{ status: 404 }
);
}
// create new order
const createNewOrder = (await razorPay.orders.create({
amount: fetchPricingPlan.price,
currency: fetchPricingPlan.currency,
receipt: parsedData.data.clerkId,
notes: {
Subscription: fetchPricingPlan.subscription,
clerkId: parsedData.data.clerkId,
},
})) as RazorpayOrderResponse;
await prisma.order.create({
data: {
amount: fetchPricingPlan.price,
clerkId: parsedData.data.clerkId,
subscription: fetchPricingPlan.subscription,
razorPayOrderId: createNewOrder.id,
pricingId: parsedData.data.pricingId,
status:"CREATED"
},
});
return NextResponse.json({
success: true,
order: createNewOrder,
});
} catch (error) {
console.error("error happened while creating the order: ", error);
return NextResponse.json(
{
success: false,
message: "Internal error happened. Please try again; sorry for the inconvenience.",
},
{ status: 500 }
);
}
}
Payments Code
import { NextRequest, NextResponse } from "next/server";
import crypto from "crypto"
import { prisma } from "@/lib/prisma";
import { auth } from "@clerk/nextjs/server";
export async function POST(req: NextRequest) {
try {
const body = await req.json();
const { razorpay_order_id, razorpay_payment_id, razorpay_signature } = body;
const hmac = crypto.createHmac("sha256", process.env.RAZOR_SECRET_ID!)
hmac.update(`${razorpay_order_id}|${razorpay_payment_id}`);
const generatedSign = hmac.digest("hex")
const isValid = generatedSign === razorpay_signature
if (!isValid) {
return NextResponse.json({
success: false,
message: "payment data tampered"
})
}
const { userId } = await auth()
if (!userId) {
return NextResponse.json({
success: false,
message: "missing auth at server"
})
}
const getOrder = await prisma.order.findUnique({
where: {
clerkId: userId,
razorPayOrderId: razorpay_order_id
}
})
if (!getOrder) {
return NextResponse.json({
success: false,
message: "linked order could not be found"
})
}
const updateTx = await prisma.$transaction(async (tx) => {
await tx.order.update({
where: {
id: getOrder.id
},
data: {
status: "PAID"
}
})
const payments = await tx.payment.create({
data: {
razorPayPaymentId: razorpay_payment_id,
razorPayOrderId: razorpay_order_id,
razorPaySignature: razorpay_signature,
orderId: getOrder.id,
status: "SUCCESS",
isVerified: true,
}
})
const fetchPlan = await tx.pricing.findUnique({
where: {
id: getOrder.pricingId
}
})
if (!fetchPlan?.validity) {
throw new Error("Pricing plan missing validity");
}
// Calculate expiry date
const expiryDate = new Date();
expiryDate.setDate(expiryDate.getDate() + fetchPlan.validity);
const subs = await tx.activeSubscription.create({
data: {
clerkId: userId,
paymentId: payments.id,
pricingId: getOrder.pricingId,
expiryDate: expiryDate
}
});
return subs;
})
return NextResponse.json({
success:true,
message:"Payment Linked successfully.",
data:updateTx
})
} catch (error) {
console.error("error happened while trying to verify the signature", error);
return NextResponse.json({
success: false,
message: "error happened on our side."
}, { status: 500 })
}
}
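For illustration, the signature check from the payments route can be exercised on its own with a dummy secret and ids (not real Razorpay credentials):

```typescript
import crypto from "node:crypto";

// Razorpay signs `${order_id}|${payment_id}` with your key secret.
// Recomputing the HMAC server-side and comparing it to the signature
// the client sent catches any tampering with the payment data.
const secret = "test_secret";
const orderId = "order_123";
const paymentId = "pay_456";

function sign(orderId: string, paymentId: string, secret: string): string {
  return crypto
    .createHmac("sha256", secret)
    .update(`${orderId}|${paymentId}`)
    .digest("hex");
}

// The signature the gateway would have produced for this order/payment pair.
const signature = sign(orderId, paymentId, secret);

// An untouched payload verifies; a swapped payment id does not.
const valid = sign(orderId, paymentId, secret) === signature;
const tampered = sign(orderId, "pay_999", secret) === signature;
console.log(valid, tampered);
```

Because only the backend knows the key secret, the client can never forge a signature for an amount it didn't actually pay.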
🚧 Wrapping Up
This marks the end of my blog. The current implementation handles a lot — from invite creation to email dispatch and retry logic.
But let’s be honest: it’s not foolproof.
Right now, failed email invites are only tracked (thanks to the database), but we don't yet have a system to reprocess them automatically. To build a more resilient backend, a message broker like RabbitMQ or a managed queue system (like Supabase Edge Functions with retries or Resend webhooks) could be introduced.
This would mean:
- A separate, persistent job server.
- Better fault tolerance.
- Smarter retries and dead-letter queues.
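As a rough, in-memory sketch of the retry/dead-letter idea (all names hypothetical; a real system would persist jobs in RabbitMQ or a DB-backed queue):

```typescript
// Minimal in-memory sketch of retry-with-dead-letter semantics.
// A real system would persist jobs and add backoff between attempts.

interface Job {
  id: string;
  attempts: number;
}

const MAX_ATTEMPTS = 3;
const deadLetter: Job[] = [];

// Try a job once; on failure, requeue it until MAX_ATTEMPTS is reached,
// then park it in the dead-letter list for inspection or alerting.
function process(job: Job, send: (id: string) => boolean, queue: Job[]): void {
  job.attempts += 1;
  if (send(job.id)) return; // success: the job is done
  if (job.attempts >= MAX_ATTEMPTS) {
    deadLetter.push(job); // retries exhausted: dead-letter it
  } else {
    queue.push(job); // transient failure: retry later
  }
}

// Simulate a batch where one address always fails to send.
const queue: Job[] = [
  { id: "ok-1", attempts: 0 },
  { id: "bad", attempts: 0 },
];
const send = (id: string) => id !== "bad";

while (queue.length > 0) {
  process(queue.shift()!, send, queue);
}

console.log(deadLetter.map((j) => j.id));
```

The FAILED rows already stored in the invitations table would be the natural source feeding such a queue.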
The diagram I’ve added is a rough visual of where this might go — a system that scales better and handles edge cases more gracefully.
But for now, this is where I pause the journey.
🔄 Future Plans
- ⏳ Replace basic retry logic with a queue (RabbitMQ / Resend / Supabase functions).
- 🔁 Implement smart retries + dead-letter tracking.
- 🧪 Add alerting/monitoring for failed jobs.
- 🧱 Make the backend more fault-tolerant and distributed.
- 📈 Eventually move to a dedicated worker service for background jobs.
Backend Diagram For Future
💬 A Note to Readers
If I’ve made mistakes — I want to learn. I’m still figuring things out and it’s a bitter pill to swallow sometimes when a month’s work still feels incomplete. But if that’s what it takes to grow into a better developer, I’m here for it.
Feedback and criticism are more than welcome.
Thanks for reading.