Martin Persson
Next.js Auth and Dashboard Boilerplate: Building a Full-Stack Web App


Next.js Auth Boilerplate Project

Welcome to the Next.js Auth Boilerplate Project. This is a full-stack application, built on the Next.js framework, equipped with a comprehensive authentication system. Designed to serve as a solid starting point, it integrates essential technologies to facilitate a smooth development process for modern web applications. Whether you're initiating a new project or seeking a robust foundation, this boilerplate provides the groundwork for swift and efficient web development.

For hands-on experience, you can explore the live demo and delve into the GitHub repository. Further into this post, you'll find detailed explanations on each component and feature.

Live demo: https://next-boilerplate-umber.vercel.app/

GitHub repo

Main Features

  • User Authentication: Utilizing Passport, this boilerplate offers a robust login and registration system, complete with persistent sessions. The use of local strategies ensures flexibility and security in handling user authentication.
  • Email support: With the integration of Nodemailer and Brevo, the system can handle email verification and password resets, ensuring a complete and user-friendly authentication process.
  • Styling: Utilizes Emotion and Material-UI for responsive and aesthetic design.
  • Data Management: The combination of SWR, Axios, and MongoDB offers efficient data handling. SWR's caching strategy works with Axios's fetching capabilities to provide a seamless user experience, while MongoDB serves as a robust database solution.
  • Form Handling: The integration of Formik and Zod allows for concise form validations. Formik's intuitive API, along with Zod's schema validation, ensures that the data is handled correctly, providing a more reliable user experience.
  • Testing: Cypress is used for end-to-end testing, ensuring that the application functions as intended across different scenarios. Its interactive test runner and real browser environment make it a preferred choice for comprehensive testing.
  • Deployment: Hosted on Vercel, the application benefits from a seamless deployment process. Vercel's continuous deployment and serverless functions align well with the Next.js architecture, making it an ideal choice for this project.

For an in-depth look, you can explore the GitHub repository or see the live demo. In the following sections, we will delve into each aspect of the project, providing insights into the technologies used and how they are configured.

Technology Overview

Here's a brief look at some key technologies used in this project:

  • 🌐 Next.js: The foundational framework for building the React application.
  • 💅 Emotion and Material-UI: These libraries are used to style the components, ensuring responsive and aesthetically pleasing design.
  • 🔐 Passport: Manages user authentication with various strategies, including local.
  • 📝 Formik & Zod: Handle and validate forms, making sure the data is correct.
  • 📊 SWR & Axios: Deal with data fetching and state management.
  • 🔒 bcryptjs: Secures user passwords by hashing them.
  • 📧 nodemailer: Sends emails for things like verification and password resets.
  • ⚙️ next-connect: A small Express/Connect-style middleware framework for Next.js, facilitating efficient handling of server-side routes and middleware.
  • 🗄️ MongoDB: The chosen database for this project, utilized for storing user information and managing sessions.
  • 📚 TypeScript: Adds static typing to JavaScript, enhancing code quality.
  • 🧪 Cypress: Provides end-to-end testing capabilities, ensuring that the application works as intended.
  • 🚀 Vercel: The platform for hosting the application, providing a seamless deployment process.

These tools are chosen based on their functionality and fit for the project. Feel free to explore the code to see how they are implemented and how they work together.

Architecture and Code Structure

The architecture of the Next.js Auth Boilerplate project is designed to be modular and scalable, following industry best practices. Here's an overview:

  • Frontend (Client-side): Built with Next.js, the frontend uses React components for UI, styled with Emotion and Material-UI, and handles forms via Formik and Zod.
  • Backend (Server-side): Utilizes a combination of Next.js API routes, Passport for authentication, and MongoDB for data storage. Axios and SWR assist in data fetching and state management.
  • Testing: Cypress is integrated into the workflow to facilitate end-to-end testing.
  • Deployment: Vercel is the preferred choice for hosting, providing a smooth deployment process.

Serverless API Structure with Next.js

In the Next.js Auth Boilerplate project, the backend architecture leverages Next.js's built-in support for serverless functions to create a streamlined and scalable API structure. This approach offers various benefits, such as automatic scaling, isolation of functions, and a simplified development experience. Here's an overview of the core components:

Next.js API Routes

API routes in Next.js allow you to build your API within the same project as your Next.js application. These routes are automatically treated as serverless functions and are found in the pages/api directory. Each file within this directory corresponds to an endpoint, and the exported function defines the behavior of that endpoint. This leads to a clear and concise structure that aligns with modern serverless practices.
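As a minimal illustration, an endpoint might look like the following (the file path and response body are hypothetical examples, not taken from the boilerplate):

```typescript
// pages/api/hello.ts — a hypothetical endpoint; the file name maps to /api/hello
import type { NextApiRequest, NextApiResponse } from "next"

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== "GET") {
    // Reject anything other than GET for this simple endpoint
    res.status(405).json({ error: "Method not allowed" })
    return
  }
  res.status(200).json({ message: "Hello from a serverless function" })
}
```

Deploying this file is all it takes: Next.js turns it into a serverless function with no extra routing configuration.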

Next-Connect

To enhance the creation and management of API routes, the project utilizes the next-connect library. Next-connect provides a lightweight layer to work with middleware and handle HTTP methods within Next.js API routes more efficiently.

Here's an example of using next-connect to create a route:

import nextConnect from 'next-connect';

const handler = nextConnect();

handler.get((req, res) => {
  // Handle GET request
});

handler.post((req, res) => {
  // Handle POST request
});

export default handler;

Next-connect simplifies the process of defining various HTTP methods and integrating middleware, leading to cleaner and more maintainable code.

Serverless Benefits

Embracing a serverless architecture with Next.js API routes and next-connect offers several advantages:

  • Automatic Scaling: The serverless functions scale automatically with demand, providing efficient resource utilization.
  • Isolation: Each API route functions independently, reducing the risk of one route affecting others, improving stability and security.
  • Ease of Deployment: By integrating the API within the Next.js project, the deployment process is unified, leveraging platforms like Vercel for a seamless experience.
  • Rapid Development: The straightforward structure and tools like next-connect facilitate a faster development cycle, allowing for iterative enhancements and flexible adaptations.

Type Safety with Zod

One of the key aspects of the architecture in the Next.js Auth Boilerplate project is the use of Zod to enforce type safety across both the backend and frontend. This approach promotes consistency and robustness in the codebase. Here's how it's done:

  • Schema Definition: Zod allows you to define a schema for your data, describing the shape, structure, and validation rules. This schema acts as a blueprint for the data, ensuring that it adheres to the expected format.
  • Type Inference: Zod's powerful type inference capabilities enable you to automatically derive TypeScript types from the defined schema. This means that the same schema used for validation can also be used to generate the corresponding types. Here's an example:
const userSchema = z.object({
  name: z.string(),
  email: z.string().email(),
  age: z.number().positive(),
});

type User = z.infer<typeof userSchema>;

In this example, the User type is automatically inferred from the userSchema, creating a strongly typed representation of the user data.

  • Backend and Frontend Consistency: By using the same schema and inferred types across both the backend and frontend, you ensure that the data is handled consistently throughout the application. This alignment minimizes the risk of mismatches or errors when transmitting data between different parts of the system.
  • Validation and Parsing: Zod not only validates the data against the schema but also provides parsed and typed results. This feature simplifies the validation process and enhances the reliability of the code.
  • Integration with Other Tools: Zod's schemas can be easily integrated with other libraries, such as Formik for form handling, allowing for a seamless and type-safe user experience.

The use of Zod in the Next.js Auth Boilerplate project exemplifies a modern approach to type safety and data validation. By creating a shared understanding of the data structure across different layers of the application, Zod enhances maintainability and robustness. Its ability to define, validate, and infer types fosters a cohesive and efficient development process, contributing to the overall quality and reliability of the application.

Authentication Flow

The authentication flow is managed through Passport, using the Local strategy. Here's how it works:

  • Registration: Users register with an email and password. The password is hashed with bcryptjs before being stored in MongoDB, and the user is logged in immediately after registering.
  • Login: Users can log in with their credentials. Passport validates the credentials and creates a session.
  • Session Management: Express-session and connect-mongo handle sessions, maintaining user authentication status across requests.
  • Password Reset: If a user forgets their password, they can request a reset link sent to their email.
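A session setup along these lines could be sketched as follows (the option values and the SESSION_SECRET variable name are assumptions; the boilerplate's actual configuration may differ):

```typescript
import session from "express-session"
import MongoStore from "connect-mongo"

// Hypothetical session middleware, applied to API routes via next-connect's handler.use(...)
export const sessionMiddleware = session({
  secret: process.env.SESSION_SECRET as string, // assumed env variable name
  resave: false,
  saveUninitialized: false,
  // Persist sessions in MongoDB so they survive serverless cold starts
  store: MongoStore.create({ mongoUrl: process.env.MONGODB_URI as string }),
  cookie: {
    httpOnly: true,
    secure: process.env.NODE_ENV === "production",
    maxAge: 1000 * 60 * 60 * 24 * 7, // one week
  },
})
```

Storing sessions in MongoDB rather than in memory is what allows authentication state to persist across independent serverless invocations.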

Email System

The email system plays a vital role in the Next.js Auth Boilerplate project by facilitating user interactions such as email verification and password resets. Below, we explore the technologies and methodologies used to implement this functionality.

Email Verification

Nodemailer: For email verification, the project uses Nodemailer, a widely adopted module for sending emails in Node.js applications. Here's how it works:

Verification Email: Upon registration, an email containing a verification link with a token is sent to the user's address.

Verification Endpoint: When the user clicks the verification link, a request is made to a designated endpoint, which validates the token and marks the email as verified.

Password Reset

Requesting Reset: Users can request a password reset by submitting their email address. The system generates a reset token and emails it to the user via Brevo.

Resetting Password: When the user accesses the link with the reset token, they are directed to a page to reset their password. The token is validated, and the password is updated.

The project includes customizable email templates for the verification and password reset emails. These templates can be modified to align with the branding and design of your application, providing a consistent user experience.

Configuration
Both Nodemailer and Brevo are configurable, providing flexibility in setting up the email system according to your specific requirements. This includes the choice of email service provider, authentication credentials, and other settings. Here's a brief overview of the key configuration steps:

  1. Nodemailer Configuration: Nodemailer requires an SMTP (Simple Mail Transfer Protocol) service to send emails. You can choose a provider like Gmail, SendGrid, or any other that supports SMTP. The credentials for this service will need to be included in your .env file.

  2. Brevo Configuration: To use Brevo as the SMTP service for sending verification and password-reset emails, create a Brevo account and generate the corresponding SMTP credentials. Follow the instructions in the Brevo documentation, or your chosen provider's documentation, to set up the SMTP service.

  3. Environment Variables: To keep sensitive information like API keys and email credentials secure, you should store them in environment variables. Make sure to update the .env file in your project with the correct credentials for both Nodemailer and Brevo. This may include SMTP server details, API keys, email addresses, and other authentication information.

MAIL_PASSWORD=""
MAIL_USER=""
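With those variables in place, a transport could be configured roughly like this (the host, port, and helper function are illustrative assumptions — consult Brevo's documentation for the exact SMTP settings):

```typescript
import nodemailer from "nodemailer"

// Hypothetical transport using Brevo's SMTP relay; host and port are
// assumptions — verify them against your provider's documentation.
const transporter = nodemailer.createTransport({
  host: "smtp-relay.brevo.com",
  port: 587,
  auth: {
    user: process.env.MAIL_USER,
    pass: process.env.MAIL_PASSWORD,
  },
})

// Hypothetical helper, not the boilerplate's actual function name
export const sendVerificationEmail = async (to: string, link: string) => {
  await transporter.sendMail({
    from: process.env.MAIL_USER,
    to,
    subject: "Verify your email",
    html: `<p>Click <a href="${link}">here</a> to verify your email address.</p>`,
  })
}
```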

Security Considerations
The email system is designed with security in mind. By utilizing secure tokens, HTTPS connections, and best practices for handling sensitive information, the project ensures that email-related operations are carried out securely.

The integration of Nodemailer and Brevo, along with the well-designed workflows for email verification and password resets, establishes a robust and user-friendly email system within the Next.js Auth Boilerplate project. This system not only enhances security but also adds convenience and functionality, contributing to a positive user experience.

Data Management

Data management is a crucial aspect of any application, ensuring the effective handling, validation, and storage of data. It involves managing the flow of data through the entire lifecycle, including data collection, processing, storage, and retrieval. This section will delve into the specific methods and technologies used in this application for data management.

MongoDB

We are using MongoDB and MongoDB Atlas for our database. MongoDB is a popular NoSQL database that offers high performance, scalability, and flexibility. The following explains the implementation:

MongoDB Connection

The connection to the MongoDB server is handled through the MongoClient from the MongoDB library. We've created a reusable function getMongoClient to manage the connection and another function getMongoDb to get access to the specific database:

import { MongoClient } from "mongodb"

if (!process.env.MONGODB_URI) {
  throw new Error('Invalid/Missing environment variable: "MONGODB_URI"')
}

const uri = process.env.MONGODB_URI
const options = {}

let indexesCreated = false
async function createIndexes(client: MongoClient) {
  if (indexesCreated) return client
  const db = client.db("dev")
  await Promise.all([db.collection("users").createIndexes([{ key: { email: 1 }, unique: true }])])
  indexesCreated = true
  return client
}

export async function getMongoClient() {
  // Global is used here to maintain a cached connection across hot reloads in development.
  if (!global._mongoClientPromise) {
    const client = new MongoClient(uri)
    global._mongoClientPromise = client.connect().then(async (client) => await createIndexes(client))
  }
  return await global._mongoClientPromise
}

export async function getMongoDb() {
  const mongoClient = await getMongoClient()
  return mongoClient.db("dev")
}

Serverless Connection Handling

This implementation is tailored for a serverless environment. By caching the MongoDB client, the connection can be reused across function invocations instead of being re-established every time. This prevents the number of open connections from ballooning as API routes are invoked, which is vital in a serverless architecture.

Creating Indexes

The createIndexes function ensures that specific indexes are created on the collections when the application starts. In this example, a unique index is created on the email field of the users collection, ensuring that no two users can have the same email address.

By leveraging MongoDB and MongoDB Atlas, the application benefits from a robust and scalable data storage solution. The serverless architecture, combined with efficient connection handling and indexing, provides an effective way to manage the data within the application.

Data Fetching with SWR and Fetcher

Fetcher Explanation

The fetcher function acts as a utility for making HTTP requests using Axios. It is designed to handle requests with a consistent response format and handle errors in a standardized way.

Here's a breakdown of the function:

  • Generics: The function utilizes TypeScript generics where R is the expected response type, and T is the expected request body type. This ensures that both the input and the response are type-checked according to the specific usage of the fetcher.
  • Options Interface: An Options interface is defined to describe the method, data, and headers of the HTTP request.
export type HTTPMethod = "GET" | "POST" | "PUT" | "DELETE" | "PATCH"

export interface Options<T> {
  method: HTTPMethod
  data: T
  headers: {}
}
  • Handling Response: The handleResponse function checks whether the response status code is in the 2xx range, indicating success. If successful, it returns the data; otherwise, it returns a predefined error structure.
  • Handling Error: The handleError function handles specific error scenarios. For example, a 401 status code returns an unauthorized error. Any other errors return the error message.
  • Axios Request: The fetcher uses Axios to make the HTTP request with the provided URL and options. It then utilizes handleResponse and handleError functions to process the response.
// Generic R is the response type and T is the request body type.
export const fetcher = async <R, T>(url: string, options?: Options<T>): Promise<Response<R>> =>
  await axios
    .request({ url, ...options })
    .then(handleResponse<R>)
    .catch(handleError<R>);

In a usage scenario, you would replace R and T with specific types according to what the request and response should contain.

export const useUser = () =>
  useSWRImmutable("/api/user", async (url) => await fetcher<Omit<UserModelSchemaType, "password"> | null, undefined>(url))


The fetcher offers a flexible and type-safe way to make HTTP requests within your application.
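The handleResponse and handleError helpers described above could be implemented along these lines (a sketch consistent with the description, not the boilerplate's exact code):

```typescript
interface Response<R> {
  payload: R | null
  error: string | null
  message: string
}

// Treat 2xx status codes as success and return the body as-is;
// anything else is mapped to a predefined error structure.
const handleResponse = <R>(res: { status: number; data: Response<R> }): Response<R> => {
  if (res.status >= 200 && res.status < 300) return res.data
  return { payload: null, error: "Request failed", message: "An error occurred" }
}

// Map a 401 to an unauthorized error; fall back to the error's own message.
const handleError = <R>(err: { response?: { status: number }; message: string }): Response<R> => {
  if (err.response?.status === 401) {
    return { payload: null, error: "Unauthorized", message: "An error occurred" }
  }
  return { payload: null, error: err.message, message: "An error occurred" }
}
```

Because both helpers return the same Response<R> shape, callers of the fetcher never need to branch on whether Axios resolved or rejected.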

Another example using the fetcher with Options

const registerUser = async (data: UserRegistrationSchemaType) => {
  setStatus("loading")

  const responseData = await fetcher<UserModelSchemaType, UserRegistrationSchemaType>("/api/users", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    data,
  })

  // Handling response
  // ...
}

SWR explained

SWR (stale-while-revalidate) is a library used for data fetching that allows you to keep data up to date with background updates and revalidation. This boilerplate takes advantage of SWR's useSWRImmutable hook to provide immutable data fetching. The provided fetcher function is used in conjunction with SWR to standardize how requests are made.

Here's how the user data is retrieved using a custom hook:

export const useUser = () =>
  useSWRImmutable("/api/user", async (url) => await fetcher<Omit<UserModelSchemaType, "password"> | null, undefined>(url))

Unified Response Interface

We have defined a common Response<R> interface that represents the structure of the response both on the client and server sides. This interface ensures that all responses have a consistent shape, containing a payload, an error message, and a general message string.

export interface Response<R> {
  payload: R | null;
  error: string | null;
  message: string;
}

This pattern allows the application to handle responses uniformly, whether they represent success or failure.
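Because every response carries the same shape, small utilities become trivial to write. For example (a hypothetical helper, not part of the boilerplate):

```typescript
interface Response<R> {
  payload: R | null
  error: string | null
  message: string
}

// Return the payload when the request succeeded, otherwise a caller-supplied fallback.
const unwrapOrDefault = <R>(res: Response<R>, fallback: R): R =>
  res.error === null && res.payload !== null ? res.payload : fallback
```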

Server-Side Handlers

We are using Zod to help parse the request data, ensuring that the data received matches the expected structure. This adds a layer of validation and type safety to the request handling process:

const parsedFormInput = UserRegistrationSchema.safeParse(req.body)

Once the data is parsed, two utility functions, handleAPIResponse and handleAPIError, are utilized to send consistent responses to the client.

handleAPIResponse: This function accepts a payload, a message, and a status code, and sends a JSON response that aligns with the Response interface. It's used to send successful responses.

export const handleAPIResponse = <T>(
  res: NextApiResponse<Response<T>>,
  payload: T,
  message: string,
  statusCode = 200
): void => {
  res.statusCode = statusCode;
  res.json({ payload, error: null, message });
};

handleAPIError: This function is designed to send error responses, again aligning with the Response interface. It accepts an error string and a status code, ensuring that error responses are structured consistently.

export const handleAPIError = (res: NextApiResponse, error: string, statusCode = 400): void => {
  res.statusCode = statusCode;
  res.json({ payload: null, error: error, message: "An error occurred" });
};

By using Zod for parsing and these utility functions for handling responses, the server-side code maintains a clear and consistent structure, easing both development and maintenance.

Client-Side Usage

On the front-end, the fetcher function is aware of this response structure, as it expects a Promise<Response<R>>. This alignment ensures that both the server and client handle responses in the same way, reducing the likelihood of misunderstandings between the two.

Benefits

  • Type Safety: Using TypeScript's typing system, this approach guarantees that the server sends and the client expects the same response structure, catching potential issues at compile-time.
  • Maintainability: By using the same interface across the whole codebase, any changes to the response structure only need to be made in one place. It's a pattern that promotes maintainable and scalable code.
  • Clarity and Consistency: This approach makes the code more readable and helps new team members or contributors understand how data flows through the application.

By leveraging the power of TypeScript and consistent patterns, this approach streamlines development and can significantly enhance the robustness and efficiency of the application.

Conclusion

Data management is central to the functionality and performance of the application. By utilizing MongoDB with optimized connection handling and indexing, along with standardized fetching using SWR and a fetcher function, the application ensures scalability, maintainability, and type safety. The use of TypeScript and consistent response handling further enhances the robustness and clarity of the code. This comprehensive approach to data management lays a solid foundation for the development and expansion of the application, meeting the demands of modern software development.

E2E Testing with Cypress

End-to-End (E2E) testing is a crucial part of ensuring the stability and reliability of any web application. In this project, we utilize Cypress, a popular testing tool designed for modern web applications.

Why Cypress?

Cypress offers a rich set of features and an easy-to-use interface, which makes writing and running tests a breeze. Some of the reasons why we chose Cypress include:

  • Interactive Test Runner: Cypress comes with an interactive test runner that allows you to see commands as they execute while viewing the application under test.
  • Real-Time Reloads: It automatically reloads your tests when changes are made, facilitating a smooth development workflow.
  • Debuggability: With built-in tools to help you understand what's happening inside your tests, debugging is made more accessible.
  • Cross-Browser Testing: You can run tests in various browsers to ensure compatibility.
  • Rich Ecosystem: A thriving community and a plethora of plugins and integrations enhance Cypress's capabilities.

Setting Up Cypress

Setting Up the Test Environment
Before you start testing with Cypress, ensure that you have the development server running locally. This will allow Cypress to access the application and perform the end-to-end tests.

You can start the development server with: npm run dev

Creating a Test User

Before running the tests, you must create a test user with the following credentials:

This user will be utilized in various authentication-related tests.

Running Tests
Once the test user is created, you can run the automated test suite by executing the following command:

npm run cypress-e2e

This command will start Cypress, and the tests will run in the selected browser. You can see the results in real-time and interact with the test runner to get more insights into the tests' execution.

Writing Your Own Tests

Cypress makes it easy to write your own tests. With its intuitive API and robust documentation, you can quickly start creating tests tailored to your application's specific needs.
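For example, a login test might look like this (a sketch — the selectors, routes, and credentials are assumptions about the application, not taken from the repository):

```typescript
// cypress/e2e/login.cy.ts — hypothetical spec file
describe("login", () => {
  it("logs in with valid credentials and leaves the login page", () => {
    cy.visit("/login")
    cy.get("input[name='email']").type("test@example.com")
    cy.get("input[name='password']").type("password123")
    cy.get("button[type='submit']").click()
    // Assert the redirect that should follow a successful login
    cy.url().should("not.include", "/login")
  })
})
```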

To learn more about Cypress and explore its features, visit the official Cypress documentation.

By integrating Cypress into our testing workflow, we ensure that our application behaves as intended, providing confidence in the code's quality and performance. With its straightforward setup and ease of use, Cypress stands as an invaluable tool for developers striving to maintain a high standard of quality in their projects.


Deployment Process

Deploying the Next.js Auth Boilerplate project to Vercel is a streamlined process, aimed at providing a hassle-free experience even for those new to web deployment. The following step-by-step guide outlines the deployment process, ensuring that you can get your project live in no time:

  1. Create an Account on Vercel (if you don't have one): Visit Vercel's website and sign up or log in.

  2. Create a New Project: Once logged in, click on "New Project" and select "Import Git Repository."

  3. Import the GitHub Repository: Connect your GitHub account and select the repository containing the Next.js Auth Boilerplate project. Follow the on-screen instructions to set up the project on Vercel.

  4. Configure MongoDB Atlas Integration: You'll need to link the project to your MongoDB Atlas database. On the project settings page, navigate to "Integrations" and search for the MongoDB Atlas integration. Follow the instructions to connect your database.

  5. Update Environment Variables: Modify the WEB_URI variable in your .env file to match the domain where your application will be hosted. Additionally, ensure that all the necessary environment variables are configured in the Vercel dashboard. You can refer to the provided .env.example file for a complete list of required variables.

  6. Automatic Deployment: Whenever you push changes to the main branch of your GitHub repository, Vercel will automatically trigger a deployment. The deployment process takes care of building, optimizing, and hosting your application, making it accessible to users around the world.

  7. Verify and Test: Once deployed, visit the provided URL to verify that the application is running correctly. You may want to conduct additional tests to ensure that all functionalities, including authentication and email systems, are operational.

By following this guide, you should have a working instance of the Next.js Auth Boilerplate project. Feel free to explore the additional configuration options available on Vercel to tailor the hosting environment to your specific needs.

Given the scale and complexity of this project, there might be occasional oversights or areas in the guide that could be clearer. If you encounter any issues or need further assistance, please consult the official Vercel documentation or reach out through the comments.

Thank you for taking the time to go through this guide, and happy coding!
