Simple Backend Development using Express.js and Node.js with a simple practical project

Simple Node.js Backend Development

Hello everyone. I have seen a lot of people struggle to get into backend development because the field is vast and a little granular. There are many small bits of knowledge you keep picking up almost throughout your entire career.

It also involves a lot of theoretical concepts. However, this blog will break you out of that loop and bring you to a practical project that you will build and see for yourself what backend development is actually about. There are a bunch of theoretical concepts which many practical blogs skip, but I will cover them because I do not want to make this just about building a backend. It is about how it feels to be a backend developer, giving a bigger picture, and also adding many of my personal discoveries and perspectives into this.

Theoretical Concepts

Backend development is a process of making something. You are not memorizing something, you are engineering something. Building requires breaking.

This is important. A lot of people think it is about memorizing syntax or methods, but a language is only a tool for building things.

Whatever I am telling you, look at it as millions of people building their own stuff, sharing, engaging, and experimenting until they achieve their desired results. Whatever you are learning is meant to help you write better practical code. Whatever is required in the industry is simply what is best and most efficient in the modern ecosystem.

When we say technology changes, we mean a gradual shift toward better practices and wider accessibility.

That is why technology itself exists on a spectrum. Some people say PHP is dead, but the reason it is still alive and innovations are still made in PHP is because applications like Facebook are built on it. Rewriting such massive codebases in a different language is practically impossible. As a web developer, you need to know what is currently the best technology for you to learn to get a better job. When you learn, you are adding to pre existing knowledge and established practices.

Now enough with this discussion. Let us move to theory. A little theory might seem unrelated, but it is very helpful and practical.

Small Concepts

API

API stands for Application Programming Interface.

It is a set of rules and methods that allows different software applications to communicate and interact with each other.

Ports

A port is a numbered endpoint on a machine where a server listens for incoming requests. When we run our API on port 3000, clients interact with it by sending requests to that port.

Server

A server is something that serves a service. For example, if we have a calculator app, we can say that it provides a service of calculating.

Database

People often confuse servers with databases. I have seen people say things like all your data is stored in 'Facebook servers', but your data is stored in a database, and a server is responsible for serving that data in whatever way you request it.

Frontend

Frontend is basically how you interact with your app. It is usually developed separately from the backend.

CORS

CORS stands for Cross Origin Resource Sharing. It is a browser security mechanism that controls which origins are allowed to call your backend. When the frontend and backend run on different origins (for example, different ports), the backend must explicitly allow the frontend's origin so they can communicate. The data shared between them is usually in JSON format.

JSON

JSON is a data format used to store and share data.

Example

{
  "userId": 1,
  "username": "john_doe",
  "email": "john@example.com",
  "isActive": true
}


Framework

A framework is a structured set of tools, conventions, and abstractions that helps you build software faster and in a more organized way.

In backend development, a framework takes care of many low-level details for you, such as (you will become familiar with all of these later):

  • Handling incoming HTTP requests
  • Routing requests to the correct logic
  • Managing middleware
  • Providing a consistent project structure

Instead of writing everything from scratch using only a programming language, or requiring wizardry in computer networking and CS fundamentals, a framework gives you a foundation to build on. You focus more on what your application should do, rather than how the underlying plumbing works.

A framework also enforces certain patterns. This is important because when multiple developers work on the same codebase, having a common structure makes the code easier to read, maintain, and scale.

Node.js?

Node.js is a runtime environment that allows you to run JavaScript outside the browser.

Originally, JavaScript was only meant to run in browsers. It was used to handle small interactions like button clicks, form validation, and simple UI logic. The browser was responsible for executing JavaScript code.

Node.js changed this by taking Google’s V8 JavaScript engine (the same engine used by Chrome) and making it available on the server. This means JavaScript can now be used to write backend code.

With Node.js, JavaScript can:

  • Create servers
  • Handle HTTP requests
  • Read and write files
  • Communicate with databases
  • Build APIs

In simple terms, Node.js makes JavaScript a backend language.
I remember making a tweet pointing out this exact thing: JavaScript was not really meant to be a backend language, but the infrastructure slowly evolved. We got TypeScript, faster frameworks, and faster runtimes like Bun, and that is how it became what it is today.

Why Node.js exists

Before Node.js, backend development was usually done using languages like Java, PHP, Python, or C#. Frontend and backend were written in different languages.

Node.js allows developers to use one language (JavaScript) for both frontend and backend. This reduces context switching and makes development faster, especially for small teams and startups.

Another important reason Node.js exists is performance.

Node.js uses an event-driven, non-blocking I/O model. This means:

  • It does not wait for slow operations (like database queries or file reads) to finish
  • It can handle many requests at the same time using a single thread

This makes Node.js very efficient for applications that deal with many concurrent users, such as APIs, chat apps, and real-time systems.

Node.js is not a framework

This is a very important distinction.

Node.js is not a framework.

It does not give you routing, middleware, or project structure.

Node.js only provides:

  • Core modules (http, fs, path, etc.)
  • Low-level APIs for networking and system access

Node.js is the foundation.

Express is a layer built on top of that foundation.

How Node.js fits in backend development

You can think of the backend stack like this:

  • JavaScript → the language
  • Node.js → the runtime environment
  • Express → the framework
  • Database → where data is stored

Each layer solves a different problem, and together they form a complete backend system.

Express?

Express is a minimal and flexible backend framework built on top of Node.js.

Node.js by itself only provides low-level APIs for handling HTTP requests and responses. While this is powerful, writing a real-world backend using only Node.js quickly becomes repetitive and hard to manage.

Express simplifies this by providing:

  • A clean and simple routing system
  • Middleware support for handling logic between request and response
  • Easy integration with databases, authentication, and other libraries
  • A lightweight core that does not force heavy abstractions

Express does not hide Node.js from you. Instead, it sits on top of it and makes common backend tasks easier and more readable.

Because of this, Express is often the first framework developers learn when getting into backend development with Node.js. It gives you enough structure to build real applications while still letting you understand what is happening under the hood.

Express is not opinionated. This means it does not force you into a strict project structure. You are free to design your backend in a way that makes sense for your application.

Environment Variables (.env)

When building backend applications, you often need to store sensitive data or configuration outside your code.

Examples include:

  • Database connection URLs
  • API keys
  • Secret keys for authentication
  • Port numbers

Instead of hardcoding these values in your code, which can get exposed (DISASTROUS SERIOUSLY), we use environment variables stored in a .env file.

Example .env file:

PORT=3000
DB_URL=mongodb://localhost:27017/chatapp
JWT_SECRET=mysecretkey
CLOUDINARY_URL=your_cloudinary_url_here


Where do we get these variables from? Simply from the service's website after logging in. Say you want to use Cloudinary: go to their dashboard and grab your URL. If it is confusing, the setup process is almost always covered in the documentation.

Using .env in Node.js

To access these variables in your application, we use the dotenv package.

  1. Install dotenv:

npm install dotenv

  2. Load the environment variables at the top of your index.js:
import dotenv from "dotenv";

dotenv.config();

const PORT = process.env.PORT || 3000;
const DB_URL = process.env.DB_URL;
const JWT_SECRET = process.env.JWT_SECRET;

console.log("Server running on port:", PORT);

Why use .env?

  • Keeps sensitive data out of your code
  • Makes it easy to change configuration without modifying code
  • Allows different settings for development, testing, and production

Never commit your .env file to version control (like Git). Add it to .gitignore:

# .gitignore
.env

We will talk about this .gitignore file soon in this chapter.

SOLID principles

There is no need to go too deep into SOLID principles right now. I just want to give you a sneak peek; whatever is written in this chapter is sufficient as far as this project is concerned.

SOLID consists of five principles. It is a very detailed topic with a lot of what we call design patterns, but we do not need them right now. We will only take a small look.

SOLID principles ensure that our code follows best practices and is written in a clean and maintainable way.

SOLID Design Principles

S. Single Responsibility Principle

A class should have one and only one reason to change, meaning it should have only one job.

O. Open Closed Principle

Software entities should be open for extension but closed for modification.

L. Liskov Substitution Principle

Subclasses should be substitutable for their base classes without breaking behavior.

I. Interface Segregation Principle

Clients should not be forced to depend on interfaces they do not use.

D. Dependency Inversion Principle

Depend on abstractions, not on concrete implementations.

This might seem like a lot right now, but just remember that these principles exist. They are rules you should follow to write good code.

For this fairly simple project, we will mainly use the Single Responsibility Principle and the Open Closed Principle.

The others are also practical but a bit less visible in small projects. What these principles basically mean is that our code should be scalable. We should not need to change old code just to add a new feature. We should not have to trace back previous features and read old parts of the codebase every time we want to add something new.

If we want to scale a business, we cannot afford to constantly touch old code instead of adding new features.

One class having one responsibility can be confusing in this project because we will not strictly use object-oriented or struct-based programming like in Rust. I personally see this principle as analogous to folders in a JavaScript project, where one folder exists for one and only one purpose.

You can study these design patterns later. It is not like you need to master them before building this project. They are just an overview of why we will write code in a certain way.

Understanding the Building Process of an API

In this section, we will understand the essential concepts behind how an API works.

We will cover:

  • Different types of functions
  • Different types of APIs
  • REST APIs
  • Request–response cycle and status codes

The two primary building blocks of an application

At its core, every application is built using two things:

  • Data
  • Functions

Data is information that exists and is usually stored in a structured format like JSON.

Functions (or methods, if you think in terms of object-oriented programming) are pieces of code that operate on this data.

How an API works

Let’s look at the basic workflow of an API:

  1. We set up an application
  2. We run it on a specific port
  3. A client sends a request to that port (the request contains a packet of data)
  4. The server receives the request and runs the appropriate code for that specific request
  5. The server sends back a response along with a status code

This request–response cycle is the foundation of backend development.

Understanding status codes

A status code is simply a number attached to the response that tells the client what happened to the request.

It might sound complicated, but a status code is just a label that describes the outcome of the request.

It is the developer’s responsibility to choose the correct status code.

For example:

  • If a user tries to access an admin-only route without proper authorization
  • Or sends a request that requires authentication without being logged in

The server can respond with an appropriate status code.
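
To make this concrete, here is a minimal sketch of a route choosing its status code. It assumes a hypothetical req.user object set by an authentication middleware; the route path and the isAdmin field are just illustrative names.

app.get("/admin", (req, res) => {
  // req.user is assumed to be attached by an authentication middleware
  if (!req.user) {
    return res.status(401).json({ message: "Not authenticated" });
  }

  if (!req.user.isAdmin) {
    return res.status(403).json({ message: "Not allowed" });
  }

  res.status(200).json({ message: "Welcome, admin" });
});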

Common HTTP Status Codes

| Status Code | Meaning | When it is used |
| --- | --- | --- |
| 200 | OK | Request was successful |
| 201 | Created | A resource was successfully created |
| 400 | Bad Request | Invalid or malformed request data |
| 401 | Unauthorized | User is not authenticated |
| 403 | Forbidden | User is authenticated but not allowed |
| 404 | Not Found | Requested resource does not exist |
| 500 | Internal Server Error | Something went wrong on the server |

This is the basic API workflow.
As we start writing actual code, these concepts will become much clearer and more intuitive.

What is a REST API?

REST stands for Representational State Transfer.

A REST API is a way of designing APIs where:

  • Each URL represents a resource
  • The action you want to perform is defined by the HTTP method
  • Data is usually sent and received in JSON format

You can think of a REST API as a set of rules for how clients and servers should communicate.

For example:

  • /users represents users
  • /messages represents messages

Instead of creating different URLs for every action, REST uses HTTP methods to decide what should happen.

This makes APIs predictable, scalable, and easy to understand.

HTTP Methods

HTTP methods define what action you want to perform on a resource.

The most commonly used methods are:

  • GET → Fetch data
  • POST → Create new data
  • PUT → Update existing data
  • DELETE → Remove data

For example:

  • GET /messages → get all messages
  • POST /messages → create a new message
  • PUT /messages/:id → update a message
  • DELETE /messages/:id → delete a message

This separation of resource and action is what makes REST APIs clean and structured.
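
As a rough sketch (not part of our project yet), here is how those four message endpoints could be wired up with an Express router. The handler bodies are placeholders just to show the mapping between methods and actions.

import express from "express";

const router = express.Router();

// GET /messages → get all messages
router.get("/", (req, res) => res.status(200).json({ messages: [] }));

// POST /messages → create a new message
router.post("/", (req, res) => res.status(201).json({ created: true }));

// PUT /messages/:id → update a message
router.put("/:id", (req, res) => res.status(200).json({ updated: req.params.id }));

// DELETE /messages/:id → delete a message
router.delete("/:id", (req, res) => res.status(200).json({ deleted: req.params.id }));

export default router;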

Writing your first API endpoint in Express

Now let’s write a very simple API endpoint.

Inside src/index.js:

import express from "express";

const app = express();
const PORT = 3000;

app.get("/", (req, res) => {
  res.status(200).json({
    message: "Server is running"
  });
});

app.listen(PORT, () => {
  console.log(`Server running on port ${PORT}`);
});

What is happening here:

  • We create an Express app
  • We define a route using GET
  • When a request hits /, the server sends a JSON response
  • The server listens on a specific port

Testing your API (using Postman or the browser)

Now that the server is running, we need a way to test it.

When we say “hitting a URL”, what we really mean is sending an HTTP request to our server.

Since our server is listening on port 3000, the full URL becomes:

http://localhost:3000/

Testing using a browser

The simplest way to test this endpoint is by opening a browser and visiting the URL above.

When you do this:

  • The browser sends a GET request to /
  • The server receives the request
  • The server responds with JSON

You should see something like:

{
  "message": "Server is running"
}


This confirms that:

  • Your server is running
  • Your route is working
  • Your request–response cycle is functioning correctly
Testing using Postman

While the browser works for simple GET requests, real backend development requires better tools. This is where Postman comes in.

Postman is a tool that allows you to:

  • Send different types of HTTP requests (GET, POST, PUT, DELETE)
  • Add headers, body, and authentication
  • Inspect responses and status codes

To test this endpoint in Postman:

  1. Open Postman
  2. Select GET as the request type
  3. Enter the URL http://localhost:3000/
  4. Click Send

You should see:

  • Status code: 200 OK
  • Response body containing the JSON message

This is exactly what frontend applications do behind the scenes.

The only difference is that Postman lets you see and control everything manually.

As we build more endpoints, Postman will become one of your most important tools for testing and debugging your backend.
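
If you prefer the terminal, curl can send the same request. Assuming the server above is running on port 3000, the -i flag prints the status code and headers along with the JSON body:

curl -i http://localhost:3000/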

Connecting routes to controllers

As your application grows, writing all logic inside index.js becomes messy and hard to maintain.

To follow good practices (and the SOLID principles discussed earlier), we separate concerns.

A common pattern is:

  • Routes → define endpoints
  • Controllers → contain business logic

Example folder structure:

src/
├── controllers/
│   └── user.controller.js
├── routes/
│   └── user.routes.js
├── index.js

Route file (user.routes.js):

import express from "express";
import { getUsers } from "../controllers/user.controller.js";

const router = express.Router();

router.get("/", getUsers);

export default router;


Controller file (user.controller.js):

export const getUsers = (req, res) => {
  res.status(200).json({
    users: []
  });
};

Connecting routes in index.js:

import userRoutes from "./routes/user.routes.js";
app.use("/users", userRoutes);

Now:

  • /users is handled by the route file
  • The route calls a controller function
  • The controller sends the response

This structure makes your application:

  • Easier to read
  • Easier to scale
  • Easier to debug

As we continue building the project, this separation will become very natural.

Middleware

Middleware is one of the most important concepts in backend development.

A middleware is a function that runs between the request and the response.

Think of middleware as a checkpoint.

When a request comes in:

  1. The request hits the server
  2. Middleware runs
  3. The request reaches the route handler
  4. A response is sent back

Middleware can:

  • Modify the request
  • Stop the request
  • Pass the request forward

Example middleware:

app.use((req, res, next) => {
  console.log("Request received");
  next();
});


If next() is not called, the request will never reach the route.

Common uses of middleware:

  • Authentication
  • Logging
  • Parsing JSON
  • Error handling

Authentication Flow

Authentication is the process of verifying who a user is.

A typical authentication flow looks like this:

  1. User sends login credentials (email, password)
  2. Server verifies the credentials
  3. Server generates a token
  4. Token is sent back to the client
  5. Client sends the token with future requests
  6. Server verifies the token before allowing access

This allows the server to know who is making the request without asking for credentials every time.

Authentication logic is usually implemented using middleware.
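
Here is a minimal sketch of what such a middleware could look like, assuming the token lives in a cookie named jwt and is signed with a JWT_SECRET environment variable. It mirrors the protectRoute middleware we will use later in this project, but treat it as an illustration rather than the final code.

import jwt from "jsonwebtoken";
import User from "../models/user.model.js";

export const protectRoute = async (req, res, next) => {
  try {
    // read the token from the cookie set at login
    const token = req.cookies.jwt;
    if (!token) {
      return res.status(401).json({ message: "Not authenticated" });
    }

    // verify the signature and decode the payload
    const decoded = jwt.verify(token, process.env.JWT_SECRET);

    // attach the user to the request for later handlers
    req.user = await User.findById(decoded.userId).select("-password");
    next();
  } catch (error) {
    res.status(401).json({ message: "Invalid or expired token" });
  }
};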

Passing Parameters and Query Strings

When building APIs, you often need to send extra information along with a request.

Express supports two main ways to do this:

  1. Route Parameters (Params)
  2. Query Parameters (Query Strings)

Both are used for different purposes, and understanding the difference is very important.

Route Parameters (Params)

Route parameters are part of the URL path itself.
They are commonly used to identify a specific resource, such as a user, message, or post.
Example route:

GET /users/:id

Here, :id is a route parameter.

Defining a route with params

app.get("/users/:id", (req, res) => {
  const userId = req.params.id;

  res.status(200).json({
    userId
  });
});

Example request

GET /users/42

What happens

  • 42 gets captured by :id
  • You access it using req.params.id
req.params // { id: "42" }


Examples:

  • GET /messages/10
  • DELETE /users/5
  • PUT /posts/99

Query Parameters (Query Strings)

Query parameters are optional key–value pairs added after ? in the URL.
They are usually used for:

  • Filtering
  • Searching
  • Pagination
  • Sorting

Example URL:

GET /messages?limit=10&page=2

Accessing query parameters

app.get("/messages", (req, res) => {
  const { limit, page } = req.query;

  res.status(200).json({
    limit,
    page
  });
});

What Express receives

req.query // { limit: "10", page: "2" }

Query values are always strings by default.
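
Because of that, it is common to convert them before using them, for example for pagination. A minimal sketch (the default values here are just examples):

app.get("/messages", (req, res) => {
  // query values arrive as strings, so convert them to numbers
  const limit = parseInt(req.query.limit) || 10;
  const page = parseInt(req.query.page) || 1;

  res.status(200).json({ limit, page });
});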

JWT (JSON Web Token) — Explained Intuitively

JWT stands for JSON Web Token.
A JWT is a string that represents a user’s identity.
You can think of it like a digital ID card.
Once the user logs in:

  • The server creates a token
  • The token contains encoded information (like user ID)
  • The token is signed so it cannot be tampered with

On future requests:

  • The client sends the token
  • The server verifies it
  • If valid, the request is allowed

The server does not store session data.

Everything needed is inside the token itself.

This makes JWT-based authentication:

  • Stateless
  • Scalable
  • Fast
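
To make this concrete, here is a minimal sketch of signing and verifying a token with the jsonwebtoken package, assuming a JWT_SECRET environment variable. Later we will wrap this in a small helper.

import jwt from "jsonwebtoken";

// create a token that encodes the user's id and expires in 7 days
const token = jwt.sign({ userId: "123" }, process.env.JWT_SECRET, { expiresIn: "7d" });

// later, verify the token and read the payload back
const decoded = jwt.verify(token, process.env.JWT_SECRET);
console.log(decoded.userId); // "123"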

Database Models with Mongoose

A database stores your application’s data.

Since we are using MongoDB, we use Mongoose to interact with the database.

MongoDB : MongoDB is a noSQL (not only SQL) Database that stores data in form of document that are of json format
Schema : The format of the data to be stored

Mongoose allows us to:

  • Define the shape of our data
  • Enforce rules and validations
  • Interact with the database using JavaScript

A model represents a collection in the database.

Example user model:

import mongoose from "mongoose";

const userSchema = new mongoose.Schema({
  username: {
    type: String,
    required: true
  },
  email: {
    type: String,
    required: true,
    unique: true
  },
  password: {
    type: String,
    required: true
  }
});

export const User = mongoose.model("User", userSchema);

This model defines:

  • What fields a user has
  • What type of data is allowed
  • What rules must be followed

Models are used inside controllers to:

  • Create data
  • Read data
  • Update data
  • Delete data

This is how your API connects logic to persistent storage.
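
As a rough sketch of what that looks like inside a controller, these are standard Mongoose model methods. The hashedPassword variable is an assumption here; it would come from a hashing step covered later.

// Create
const user = await User.create({
  username: "john_doe",
  email: "john@example.com",
  password: hashedPassword, // assumed to be a hash, never plain text
});

// Read
const found = await User.findOne({ email: "john@example.com" });

// Update
await User.findByIdAndUpdate(user._id, { username: "john_updated" });

// Delete
await User.findByIdAndDelete(user._id);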

Setup Process

Initializing a project

To start a new Node.js project, open your terminal in an empty folder and run:

npm init -y

This command initializes a new Node.js project.
After running it, a file called package.json will appear in your project directory.

What is package.json?

I like to think of package.json as the ingredient list for your project.

It contains all the information required for your application to run.

If this file does not exist, your Node.js application cannot properly run or manage its dependencies.

package.json stores:

  • Project information
  • Scripts to run your application
  • Dependencies your project relies on

Example package.json

Below is a dummy package.json file with some example dependencies:

{
  "name": "simple-node-backend",
  "version": "1.0.0",
  "description": "A simple backend project using Node.js and Express",
  "main": "index.js",
  "scripts": {
    "start": "node index.js",
    "dev": "nodemon index.js"
  },
  "author": "Your Name",
  "license": "MIT",
  "dependencies": {
    "express": "^4.19.0",
    "cors": "^2.8.5"
  },
  "devDependencies": {
    "nodemon": "^3.0.0"
  }
}

Understanding package.json

There are a few important things to understand here.

1. Metadata

Metadata includes information about your project such as:

  • Project name
  • Version
  • Description
  • Author
  • License
  • Entry file (main)

This information helps tools, developers, and deployment platforms understand your project.

1.1 Scripts

Scripts are shortcuts for running commands.

Earlier, you ran npm init, which is a command.

In real projects, some commands can become long and repetitive.

Instead of typing the full command every time, you can define it once in the scripts section and give it a name.

For example:

"scripts": {
  "start": "node index.js"
}

Now, instead of writing the full command, you can simply run:

npm run start

This makes working with your project much easier and cleaner.

2. Dependencies

Dependencies are external code that your project depends on.

At the very beginning, your project only has a package.json file. You can already write and run JavaScript code at this point.

However, when building real applications, a lot of functionality is repetitive across projects.

For example:

  • Authentication
  • Input validation
  • Security headers
  • Request handling

These problems have already been solved by other developers.
Does that mean you need to learn cybersecurity from scratch just to add authentication?

No.
Instead, developers share reusable code as packages.
This is where npm comes in.

What is npm?

npm stands for Node Package Manager.

It is a registry and a tool where developers publish packages that other developers can reuse. These packages are available on the npm website and can be installed into your project.

When you install a package, it gets added to your dependencies, and your project can now use that code.

Installing packages with npm

We can very easily install a dependency package and bring it into our local project.

Each package has an official page on the npm website where you can:

  • Read documentation
  • See usage examples
  • Find the installation command

For example, to install Express, you run:

npm install express

After running this command:

  • The package is downloaded to your local machine
  • It is placed inside a folder called node_modules
  • The package name is added to package.json under dependencies

This means your project now depends on Express.

What is node_modules?

node_modules is a folder that contains all the installed dependencies for your project.

It includes:

  • The packages you directly installed
  • The dependencies of those packages
  • The dependencies of those dependencies, and so on

This folder can become very large because modern applications rely on many small packages. You should never manually edit anything inside node_modules.

What if node_modules is deleted?

Deleting node_modules is completely safe.
In fact, many developers delete it intentionally to:

  • Fix dependency issues
  • Reduce project size
  • Reset the project environment

If node_modules is deleted, your project will not run because the dependencies are missing.
To restore it, simply run:

npm install

npm will read the package.json file and reinstall all required dependencies automatically.
This is why package.json is so important.

How dependencies are added

There are two common ways to add dependencies to a project.

1. Installing directly using npm

You can install a package by running:

npm install express

npm will automatically:

  • Download the package
  • Add it to node_modules
  • Update package.json with the dependency

This is the most common and recommended approach.

2. Adding the dependency manually

You can also manually add a dependency name to package.json, like this:
"dependencies": {
  "express": "^4.19.0"
}

After doing this, you must run:

npm install

npm will read the updated package.json and install the missing dependencies into node_modules.
Both approaches achieve the same result.

Socket IO

Socket IO is a library that enables real-time, bidirectional communication between the server and clients (like web browsers).

Unlike traditional HTTP requests where the client asks and the server responds once, Socket IO creates a persistent connection that stays open. This allows the server to push updates to clients instantly without waiting for requests.

Why Socket IO Exists

HTTP is request-response based—great for fetching data, but terrible for real-time features. Imagine a chat app: constantly polling "any new messages?" wastes resources and feels laggy.

Socket IO solves this with WebSockets (the technology behind persistent connections) plus smart fallbacks. The server can send messages to specific clients, rooms, or everyone instantly.

Common use cases:

  • Chat applications
  • Live notifications
  • Collaborative editing (Google Docs)
  • Gaming
  • Live dashboards

Socket IO vs HTTP

| Feature | HTTP (Express) | Socket IO |
| --- | --- | --- |
| Connection | Short-lived (request → response) | Persistent (always connected) |
| Direction | Client → Server → Client | Both directions, anytime |
| Real-time | No (polling needed) | Yes (instant) |
| Use case | Fetch data, forms | Live updates, chat |

Socket IO sits on top of Express—you still use HTTP for login/register, but Socket IO handles live messaging.

How Socket IO Works

  1. Client connects to server (handshake)
  2. Persistent connection established
  3. Client emits events → server receives instantly
  4. Server emits events → client receives instantly
  5. Connection stays alive until closed

Think of it like a phone call vs texting:

  • HTTP = Send text, wait for reply
  • Socket IO = Phone call, talk anytime

Core Concepts

Events: Socket IO uses named events like radio stations. Anyone "tuned" to that event receives messages.

// Client sends "join_room"
socket.emit("join_room", { room: "general" });

// Server receives and responds
socket.on("join_room", (data) => {
  socket.join(data.room);
  socket.emit("joined", "Welcome!");
});


Rooms: Group clients logically (chat rooms, game lobbies).

Namespaces: Separate Socket IO servers on same port (chat vs notifications).
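
Here is a minimal sketch of how rooms are used on the server side; the event names and room value are just examples.

// put the connected socket into a room, then broadcast to that room only
socket.on("join_room", (room) => {
  socket.join(room);
  io.to(room).emit("system_message", `A user joined ${room}`);
});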

Socket IO with Express (Integration)

Socket IO needs an HTTP server (not just an Express app). Here is the exact pattern we will use in our lib/socket.js:

import { Server } from "socket.io";
import http from "http";
import express from "express";

const app = express();  // Express app
const server = http.createServer(app);  // HTTP server wrapping Express
const io = new Server(server, {     // Socket IO attaches to HTTP server
  cors: {
    origin: "http://localhost:5173",  // Your frontend
    credentials: true
  }
});

export { io, app, server };  // Export all three!


Why this structure?

  • app → Your Express routes/middleware
  • server → Listens on port (not app.listen())
  • io → Socket IO instance

In index.js, import and use:

import { app, server } from "./lib/socket.js";
server.listen(PORT, () => { ... });  // server, not app!

Client-Side Setup

Frontend connects automatically:

// In React/Vite (port 5173)
import { io } from "socket.io-client";

const socket = io("http://localhost:3000");  // Backend URL

socket.on("message", (data) => {
  console.log("New message:", data);
});

socket.emit("send_message", { text: "Hello!" });

Chat App Example (Your Project)

For your chat backend:

  1. User authenticates via HTTP → gets JWT
  2. Socket connects → sends JWT for verification
  3. Join room → socket.join(user.room)
  4. Send message → io.to(room).emit("new_message", msg)
  5. Everyone in room receives instantly

// Server-side (middleware/auth example)
io.use((socket, next) => {
  const token = socket.handshake.auth.token;
  // Verify JWT...
  next();
});

socket.on("send_message", async (data) => {
  // Save to MongoDB
  const message = new Message({ text: data.text, userId: user.id });
  await message.save();

  // Broadcast to room
  io.to(data.room).emit("new_message", message);
});


Key Benefits for Your Chat App

  • Real-time: Messages appear instantly
  • Rooms: Private/group chats
  • Scalable: Works with your JWT/Mongoose setup
  • Fallbacks: Works on old browsers/networks
  • Typed events: Clean like Express routes

Socket IO handles connection management, heartbeats, and reconnection automatically. You focus on chat logic.

Making an App

At this point, we can do two main things:

  • Initialize an application
  • Install packages

With just these two steps, we can build entire backend applications.

Creating the server with Express

We have already discussed what Express is. Since we want our application to run on a port and accept requests, we need to create a server using Express.

First, install Express:

npm install express

Project structure

Now create the following structure:

project-root/
├── src/
│   └── index.js
├── package.json
└── node_modules/

Updating package.json

Next, update the scripts field in your package.json:

"scripts": {
  "dev": "node src/index.js"
}

This ensures that when you run:

npm run dev

the index.js file inside the src folder is executed.

Congratulations
You have now created the foundation of your backend application.

Organizing your code

You can technically put all your code inside index.js and call it a day.

But this is a bad practice.

As discussed earlier in the SOLID principles section, code that is not organized becomes:

  • Hard to scale
  • Hard to maintain
  • Hard to understand as the project grows

Instead of writing everything in one file, we split our code into multiple files and folders.

Each file or folder has:

  • A single responsibility
  • One type of functionality

We will discuss this structure in detail later.
For now, just understand this important idea:

Even though the code is split across many files, everything eventually connects back to index.js.
index.js acts as the entry point of your application.
There are some exceptions, such as special scripts like database seeding (do not worry, we will cover what seeding is later). Apart from those cases, almost all application logic can be traced back to index.js.

This approach makes development easier, cleaner, and scalable as your application grows.


Using ES Modules (type: "module")

By default, Node.js uses CommonJS (require). Modern JavaScript uses ES Modules (import / export).
To enable ES Modules, add this to your package.json:

{
  "type": "module"
}

This tells Node.js to treat all .js files as ES Modules.
Now you can write:

import express from "express";

instead of:

const express = require("express");

Most modern Node.js projects use this approach, and we will use it throughout this project.

Starting Building Our Application

Finally, we can start working on our application.

For this blog, our app will be a small chat application backend with authentication. We will understand a lot of concepts while building this project, so stay tuned.

First, we will install all the required dependencies. At this stage, it is completely okay if you do not understand what every package does. Some of these might already sound familiar, and others might not. We will cover each of them later when we actually use them.

Installing dependencies

You have two options.

Option 1: Copy package.json

You can copy and paste the following into your package.json file and then run:

{
  "name": "backend",
  "version": "1.0.0",
  "description": "",
  "main": "src/index.js",
  "scripts": {
    "dev": "nodemon src/index.js",
    "start": "node src/index.js"
  },
  "keywords": [],
  "author": "",
  "type": "module",
  "license": "ISC",
  "dependencies": {
    "bcryptjs": "^2.4.3",
    "cloudinary": "^2.5.1",
    "cookie-parser": "^1.4.7",
    "cors": "^2.8.5",
    "dotenv": "^16.4.5",
    "express": "^4.21.1",
    "jsonwebtoken": "^9.0.2",
    "mongoose": "^8.8.1",
    "socket.io": "^4.8.1"
  },
  "devDependencies": {
    "nodemon": "^3.1.7"
  }
}

Then run:

npm install

Option 2: Install everything using npm

You can also install all dependencies directly using npm:

npm install express mongoose cors dotenv cookie-parser bcryptjs jsonwebtoken cloudinary socket.io
npm install -D nodemon

Do not worry about what each package does right now. We will understand them one by one as we build the application.

Dev Dependencies

Dev dependencies are packages that are only needed during development, not in production.

A common example is nodemon.

nodemon automatically restarts your server whenever you change a file. Without it, you would have to stop and restart the server manually every time you make a change.

This makes development faster and more convenient.

That is why nodemon is added under devDependencies.

Building the project

backend/
└── src/
    ├── controllers/
    ├── lib/
    ├── middleware/
    ├── models/
    ├── routes/
    ├── seeds/
    └── index.js

Folder & File Responsibilities

| Folder / File | Purpose |
| --- | --- |
| index.js | Entry point; sets up middleware, routes, and server |
| routes/ | Defines API endpoints and maps URLs to controllers |
| controllers/ | Handles business logic and communicates with models |
| models/ | Defines database schemas and handles DB operations |
| middleware/ | Runs pre-controller logic like auth and validation |
| lib/ | Stores shared utilities and helper functions |
| seeds/ | Inserts dummy/test data; run manually for development |

index.js

// =========================================
// IMPORTS
import express from "express";
import dotenv from "dotenv";
import cookieParser from "cookie-parser";
import cors from "cors";
import path from "path";
import { connectDB } from "./lib/db.js";
import authRoutes from "./routes/auth.routes.js";
import messageRoutes from "./routes/message.routes.js";
import { app, server } from "./lib/socket.js";
// END - IMPORTS
// =========================================

// =========================================
// DOTENV SETUP
// load .env before reading any process.env values below
dotenv.config();
// END - DOTENV SETUP
// =========================================

// =========================================
// CONSTANTS
const PORT = process.env.PORT;
const __dirname = path.resolve();
// END - CONSTANTS
// =========================================

// main function
function main () {

  // =========================================
  // MIDDLEWARE

  // to get json
  app.use(express.json());
  // END - to get json

  // to parse cookies
  app.use(cookieParser());
  // END - to parse cookies

  // to share resources with the frontend
  app.use(
    cors({
      origin: "http://localhost:5173",
      credentials: true,
    })
  );
  // END - to share resources with the frontend

  // ROUTES MIDDLEWARE
  app.use("/api/auth", authRoutes);
  app.use("/api/messages", messageRoutes);
  // END - ROUTES MIDDLEWARE

  // END - MIDDLEWARE
  // =========================================


  // =========================================
  if (process.env.NODE_ENV === "production") {
    app.use(express.static(path.join(__dirname, "../frontend/dist")));

    app.get("*", (req, res) => {
      res.sendFile(path.join(__dirname, "../frontend", "dist", "index.html"));
    });
  }
  // =========================================

  // SERVER LISTEN
  server.listen(PORT, () => {
    console.log("server is running on PORT:" + PORT);
    connectDB();
  });
  // END - SERVER LISTEN


}

main();
// END - main function

Core Constants and Setup

const PORT = process.env.PORT; const __dirname = path.resolve();

These lines define essential constants. PORT pulls the server port from environment variables for flexibility across environments. path.resolve() without arguments returns the absolute path to the current working directory (e.g., C:\Users\Shourya\projects\chat-backend), serving as the base path for file operations.

The main() function wrapper is optional but organizes startup logic cleanly. All middleware, routes, and server initialization execute sequentially when main() runs.

dotenv.config() loads the values from your .env file into process.env. It must run before any process.env value is read, which is why it sits above the constants.

Production Environment: Serving Frontend Static Files


  if (process.env.NODE_ENV === "production") {
    app.use(express.static(path.join(__dirname, "../frontend/dist")));

    app.get("*", (req, res) => {
      res.sendFile(path.join(__dirname, "../frontend", "dist", "index.html"));
    });
  }

This block enables a single-backend deployment strategy, eliminating the need for separate frontend and backend servers in production.

How it works:

  1. Static File Serving: express.static(path.join(__dirname, "../frontend/dist")) serves all files from the frontend's dist folder (built Vite/React output) as static assets. This handles CSS, JS bundles, images, etc. directly from the backend.
  2. Single Page Application (SPA) Routing: The app.get("*", ...) catch-all route serves index.html for any unmatched path. This supports client-side routing in React Router or Vite, where the frontend handles paths like /chat/:id without backend route conflicts.

Why only in production?

  • Development: NODE_ENV !== "production" skips this block. Vite dev server (localhost:5173) handles frontend separately with hot reloading.
  • Production: NODE_ENV === "production" activates static serving for a streamlined deployment. Build your frontend (npm run build), place dist in ../frontend/dist, and the backend serves everything from one port.

Path Resolution Example:

  • Backend: chat-backend/index.js
  • Frontend build: chat-backend/../frontend/dist resolves to frontend/dist
  • Result: Single server at yourdomain.com serves API (/api/*) and frontend.

This creates a production-ready, unified deployment while preserving development workflow flexibility.

Triggering Server and Adding Routes

One thing you may notice is that there is no app = express() initialization in the main entry file. That’s because the Express app is initialized inside the socket setup file and then imported wherever it’s needed. We’ll fully understand this when we reach the socket setup section, but for now, just know that app is being imported from the socket file, not created again.

Here’s a quick sneak peek to give you a clear picture:

import { Server } from "socket.io";
import http from "http";
import express from "express";

const app = express();
const server = http.createServer(app);
const io = new Server(server, {
  cors: {
    origin: ["http://localhost:5173"],
  },
});

// .............
// OTHER LOGIC WE WILL SEE HERE
// ............

export { io, app, server };


In this file:

  • We initialize the Express app
  • Create an HTTP server using that app
  • Create a SocketIO server on top of the HTTP server
  • Finally, we export app, server, and io so they can be reused across the application

This allows both HTTP requests (REST APIs) and WebSocket connections to run on the same server.

Middlewares and Routes

Below is where we configure our middlewares and routes on the imported app instance.

app.use(express.json());
app.use(cookieParser());
  • express.json() parses incoming JSON request bodies and makes the data available on req.body
  • cookieParser() parses cookies from incoming requests and attaches them to req.cookies
app.use(
  cors({
    origin: "http://localhost:5173",
    credentials: true,
  })
);

This enables CORS (Cross-Origin Resource Sharing), allowing the frontend (running on http://localhost:5173) to communicate with the backend.

The credentials: true option allows cookies and authentication headers to be sent along with requests.

app.use("/api/auth", authRoutes);
app.use("/api/messages", messageRoutes);


Here, we import route modules and mount them on specific base paths:

  • All authentication-related endpoints are handled under /api/auth
  • All message-related endpoints are handled under /api/messages

routes/

Routes are responsible for defining endpoints only.

They do not contain business logic. Their job is simply to connect:

  • an HTTP method
  • a URL path
  • optional middleware
  • the correct controller function

routes/auth.routes.js

import express from "express";
import {
  checkAuth,
  login,
  logout,
  signup,
  updateProfile
} from "../controllers/auth.controller.js";
import { protectRoute } from "../middleware/auth.middleware.js";

const router = express.Router();

router.post("/signup", signup);
router.post("/login", login);
router.post("/logout", logout);
router.put("/update-profile", protectRoute, updateProfile);
router.get("/check", protectRoute, checkAuth);

export default router;


There is very little logic inside a routes file, and that’s intentional.

In this file we only do four things:

  1. Choose the HTTP method

    (GET, POST, PUT, etc.)

  2. Define the route path

    (/signup, /login, /update-profile, etc.)

  3. Attach the controller

    The controller contains the actual logic (database calls, validation, responses).

  4. Attach middleware when needed

    protectRoute is used on routes that require authentication, such as checking auth status or updating the user profile.

This keeps routing clean and predictable, while controllers handle the heavy work.

routes/message.routes.js

// IMPORT
import express from "express";
import { protectRoute } from "../middleware/auth.middleware.js";
import {
  getMessages,
  getUsersForSidebar,
  sendMessage
} from "../controllers/message.controller.js";
// END - IMPORT

const router = express.Router();

router.get("/users", protectRoute, getUsersForSidebar);
router.get("/:id", protectRoute, getMessages);
router.post("/send/:id", protectRoute, sendMessage);

export default router;


This routes file follows the exact same pattern:

  • Every route is protected using protectRoute because messaging requires an authenticated user
  • The route file does not care how messages are fetched or sent
  • It simply maps URLs to controllers

For example:

  • GET /users → fetch users for the sidebar
  • GET /:id → fetch messages for a specific user
  • POST /send/:id → send a message to a specific user

models/

The models layer defines how our data looks and how it is stored in the database.

In this project, models are written using Mongoose, which is an ODM (Object Data Modeling) library for MongoDB.

MongoDB is a NoSQL, document-based database, meaning it stores data as JSON-like objects instead of tables and rows.

Mongoose sits on top of MongoDB and lets us:

  • Define a schema (structure of the data)
  • Add validation rules
  • Create models that we use to interact with the database

Models are the blueprint of your database. They define what data exists, how it is validated, and how documents relate to each other, while keeping the rest of the application clean and maintainable.

There is very little logic inside models. Their purpose is to describe data, not control application flow.

MongoDB supports many advanced features, but our project keeps things simple and clean, using only what we need.

models/message.model.js

import mongoose from "mongoose";

const messageSchema = new mongoose.Schema(
  {
    senderId: {
      type: mongoose.Schema.Types.ObjectId,
      ref: "User",
      required: true,
    },
    receiverId: {
      type: mongoose.Schema.Types.ObjectId,
      ref: "User",
      required: true,
    },
    text: {
      type: String,
    },
    image: {
      type: String,
    },
  },
  { timestamps: true }
);

const Message = mongoose.model("Message", messageSchema);

export default Message;

This schema defines the structure of a chat message.

  • senderId

    Stores the ID of the user who sent the message.

    It references the User model using MongoDB’s ObjectId.

  • receiverId

    Stores the ID of the user receiving the message.

    This also references the User model.

  • text

    Holds the message text (optional).

  • image

    Stores an image URL if the message contains an image (optional).

  • timestamps: true

    Automatically adds createdAt and updatedAt fields to each document.

Finally, we convert the schema into a Mongoose model, which allows us to:

  • Create messages
  • Read messages
  • Update or delete messages

models/user.model.js

import mongoose from "mongoose";

const userSchema = new mongoose.Schema(
  {
    email: {
      type: String,
      required: true,
      unique: true,
    },
    fullName: {
      type: String,
      required: true,
    },
    password: {
      type: String,
      required: true,
      minlength: 6,
    },
    profilePic: {
      type: String,
      default: "",
    },
  },
  { timestamps: true }
);

const User = mongoose.model("User", userSchema);

export default User;

This schema defines the structure of a user document.

  • email → Must be unique and is required for authentication.
  • fullName → Stores the user’s display name.
  • password → Stores the hashed password (never plain text). A minimum length of 6 characters is enforced. See the hashing sketch below.
  • profilePic → Stores the profile image URL, defaulting to an empty string.
  • timestamps: true → Automatically tracks when the user was created and last updated.
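
The hashing itself is typically done in the auth controller; the model only stores the resulting hash. Here is a minimal sketch using bcryptjs (already in our dependencies), where password is assumed to be the plain-text value from the request and user a document loaded from the database:

import bcrypt from "bcryptjs";

// before saving a new user, hash the plain-text password
const salt = await bcrypt.genSalt(10);
const hashedPassword = await bcrypt.hash(password, salt);

// at login, compare the submitted password with the stored hash
const isMatch = await bcrypt.compare(password, user.password);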

/lib

The /lib folder is where we set up and configure third-party libraries.

Most of the code here is boilerplate, meaning it’s code you don’t invent yourself—you usually copy it from official documentation and adjust environment variables.

The idea is simple:

  • Configure a library once
  • Export it or helper functions
  • Reuse it anywhere in the app

There is almost no business logic in this folder—only setup and small utility helpers.

lib/cloudinary.js

import { v2 as cloudinary } from "cloudinary";
import { config } from "dotenv";

config();

cloudinary.config({
  cloud_name: process.env.CLOUDINARY_CLOUD_NAME,
  api_key: process.env.CLOUDINARY_API_KEY,
  api_secret: process.env.CLOUDINARY_API_SECRET,
});

export default cloudinary;


This file configures Cloudinary, a service used to store images in the cloud.

What’s happening here:

  • We import the Cloudinary SDK
  • Load environment variables using dotenv
  • Pass Cloudinary credentials from .env
  • Export the configured Cloudinary instance

Once this is done, we can use Cloudinary anywhere in the app without reconfiguring it.

Example usage from message logic:

if (image) {
  const uploadResponse = await cloudinary.uploader.upload(image);
  imageUrl = uploadResponse.secure_url;
}


Here:

  • The image is uploaded to Cloudinary
  • Cloudinary returns metadata about the upload
  • We store secure_url, which is the hosted image URL

lib/db.js

This file is responsible for connecting the backend to MongoDB.

import mongoose from "mongoose";

export const connectDB = async () => {
  try {
    const conn = await mongoose.connect(process.env.MONGO_URL);
    console.log(`mongoose connected ${conn.connection.host}`);
  } catch (error) {
    console.log("mongodb connection error", error);
  }
};


What this does:

  • Uses Mongoose to connect to MongoDB
  • Reads the database URL from environment variables
  • Logs a success message when connected
  • Catches and logs errors if the connection fails

This function is called once when the server starts.

lib/generateToken.js

This file is a utility helper used to generate and send a JWT token.

import jwt from "jsonwebtoken";

export const generateToken = (userId, res) => {
  const token = jwt.sign(
    { userId },
    process.env.JWT_SECRET,
    { expiresIn: "7d" }
  );

  res.cookie("jwt", token, {
    maxAge: 7 * 24 * 60 * 60 * 1000,
    httpOnly: true,
    sameSite: "strict",
    secure: process.env.NODE_ENV !== "development",
  });

  return token;
};


What’s happening step by step:

  • We pass the user ID and a secret key
  • jwt.sign() converts the payload into a signed token
const token = jwt.sign(
  { userId },
  process.env.JWT_SECRET,
  { expiresIn: "7d" }
);

  • The token is stored inside an HTTP-only cookie
  • The cookie:

    • Expires after 7 days
    • Cannot be accessed by JavaScript (httpOnly)
    • Is protected against CSRF (sameSite: "strict")
    • Uses HTTPS in production
res.cookie("jwt", token, {
  maxAge: 7 * 24 * 60 * 60 * 1000,
  httpOnly: true,
  sameSite: "strict",
  secure: process.env.NODE_ENV !== "development",
});


lib/socket.js

This file is responsible for setting up real-time communication using Socket.IO and sharing the same Express app and HTTP server across the backend.

// IMPORT
import { Server } from "socket.io";
import http from "http";
import express from "express";
// END - IMPORT

// WORKFLOW
// CREATE THE SERVER AND ALLOW THE FRONTEND ORIGIN TO CONNECT
// THEN CREATE AN EMPTY MAP TO STORE ONLINE USERS IN
// WHEN A CLIENT CONNECTS, A CALLBACK RUNS
// THAT STORES THE ONLINE USER'S ID IN THE MAP
// WHEN A CLIENT DISCONNECTS, THEIR ENTRY IS REMOVED


// APP AND CREATING SERVER
const app = express();
const server = http.createServer(app);
const io = new Server(server, {
  cors: {
    origin: ["http://localhost:5173"],
  },
});
// END - APP AND CREATING SERVER

// TO GET THE RECEIVER SOCKET ID (PARAMETER : USER ID)
export function getReceiverSocketId(userId) {
  return userSocketMap[userId];
}
// END - TO GET THE RECEIVER SOCKET ID (PARAMETER : USER ID)

// MAP OF ONLINE USERS (userId -> socket.id)
const userSocketMap = {};
// END - MAP OF ONLINE USERS

// LISTEN FOR NEW SOCKET CONNECTIONS
io.on("connection", (socket) => {
  console.log("A user connected", socket.id);

  // READ THE USER ID FROM THE HANDSHAKE QUERY
  const userId = socket.handshake.query.userId;
  // IF THE USER ID EXISTS, MAP IT TO THIS SOCKET ID
  if (userId) userSocketMap[userId] = socket.id;
  // END - IF THE USER ID EXISTS, MAP IT TO THIS SOCKET ID
  // END - READ THE USER ID FROM THE HANDSHAKE QUERY

  // BROADCAST THE LIST OF ONLINE USERS TO ALL CLIENTS
  io.emit("getOnlineUsers", Object.keys(userSocketMap));
  // END - BROADCAST THE LIST OF ONLINE USERS TO ALL CLIENTS

  // WHEN A USER DISCONNECTS, REMOVE THEM AND BROADCAST THE UPDATED LIST
  socket.on("disconnect", () => {
    console.log("A user disconnected", socket.id);
    delete userSocketMap[userId];
    io.emit("getOnlineUsers", Object.keys(userSocketMap));
  });
  // END - WHEN A USER DISCONNECTS, REMOVE THEM AND BROADCAST THE UPDATED LIST

});
// END - LISTEN FOR NEW SOCKET CONNECTIONS


export { io, app, server };

Its responsibilities are:

  • Create and export the Express app
  • Create an HTTP server
  • Attach Socket.IO to that server
  • Track online users
  • Expose helpers to interact with sockets from anywhere in the app
// IMPORT
import { Server } from "socket.io";
import http from "http";
import express from "express";
// END - IMPORT

Here:

  • express creates the main app
  • http is used to create a raw HTTP server
  • socket.io needs access to the HTTP server to enable real-time connections

Creating the App and Server

const app = express();
const server = http.createServer(app);
const io = new Server(server, {
  cors: {
    origin: ["http://localhost:5173"],
  },
});

What’s happening here:

  • We create a single Express app
  • Wrap it inside an HTTP server
  • Attach Socket.IO to that server
  • Enable CORS so the frontend can connect

This is important:
Because the server is created here, we export app and server from this file and import them elsewhere (like index.js).
This ensures HTTP requests and WebSocket connections share the same server.
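A minimal sketch of how index.js can consume these exports (the route path and file names below are illustrative, not necessarily the exact ones from this project):

// index.js (sketch) — reuse the app and server created in lib/socket.js
import express from "express";
import cookieParser from "cookie-parser";
import { app, server } from "./lib/socket.js";
import authRoutes from "./routes/auth.route.js"; // illustrative route file

app.use(express.json());
app.use(cookieParser());
app.use("/api/auth", authRoutes); // routes are registered on the shared app

// listen on the HTTP server (not app.listen), so Socket.IO shares the same port
server.listen(5001, () => console.log("Server running on port 5001"));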

Tracking Online Users

const userSocketMap = {};

This object acts as an in-memory store:

  • Key → userId
  • Value → socket.id

Example:

{
  "123": "af8sdf9as8df",
  "456": "asd9f8as7df"
}

This lets us know:

  • Which users are online
  • Which socket belongs to which user

Handling Socket Connections

io.on("connection", (socket) => {
  console.log("A user connected", socket.id);

This callback runs every time a user connects via Socket.IO.

  • socket.id is a unique identifier for that connection

Handshake & User Identification

const userId = socket.handshake.query.userId;
if (userId) userSocketMap[userId] = socket.id;

Here:

  • The frontend sends userId during connection
  • We extract it from the handshake query
  • If it exists, we map:
  userId → socket.id

This is how we associate authenticated users with sockets.
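On the client side, that usually means passing the userId when the socket is created. A minimal sketch with socket.io-client (the backend URL and the authUser variable are assumptions):

// client-side sketch — send userId in the handshake query
import { io } from "socket.io-client";

const socket = io("http://localhost:5001", {
  query: { userId: authUser._id }, // assumes the logged-in user object is available
});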

Broadcasting Online Users

io.emit("getOnlineUsers", Object.keys(userSocketMap));

What this does:

  • Sends a list of currently online user IDs
  • Broadcasts it to all connected clients

This allows the frontend to:

  • Show online/offline status
  • Update presence in real time
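On the frontend, that typically looks like a simple listener on the same socket (a sketch, assuming the socket instance from the previous snippet):

// client-side sketch — react whenever the list of online user IDs is broadcast
socket.on("getOnlineUsers", (userIds) => {
  console.log("Online users:", userIds);
  // e.g. save userIds in state and mark matching contacts as online
});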

Handling Disconnects

socket.on("disconnect", () => {
  console.log("A user disconnected", socket.id);
  delete userSocketMap[userId];
  io.emit("getOnlineUsers", Object.keys(userSocketMap));
});

When a user disconnects:

  • Their socket entry is removed
  • The updated online users list is broadcast again

This keeps the online state always accurate.

Helper: Get Receiver Socket ID

export function getReceiverSocketId(userId) {
  return userSocketMap[userId];
}

This helper is used elsewhere (like message controllers):

  • Given a userId
  • Returns their active socket.id (if online)

This enables targeted real-time events, such as sending a message notification to a specific user.
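For example, a message controller could use it roughly like this (a sketch, not the project's exact sendMessage code):

// sketch — emit a new message only to the receiver's socket, if they are online
import { getReceiverSocketId, io } from "../lib/socket.js";

const receiverSocketId = getReceiverSocketId(recieverId); // recieverId comes from req.params
if (receiverSocketId) {
  io.to(receiverSocketId).emit("newMessage", newMessage);
}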

Exports

export { io, app, server };

What we export:

  • app → used to register routes & middleware
  • server → used to start listening on a port
  • io → used to emit socket events from anywhere

controller/

The controller layer is where the actual application logic lives.

When a request hits an endpoint, this is the final stop after:

  1. Route matching
  2. Middleware checks (auth, validation, etc.)

Controllers:

  • Read data from req
  • Perform logic (DB calls, hashing, uploads, etc.)
  • Send the final response using res

auth.controller.js

// IMPORT
import User from "../models/user.model.js";
import bcrypt from "bcrypt";
import { generateToken } from "../lib/generateToken.js";
import cloudinary from "../lib/cloudinary.js";
// END - IMPORT

// SIGNUP CONTROLLER
export const signup = async (req, res) => {
    // DESTRUCTURING REQUEST
    const { fullName, email, password } = req.body;
    // END - DESTRUCTURING REQUEST


    try {

        // DETAIL VERIFICATION
        if (!fullName || !email || !password) {
            return res.status(400).json({
                message: "All fields are required"
            });
        }
        if (password.length < 6) {
            return res.status(400).json({
                message: "Password must be at least 6 characters"
            });
        }
        // END - DETAIL VERIFICATION

        // DETAIL IF ALREADY IS IN DATABASE 
        const user = await User.findOne({ email });
        if (user) {
            return res.status(400).json({ message: "User already exists" });
        }
        // END - DETAIL IF ALREADY IS IN DATABASE 

        // HASHING PASSWORD
        const salt = await bcrypt.genSalt(10);
        const hashPassword = await bcrypt.hash(password, salt);
        // END - HASHING PASSWORD

        // SAVE USER IN DATABASE (User.create already persists the document)
        const newUser = await User.create({ fullName, email, password: hashPassword });
        if (newUser) {
            generateToken(newUser._id, res);
        }
        // END - SAVE USER IN DATABASE

        // HAPPY RESPONSE
        return res.status(200).json(
            {
                _id : newUser._id,
                fullName : newUser.fullName,
                email : newUser.email,
                profilePic : newUser.profilePic
            }
        );
        // HAPPY RESPONSE

    } catch (error) {
        console.error(error);
        return res.status(500).json({ message: "Server error" });
    }
};
// END - SIGNUP CONTROLLER

// LOGIN CONTROLLER
export const login = async (req, res) => {
    // DESTRUCTURE REQUEST
    const {email, password} = req.body;
    // END - DESTRUCTURE REQUEST

    try {
        // FIND USER
        const user = await User.findOne({email});

        if (!user) {
            return res.status(400).json({
                message : "invalid credential"
            });
        }
        // END - FIND USER

        // VALIDATE PASSWORD
        const isPasswordCorrect = await bcrypt.compare(password, user.password);
        if (!isPasswordCorrect) {
            return res.status(400).json({
                message : "invalid credentials"
            })
        }
        // END - VALIDATE PASSWORD

        // GENERATE TOKEN AND RESPONSE
        generateToken(user._id, res);
        res.status(200).json({
            _id : user._id,
            fullName : user.fullName,
            email : user.email,
            profilePic : user.profilePic
        })
        // END - GENERATE TOKEN AND RESPONSE

    } catch (error) {
        console.log("error in login", error.message);
        res.status(500).json({
            message : "internal server error"
        })
    }
};
// END - LOGIN CONTROLLER

// LOGOUT CONTROLLER
export const logout = (req, res) => {
    try {
        // CLEAR COOKIE
        res.cookie("jwt", "", {
            maxAge : 0
        });
        res.status(200).json({message : "logged out successfully"})
        // END - CLEAR COOKIE

    } catch (error) {
        console.log("error in logout controller", error.message);
        res.status(500).json({
            message : "internal server error"
        })
    }
}
// END - LOGOUT CONTROLLER

// UPDATE CONTROLLER
export const updateProfile = async (req, res) => {
    try {
        // DESTRUCTURE PIC URL AND USER ID FROM REQUEST
        const {profilePic} = req.body;
        const userId = req.user._id;

        if (!profilePic) {
            return res.status(400).json({
                message: "profile pic require"
            })
        }

        // END - DESTRUCTURE PIC URL AND USER ID FROM REQUEST

        // UPLOAD PFP AND UPDATE USER
        const uploadResponse = await cloudinary.uploader.upload(profilePic);
        const updatedUser = await User.findByIdAndUpdate(
            userId,
            {profilePic : uploadResponse.secure_url},
            {new : true}
        )
        // END - UPLOAD PFP AND UPDATE USER

        res.status(200).json(updatedUser);
    } catch (error) {
        console.log("error in update profile controller", error.message);
        res.status(500).json({ message : "internal server error" });
    }
}
// END - UPDATE CONTROLLER

// CHECK AUTH CONTROLLER
export const checkAuth = (req, res) => {
    try {
        res.status(200).json(req.user);
    } catch (error) {
        res.status(500).json({
            message: "internal server errro"
        })
    }
}
// END - CHECK AUTH CONTROLLER

This file handles authentication and user-related actions.

1. signup

Purpose:

Create a new user account, securely store credentials, and log the user in.

Step-by-step flow:

  1. Destructure request body

    const { fullName, email, password } = req.body;

    Extracts user input sent from the frontend.

  2. Validate input

    • Ensure all fields exist
    • Ensure the password is at least 6 characters

    If validation fails → return 400 Bad Request

  3. Check if user already exists

    const user = await User.findOne({ email });

    Prevents duplicate accounts with the same email.

  4. Hash the password

    const salt = await bcrypt.genSalt(10);
    const hashPassword = await bcrypt.hash(password, salt);

    Passwords are never stored in plain text.

  5. Create and save user

    const newUser = await User.create({...});

    Stores the user in MongoDB.

  6. Generate JWT token

    generateToken(newUser._id, res);

    Sets a secure cookie for authentication.

  7. Send success response

    Returns basic user info (never the password).

Flow diagram (signup):

Request
  ↓
Validate input
  ↓
Check existing user
  ↓
Hash password
  ↓
Create user in DB
  ↓
Generate JWT cookie
  ↓
Send response

2. login

Purpose:

Authenticate an existing user and issue a JWT token.

Flow:

  1. Extract email and password
  2. Find user by email
  3. Compare hashed password using bcrypt.compare
  4. If valid:
    • Generate JWT
    • Send user data

Flow diagram (login):

Request
  ↓
Find user by email
  ↓
Compare password
  ↓
Generate JWT cookie
  ↓
Send response

3. logout

Purpose:

Log the user out by clearing the authentication cookie.

What it does:

  • Overwrites the jwt cookie with an empty value
  • Sets maxAge: 0 to expire it immediately
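Express also offers res.clearCookie, which achieves the same effect and reads a bit more explicitly, if you prefer it:

// sketch — equivalent to overwriting the cookie with maxAge: 0
res.clearCookie("jwt");
res.status(200).json({ message : "logged out successfully" });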

Flow diagram (logout):

Request
  ↓
Clear JWT cookie
  ↓
Send success response

4. updateProfile

Purpose:

Update the user’s profile picture.

Flow:

  1. Extract profilePic from request body
  2. Get user ID from req.user (set by auth middleware)
  3. Upload image to Cloudinary
  4. Update user document with image URL
  5. Return updated user

Flow diagram (update profile):

Request
  ↓
Validate image
  ↓
Upload to Cloudinary
  ↓
Update user in DB
  ↓
Send updated user

5. checkAuth

Purpose:

Verify whether the user is authenticated.

What happens:

  • Auth middleware already validated the token
  • Controller simply returns req.user

Flow diagram (check auth):

Request
  ↓
Auth middleware
  ↓
Return user data


message.controller.js

import cloudinary from "../lib/cloudinary.js";
import Message from "../models/message.model.js";
import User from "../models/user.model.js";

// RENDER USERS ON SIDEBAR CONTROLLER
export const getUsersForSidebar = async (req, res) => {
    try {
        // DESTRUCTURE USER FROM USERID
        const loggedInUserId = req.user._id;
        // END - DESTRUCTURE USER FROM USERID

        // FILTER USER
        const filteredUsers = await User.find({
            _id : {$ne : loggedInUserId}
        }).select("-password");
        // END - FILTER USER

        res.status(200).json(filteredUsers);
    } catch (error) {
        res.status(500).json({
            error : "interneal server error"
        })
    }

}
// END - RENDER USERS ON SIDEBAR CONTROLLER

// GET MESSAGE
export const getMessages = async (req, res) => {
    try {
        // EXTRACT ID FROM REQUEST
        const {id:userToChatId} = req.params
        const myId = req.user._id;
        // END - EXTRACT ID FROM REQUEST

        // SEARCH MESSAGES
        const messages = await Message.find({
            $or : [
                {senderId : myId, recieverId:userToChatId},
                {senderId : userToChatId, recieverId:myId}
            ]
        });
        // END - SEARCH MESSAGES


        res.status(200).json(messages);
    } catch (error) {
        res.status(500).json({
            error : "internal server error"
        })
    }
}
// END - GET MESSAGE

// SEND MESSAGE
export const sendMessage = async (req, res) => {
    try {
        // EXTRACT TEXT AND IMAGE
        const {text, image} = req.body;
        const {id : recieverId} = req.params;
        const senderId = req.user._id;
        // END - EXTRACT TEXT AND IMAGE

        // HANDLE IMAGE
        let imageUrl;
        if (image) {
            const uploadResponse = await cloudinary.uploader.upload(image);
            imageUrl = uploadResponse.secure_url;
        }
        // END - HANDLE IMAGE

        // CREATE MESSAGE AND SAVE MESSAGE
        const newMessage = new Message({
            senderId,
            recieverId,
            text,
            image : imageUrl
        });

        await newMessage.save();
        // END - CREATE MESSAGE AND SAVE MESSAGE

        res.status(201).json(newMessage)
    } catch (error) {
        res.status(500).json({
            error : "interna server error"
        })
    }
}
// END - SEND MESSAGE

This file handles chat and messaging logic.

1. getUsersForSidebar

Purpose:

Fetch all users except the logged-in user.

Flow:

  1. Extract logged-in user ID from req.user
  2. Query DB excluding current user
  3. Remove passwords using .select("-password")
  4. Send user list

Flow diagram:

Request
  ↓
Get logged-in user ID
  ↓
Query users (exclude self)
  ↓
Send users list

2. getMessages

Purpose:

Fetch all messages between two users.

Flow:

  1. Extract receiver ID from URL params
  2. Get sender ID from req.user
  3. Query messages where:
    • sender → receiver OR
    • receiver → sender
  4. Return messages

Flow diagram:

Request
  ↓
Extract sender & receiver IDs
  ↓
Query messages (both directions)
  ↓
Send messages

3. sendMessage

Purpose:

Send a new message (text or image).

Flow:

  1. Extract text and image
  2. Upload image to Cloudinary (if present)
  3. Create new message document
  4. Save message in DB
  5. Return created message

Flow diagram:

Request
  ↓
Extract message data
  ↓
Upload image (optional)
  ↓
Create message
  ↓
Save to DB
  ↓
Send message


middleware/

The middleware layer sits between the request and the controller.

It acts like a guard or checkpoint.

Before a controller runs:

  • Middleware executes first
  • If a condition fails → the request stops there
  • An appropriate response is returned
  • The controller never runs

If everything is valid, middleware calls next() and allows the request to continue.
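In code, the general shape of any Express middleware looks like this (a generic sketch, not a file from this project; the header name is made up):

// generic middleware sketch — run a check, then either block or continue
const exampleGuard = (req, res, next) => {
  if (!req.headers["x-example"]) {
    return res.status(400).json({ message: "missing header" }); // request stops here
  }
  next(); // hand control to the next middleware or the controller
};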

auth.middleware.js

This middleware is responsible for route protection and authorization.

// IMPORTS
import jwt from "jsonwebtoken";
import User from "../models/user.model.js";
// END - IMPORTS

  • jsonwebtoken is used to verify JWT tokens
  • User model is used to fetch the authenticated user

protectRoute

Purpose:

Ensure that only authenticated users can access protected routes.

1. Extract JWT from cookies

const token = req.cookies.jwt;

  • The JWT token is stored inside cookies
  • If no token exists, the user is not logged in
if (!token) {
  return res.status(401).json({
    message: "unauthorized - no token provided"
  });
}


2. Verify the token

const decoded = jwt.verify(token, process.env.JWT_SECRET);

  • Checks if the token:
    • Was signed using the correct secret
    • Has not expired
    • Has not been tampered with

If verification fails, access is denied:

if (!decoded) {
  return res.status(401).json({
    message: "unauthorized - invalid token"
  });
}


3. Fetch the user from the database

const user = await User.findById(decoded.userId).select("-password");

  • Uses the userId stored in the token payload
  • Excludes the password for security

If the user no longer exists:

if (!user) {
  return res.status(404).json({
    message: "user not found"
  });
}


4. Attach user to the request object

req.user = user;
next();
  • Makes the authenticated user available to:
    • Controllers
    • Other middleware
  • Calls next() to continue execution

Flow diagram (protectRoute)

Request
  ↓
Check JWT cookie
  ↓
Verify token
  ↓
Find user in DB
  ↓
Attach user to req
  ↓
next() → Controller runs

Why this pattern is useful:

  • Keeps controllers clean
  • Centralizes authentication logic
  • Prevents duplicate checks across controllers
  • Makes protected routes easy to manage
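To see where this plugs in, a route file attaches it before the controller, roughly like this (a sketch; the real route files in this project may differ slightly):

// routes/auth.route.js (sketch) — protectRoute runs before each protected controller
import express from "express";
import { protectRoute } from "../middleware/auth.middleware.js";
import { checkAuth, updateProfile } from "../controller/auth.controller.js";

const router = express.Router();

router.put("/update-profile", protectRoute, updateProfile);
router.get("/check", protectRoute, checkAuth);

export default router;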

.gitignore

I assume most of you already know Git and GitHub, but it’s still important to clearly understand this part.

  • Git is a version control system — it tracks changes in your code over time.

  • GitHub is a platform where we store and share our Git repositories.

Since GitHub repositories are usually public, there are certain files and folders that we should never push to GitHub.

Why .gitignore exists

A .gitignore file tells Git which files or folders to ignore when pushing code.

The two most important things you should never push are:

  1. node_modules/
    • Can be regenerated using npm install
    • Extremely large
  2. .env
    • Contains secrets like API keys, database URLs, and tokens

There are other files as well (logs, caches, OS files, build output), and managing them manually is annoying.

That’s why we usually:

  • Search “gitignore generator” in the browser
  • Select the stack (Node, npm, OS, editor, etc.)
  • Copy-paste the generated .gitignore

You don’t need to overthink it — this is standard boilerplate.

.gitignore

# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Runtime data
pids
*.pid
*.seed

# Directory for instrumented libs generated by coverage tools
lib-cov

# Coverage directory
coverage
*.lcov

# nyc test coverage
.nyc_output

# Cache directories
.cache
.rpt2_cache/
.rts2_cache_cjs/
.rts2_cache_es/
.rts2_cache_umd/

# Dependency directories
node_modules/
jspm_packages/

# Optional npm cache directory
.npm

# dotenv environment variables file
.env
.env.test

# Build output directories
dist/
build/

# Optional REPL history
.node_repl_history

# IDE and OS files
.vscode/
.DS_Store


Finally

With that, our project and fundamentals are solid. From here we can go further, build more projects, and dig deeper, and you now have a good understanding of a fairly solid project as a beginner.

I am very happy to share this with you.
I will post more backend development blogs.

Written By - Shourya Sharma
