DEV Community

David Okonji


Generating and Downloading CSV Files Using Express.js

While working on a Node.js project, I needed to pull certain data from my database and present it as a CSV file. This guide aims to simplify the process of creating a CSV file and making it downloadable from the client side.


Prerequisites:

  • Working knowledge of Express.js.
  • A working database setup.

For this tutorial, I will be using PostgreSQL with the Sequelize ORM (the database/ORM choice does not actually matter). The json2csv npm package is my package of choice for parsing JSON to CSV because of its simplicity and ease of use. Other npm packages that can be used are fast-csv and csv-parse.


The goal: download users' data (first name, last name, email) as a CSV file, using an API endpoint /download.


  • The utility function
  • Using the helper function in the controller

Utility Function

Install the package using yarn add json2csv or npm i json2csv. Once that completes, we will make use of the synchronous Parser API provided by json2csv.


```javascript
import { Parser } from 'json2csv';

export const downloadResource = (res, fileName, fields, data) => {
  const json2csv = new Parser({ fields });
  const csv = json2csv.parse(data);
  res.header('Content-Type', 'text/csv');
  res.attachment(fileName);
  return res.send(csv);
};
```

The downloadResource function parameters:

  • res: the response object from the API endpoint.
  • fileName: the name the generated CSV file should be downloaded as.
  • fields: an array of objects with a number of functionalities, which include specifying what values should be parsed from the data sent, giving a custom header to each column, etc. Note: for this tutorial we are going to supply fields, but fields are optional; the parser instance can be created with no arguments (i.e. new Parser()). The keys of each object (i.e. label, value) are used to give structure to the CSV file.
  • data: an array of objects representing the data to be populated in the CSV file. Note: the object keys must match the specified fields.
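To make the fields/data relationship concrete, here is a tiny stdlib-only sketch of the mapping json2csv performs. The `toCsv` helper is an illustrative stand-in, not the json2csv API: each field's `value` picks a key out of a data row, and its `label` becomes the corresponding column header.

```javascript
// Illustrative stand-in for json2csv's Parser (not its real API):
// each field's `value` picks a key out of a data row, and its
// `label` becomes the corresponding column header.
const fields = [
  { label: 'First Name', value: 'first_name' },
  { label: 'Last Name', value: 'last_name' },
];

const data = [
  { first_name: 'Ada', last_name: 'Lovelace' },
  { first_name: 'Alan', last_name: 'Turing' },
];

const toCsv = (fields, rows) => {
  const header = fields.map((f) => `"${f.label}"`).join(',');
  const lines = rows.map((row) =>
    fields.map((f) => `"${row[f.value] ?? ''}"`).join(',')
  );
  return [header, ...lines].join('\n');
};

console.log(toCsv(fields, data));
// "First Name","Last Name"
// "Ada","Lovelace"
// "Alan","Turing"
```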

In the utility function, we set the Content-Type header to text/csv so the client recognizes what type of data is being sent and how to handle it, set the attachment to the file name of choice with res.attachment, and finally return a response using res.send with the parsed CSV data as its parameter. We now have a ready-to-use utility function.
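For reference, res.attachment(fileName) is what adds the Content-Disposition header that prompts the browser to download the body rather than render it. This simplified sketch (contentDisposition is an illustrative helper, not the Express API) shows the header value it produces:

```javascript
// Illustrative helper (not the Express API): res.attachment(fileName)
// sets a Content-Disposition header equivalent to this value, which
// tells the browser to save the body as a file named fileName.
const contentDisposition = (fileName) => `attachment; filename="${fileName}"`;

console.log(contentDisposition('users.csv'));
// attachment; filename="users.csv"
```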

Using the helper function in the controller

Creating a new route for the download

```javascript
import express from 'express';
import userController from './controllers/userController';

const router = express.Router();

router.get('/download', userController.download);

// rest of express configs
```

Creating a new controller download method

```javascript
import { users } from '../models';
import { downloadResource } from '../util';

const controller = {};

controller.download = async (req, res) => {
  const fields = [
    {
      label: 'First Name',
      value: 'first_name',
    },
    {
      label: 'Last Name',
      value: 'last_name',
    },
    {
      label: 'Email Address',
      value: 'email_address',
    },
  ];
  const data = await users.findAll();

  return downloadResource(res, 'users.csv', fields, data);
};

export default controller;
```

Testing Implementation

Run your local server and open the download endpoint in the client browser (e.g. http://localhost:3000/api/download); the file gets downloaded by your browser's download manager.


Fun fact: this is my first article on DEV šŸŽ‰. I am David Okonji, a software engineer based in Lagos, Nigeria. Feedback would be greatly appreciated. Gracias!

Top comments (16)

Alex Solonenko

Actually, I'm not a front-end dev, and my knowledge of JS and Node is quite limited, but I can help you improve your solution.
Postgres has a very nice feature called COPY. It can copy any data from your database in CSV format, with the header as well.
So basically you don't need a JSON-to-CSV converter or any ORM at all.
Just using COPY TO STDOUT and JS streams you are able to build a very powerful and nice solution, which will let you get any data from the database no matter how large it is; you will be limited only by your bandwidth.

David Okonji

Thank you, Alex. Yeah, sure, an ORM is not necessary; I will surely try out the alternative solution suggested.

Pacharapol Withayasakpunt • Edited

You should also consider streaming, especially if the CSV is large.

Maybe by piping a CSV stream.

Joel Ndoh

Hello, I love the article, but I do have a question.

In a situation where I've gotten the data I need from MongoDB and I applied .populate, how do I get access to a certain field in the populated document?

David Okonji

Hello Joel, hope I'm not too late. If you use populate you would have a nested object; the library supports dot notation. For example, if you have a user object with an id key, it would be value: ''

esealli

This was very helpful, David, thanks. Just wondering how it will be handled on the frontend.

David Okonji

From the frontend, it can be sent as a FormData payload.

esealli

Thank you!

This Dot Media

This is awesome! Thanks, David!

Akanbi Lawal Tunde

Nice read!!

Donald Ng

Thank you this is really helpful!

cavdy

This is beautiful... thanks

keystroke99

Much needed one for me. Thanks :)

Luan Mota Coleto

Thanks, it was really helpful

eliyahukriel

Thanks!! Very helpful.
How can I set the file to support the Hebrew language? When I download the file it looks like


David Okonji

Hello, I am not quite sure, it depends on how the data is stored in the database.