DEV Community πŸ‘©β€πŸ’»πŸ‘¨β€πŸ’»

David Okonji

Generating and Downloading CSV Files Using Express.js

While working on a Node.js project, I needed to pull certain data from my database and represent the values in a CSV file. This guide aims to simplify the process of creating a CSV file and making it downloadable from the client side.

Prerequisites

  • Working knowledge of Express.js.
  • A working database setup.

For this tutorial, I will be using PostgreSQL with the Sequelize ORM (the choice of database/ORM does not actually matter). The json2csv npm package is my package of choice for parsing JSON to CSV because of its simplicity and ease of use. Other npm packages that can be used are fast-csv and csv-parse.

Objective

Downloading user data (first name, last name, email) as a CSV file, using an API endpoint /download.

Steps

  • Utility function
  • Using the helper function in the controller

Utility Function

Install the package using yarn add json2csv or npm i json2csv. Once that is complete, we will make use of the synchronous Parser API that json2csv provides.

util.js

import { Parser } from 'json2csv';

export const downloadResource = (res, fileName, fields, data) => {
  // Parse the JSON data into CSV text using the supplied fields
  const json2csv = new Parser({ fields });
  const csv = json2csv.parse(data);
  // Tell the client this is CSV and should be downloaded as fileName
  res.header('Content-Type', 'text/csv');
  res.attachment(fileName);
  return res.send(csv);
};

The downloadResource function parameters:

  • res: The Express response object for the API endpoint.
  • fileName: The name for the generated CSV file to be downloaded.
  • fields: An array of objects with several functions, including specifying which values should be parsed from the data sent and giving a custom header to each column. Note: for this tutorial we are going to supply fields, but they are optional; the Parser can be instantiated with no arguments (i.e. new Parser()). The keys of each object (label, value) give structure to the CSV file.
  • data: An array of objects representing the data to be populated in the CSV file. Note: the object keys must match the specified fields.
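To make the mapping concrete, here is a hand-rolled sketch of how fields and data combine into CSV text. This is an illustration of the idea, not json2csv's actual implementation; the library quotes values like this by default:

```javascript
// Minimal sketch (not json2csv itself) of how `fields` and `data`
// combine: labels become the header row, values pick keys off each row.
const toCsv = (fields, data) => {
  const header = fields.map((f) => `"${f.label}"`).join(',');
  const rows = data.map((row) =>
    fields.map((f) => `"${row[f.value] ?? ''}"`).join(',')
  );
  return [header, ...rows].join('\n');
};

const fields = [
  { label: 'First Name', value: 'first_name' },
  { label: 'Last Name', value: 'last_name' },
];
const data = [{ first_name: 'Ada', last_name: 'Lovelace' }];

console.log(toCsv(fields, data));
// "First Name","Last Name"
// "Ada","Lovelace"
```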

In the utility function, we set the Content-Type header to text/csv so the client recognizes what type of data is being sent and how to handle it, set the attachment to the file name of our choice, and finally return the response with res.send, passing the parsed CSV data. With that, we have a ready-to-use utility function.

Using the helper function in the controller

Creating a new route for the download
route.js

import express from 'express';
import userController from './controllers/userController';

const router = express.Router();

router.get('/download', userController.download);

// rest of express configs

Creating a new controller download method
userController.js

import { users } from '../models';
import { downloadResource } from '../util';

const controller = {};

controller.download = async (req, res) => {
  const fields = [
    {
      label: 'First Name',
      value: 'first_name'
    },
    {
      label: 'Last Name',
      value: 'last_name'
    },
    {
      label: 'Email Address',
      value: 'email_address'
    }
  ];
  const data = await users.findAll();

  return downloadResource(res, 'users.csv', fields, data);
};

export default controller;

Testing Implementation

Run your local server and open the download endpoint in the client browser (e.g. http://localhost:3000/api/download); the file is then downloaded by your browser's download manager.

Conclusion

Fun fact: this is my first article on dev.to 🎉. I am David Okonji, a software engineer based in Lagos, Nigeria. Feedback would be greatly appreciated. Gracias!

Top comments (17)

Alex Solonenko

Actually, I'm not a front-end dev and my knowledge of JS and Node is quite limited, but I can help you improve your solution.
Postgres has a very nice feature called COPY. It can copy any data from your database in CSV format, with the header as well.
So basically you don't need a JSON-to-CSV converter or any ORM at all.
Just using COPY TO STDOUT and JS streams, you can build a very powerful and nice solution that will let you get any data from the database no matter how large it is; you will be limited only by your bandwidth.

David Okonji

Thank you, Alex. Yeah, sure, an ORM is not necessary; I will surely try out the alternative solution suggested.

Pacharapol Withayasakpunt

You should also consider streaming, especially if the CSV is large.

Maybe by piping a CSV stream.

Joel Ndoh

Hello, I love the article, but I do have some questions.

In a situation where I have gotten the data I need from MongoDB and I applied .populate, how do I get access to a certain field in the populated document?

David Okonji

Hello Joel, hope I’m not too late. If you use populate you will have a nested object; the library supports dot notation. For example, if you have a user object with an id key, the field would be value: 'user.id'.
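To illustrate how such a dot-notation value path resolves against a nested object (json2csv does this internally; the getPath helper here is mine, written for illustration):

```javascript
// Sketch of dot-path lookup: walk the object one key at a time,
// returning undefined if any step along the path is missing.
const getPath = (obj, path) =>
  path.split('.').reduce((acc, key) => (acc == null ? acc : acc[key]), obj);

const fields = [{ label: 'User ID', value: 'user.id' }];
const row = { user: { id: 42, name: 'Joel' } };

console.log(getPath(row, fields[0].value)); // 42
```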

Ese

This was very helpful, David, thanks. Just wondering how it will be handled on the frontend.

David Okonji

From the frontend, it can be sent as a FormData payload.

Ese

Thank you!

This Dot Media

This is awesome! Thanks David 😁

Akanbi Lawal Tunde

Nice read!!

Donald Ng

Thank you this is really helpful!

Cavdy

This is beautiful... thanks

keystroke99

Much needed one for me. Thanks :)

Luan Mota Coleto

Thanks, it was really helpful

eliyahuKriel

Thanks!! Very helpful.
How can I set the file to support the Hebrew language? When I download the file, it looks like
%$#@$R@F

Thanks

David Okonji

Hello, I am not quite sure; it depends on how the data is stored in the database.
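One common cause of garbled Hebrew is Excel failing to detect UTF-8. Prepending a byte-order mark (BOM) to the CSV text often fixes it; a small sketch adapting the utility above (the withBom helper is mine):

```javascript
// Prepending a UTF-8 byte-order mark helps Excel detect the encoding,
// so Hebrew (and other non-Latin) text renders correctly on open.
const withBom = (csv) => '\ufeff' + csv;

// In downloadResource, use res.send(withBom(csv)) instead of res.send(csv).
console.log(withBom('שם,אימייל').charCodeAt(0) === 0xfeff); // true
```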
