csv-parse is a parsing package that interprets CSV input into arrays or objects. It uses the Node.js stream API under the hood but has been optimised for ease of use and for parsing large datasets.
Usage
To get started, run the following command to install the package in your existing or new project.
Install the package
npm i csv-parse
Example CSV (data.csv)
name,age,email
Alex,33,alex@example.com
Bekky,20,bekky@example.com
Carl,27,carl@example.com
Read and Parse data.csv
import fs from 'node:fs';
import { parse } from 'csv-parse';

const records = [];

fs.createReadStream('./data.csv')
  .pipe(
    parse({
      columns: true, // use the first row as the header
      skip_empty_lines: true,
    })
  )
  .on('data', (row) => {
    records.push(row);
  })
  .on('error', (err) => {
    console.error(err.message);
  })
  .on('end', () => {
    console.log(records);
  });
Output
[
{ name: 'Alex', age: '33', email: 'alex@example.com' },
{ name: 'Bekky', age: '20', email: 'bekky@example.com' },
{ name: 'Carl', age: '27', email: 'carl@example.com' }
]
Here's what you should always remember:
The parse API, parse(input, options, callback), accepts an input which it interprets into structured data according to the parsing rules passed in the options argument.
input: a string or buffer
options: parsing rules (optional), e.g. { columns: true, delimiter: ',' }
callback(err, records): called with an error or the parsed records
It is recommended that you use fs.createReadStream() to stream your file chunk by chunk instead of loading everything into memory at once with fs.readFile().
Streaming tells Node to read the file in chunks, and each chunk is parsed immediately, which keeps memory usage low. fs.readFile() is fine for files that are ≤ 5MB.