The Problem That Frustrated Me
A few months ago, I was working on a backend application that needed to:
Accept CSV files from users (data import)
Export data to CSV (reports)
Sounds simple, right? In reality, I ended up installing three different packages:

```bash
npm install papaparse json2csv csvtojson
```
And it was hell. Three different APIs, three different ways to handle errors, three different support communities. When I updated one library, another broke. When I optimized CSV parsing, I forgot about the memory leak in JSON export.
The worst part? Security.
None of these packages protected against CSV injection attacks by default. I discovered in 2023 that this is a real vulnerability:

```text
=1+1
@sum(1+1)
+1+1
-1+1
```
When a CSV file containing these strings is opened in Excel, they are evaluated as formulas. Think it can't happen to you? Think again.
The Solution: JTCSV
I spent a month building one library that solves ALL these problems:
✅ JSON → CSV and CSV → JSON in one package
✅ CSV injection protection by default (no need to remember settings)
✅ NDJSON and TSV support (in case you need them)
✅ TypeScript-first (full types included)
✅ Streaming for large files (no out-of-memory crashes)
✅ 10 framework plugins (Express, Next.js, Fastify ready to use)
Quick Example
Before (two packages):
```javascript
// Package 1 for CSV → JSON
import csv from 'csvtojson';
const jsonData = await csv().fromFile('data.csv');

// Package 2 for JSON → CSV
import { Parser } from 'json2csv';
const parser = new Parser();
const csvString = parser.parse(jsonData);
```
Now (one package):
```javascript
import { csvToJson, jsonToCsv } from 'jtcsv';

// CSV → JSON
const jsonData = await csvToJson(csvInput);

// JSON → CSV
const csvOutput = jsonToCsv(jsonData);
```
Less code. Fewer dependencies. Fewer problems.
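For intuition about what these two calls actually do, here is a deliberately naive sketch of the core transformation. It ignores quoting, escaping, embedded commas, and type coercion, which is exactly the hard part a real library like JTCSV handles for you; `naiveCsvToJson` and `naiveJsonToCsv` are illustrative names, not JTCSV internals.

```javascript
// Naive CSV <-> JSON round trip, for illustration only: no quoting,
// escaping, or type handling. A real parser is much more involved.
function naiveCsvToJson(csv) {
  const [headerLine, ...lines] = csv.trim().split('\n');
  const headers = headerLine.split(',');
  return lines.map((line) => {
    const cells = line.split(',');
    return Object.fromEntries(headers.map((h, i) => [h, cells[i]]));
  });
}

function naiveJsonToCsv(rows) {
  const headers = Object.keys(rows[0]);
  const body = rows.map((row) => headers.map((h) => row[h]).join(','));
  return [headers.join(','), ...body].join('\n');
}

const csv = 'id,name\n1,Ada\n2,Linus';
const rows = naiveCsvToJson(csv);
console.log(rows[0].name);                 // "Ada"
console.log(naiveJsonToCsv(rows) === csv); // true
```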
Security: What We Do by Default
```javascript
// Without protection (other libraries):
{ formula: "=1+1" }
// → Result: "=1+1" (Excel opens it as a formula!)

// With JTCSV:
const safe = jsonToCsv({ formula: "=1+1" });
// → Result: "'=1+1" (the prefix keeps Excel from evaluating it)
```
This is enabled by default. You don't need to remember settings.
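The hardening itself is conceptually simple: prefix any cell that starts with a formula trigger character so spreadsheet apps treat it as text. The sketch below mirrors the behavior described above, but it is not JTCSV's actual source; the trigger set follows the commonly cited list (`=`, `+`, `-`, `@`, tab, carriage return).

```javascript
// Sketch of formula-injection hardening: prefix cells that start
// with a formula trigger so Excel treats them as text. Illustrative
// only, not JTCSV's implementation.
const FORMULA_TRIGGERS = new Set(['=', '+', '-', '@', '\t', '\r']);

function sanitizeCell(value) {
  const s = String(value);
  return FORMULA_TRIGGERS.has(s[0]) ? "'" + s : s;
}

console.log(sanitizeCell('=1+1'));     // '=1+1
console.log(sanitizeCell('@SUM(A1)')); // '@SUM(A1)
console.log(sanitizeCell('hello'));    // hello
```

Note the tradeoff: a legitimate negative number like `-42` also gets prefixed, which is why libraries typically make this behavior configurable rather than removing it.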
Performance: We Don't Sacrifice Speed for Features
I ran benchmarks (1M records):
| Operation   | JTCSV | json2csv | csv-parser |
|-------------|-------|----------|------------|
| CSV → JSON  | 5.2 s | N/A      | 5.5 s      |
| JSON → CSV  | 3.9 s | 4.5 s    | N/A        |
| Memory peak | 14 MB | 28 MB    | 18 MB      |
We're faster on JSON → CSV and more memory-efficient across the board, because I spent time on optimization.
Framework Integration: Ready-Made Pieces
JTCSV has built-in plugins for popular frameworks. Examples:
Next.js:
```javascript
import { nextjsHandler } from 'jtcsv/nextjs';

export const POST = nextjsHandler({
  onUpload: async (data) => {
    // data is already converted to JSON
    return { success: true, rows: data.length };
  }
});
```
Express:
```javascript
import { expressMiddleware } from 'jtcsv/express';

app.post('/import', expressMiddleware({
  maxRecords: 10000,
  detectFormat: true
}), (req, res) => {
  const data = req.body; // already JSON!
  res.json({ imported: data.length });
});
```
No boilerplate. Just use it.
Why I Open-Sourced It
I work at a company that imports 10M+ CSV records monthly. This library was already saving us hours of maintenance work internally, and I thought: "This should be available to everyone."
JTCSV is on GitHub: https://github.com/Linol-Hamelton/jtcsv
⭐ Fully typed (TypeScript)
📚 Comprehensive documentation
🧪 16 test files (85%+ coverage)
🔒 Security policy (SECURITY.md)
🚀 CLI tool + TUI for terminal
What's Next?
If you have CSV/JSON workload, try JTCSV:
```bash
npm install jtcsv
```
Or use it directly in the browser via a `<script>` tag.
I'll answer all questions in the comments. If you find bugs, report them in the comments or open a GitHub issue.
Thanks
Thanks to the Node.js community for inspiration. Thanks to PapaParse, csv-parser, and json2csv for proving this market needs a solution. JTCSV stands on the shoulders of giants.
Want to try it? → GitHub
Found a bug? → Issues
Questions? → Comments below!
⭐ If you liked it, give it a star on GitHub!