JSON and CSV both represent tabular data, but they handle structure very differently. JSON can nest objects and arrays indefinitely. CSV is flat — two dimensions, rows and columns. That gap is where most conversion bugs live.
Here is a practical guide to converting JSON to CSV without losing data or your mind.
## Why JSON to CSV Conversion Is Harder Than It Looks
A flat JSON array is easy. This converts in seconds:
```json
[
  { "id": 1, "name": "Alice", "role": "admin" },
  { "id": 2, "name": "Bob", "role": "editor" }
]
```

becomes:

```csv
id,name,role
1,Alice,admin
2,Bob,editor
```
The problem starts when your real-world JSON looks like this:
```json
[
  {
    "id": 1,
    "name": "Alice",
    "address": {
      "city": "London",
      "zip": "EC1A"
    },
    "tags": ["admin", "billing"]
  }
]
```
Now you have choices to make. Should `address.city` become a column? Should `tags` become a comma-separated string, or multiple columns, or separate rows?
The answer depends on what you are doing with the CSV.
## Approach 1: Dot-Notation Flattening
The most common approach for nested objects is to flatten them with dot notation:
```csv
id,name,address.city,address.zip,tags
1,Alice,London,EC1A,"admin,billing"
```
Here is a minimal JavaScript function that does this:
```javascript
// Recursively flatten nested objects into dot-notation keys.
// Arrays of primitives are joined with commas; the CSV escaping
// step below takes care of quoting the result.
function flattenObject(obj, prefix = '') {
  return Object.entries(obj).reduce((acc, [key, val]) => {
    const fullKey = prefix ? `${prefix}.${key}` : key;
    if (val !== null && typeof val === 'object' && !Array.isArray(val)) {
      Object.assign(acc, flattenObject(val, fullKey));
    } else {
      acc[fullKey] = Array.isArray(val) ? val.join(',') : val;
    }
    return acc;
  }, {});
}
```
Then convert the array:
```javascript
function jsonToCSV(data) {
  const flattened = data.map(row => flattenObject(row));
  const headers = [...new Set(flattened.flatMap(Object.keys))];
  const rows = flattened.map(row =>
    headers.map(h => {
      const val = row[h] ?? '';
      // Escape values that contain commas, quotes, or newlines
      const str = String(val);
      if (str.includes(',') || str.includes('"') || str.includes('\n')) {
        return `"${str.replace(/"/g, '""')}"`;
      }
      return str;
    }).join(',')
  );
  return [headers.join(','), ...rows].join('\n');
}
```
The key part people forget: proper CSV escaping. A value like "Hello, world" contains a comma. Without quotes, it splits into two columns. Without doubling internal quotes, it breaks the parser.
## Approach 2: Stringify Mode
Sometimes you do not need to flatten nested data. You just need it in a spreadsheet for a quick review. In that case, stringify the complex values:
```javascript
// Keep top-level keys as columns; serialize any nested value to a JSON string.
function flattenShallow(obj) {
  return Object.fromEntries(
    Object.entries(obj).map(([k, v]) => [
      k,
      typeof v === 'object' && v !== null ? JSON.stringify(v) : v
    ])
  );
}
```
This keeps columns clean but stores `{"city":"London","zip":"EC1A"}` as a raw string in the cell. Readable, but you cannot sort or filter by nested properties.
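For instance, running the nested row from earlier through this logic (inlined here so the snippet stands alone) leaves nested data as JSON strings:

```javascript
// Same logic as flattenShallow, inlined for a self-contained demo.
const row = {
  id: 1,
  name: "Alice",
  address: { city: "London", zip: "EC1A" },
  tags: ["admin", "billing"]
};

const shallow = Object.fromEntries(
  Object.entries(row).map(([k, v]) => [
    k,
    typeof v === 'object' && v !== null ? JSON.stringify(v) : v
  ])
);

console.log(shallow.address); // '{"city":"London","zip":"EC1A"}'
console.log(shallow.tags);    // '["admin","billing"]'
```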
## Approach 3: Explode Arrays to Multiple Rows
If you have a one-to-many relationship — one user with multiple orders — you might want each array element as a separate row:
```json
{ "userId": 1, "orders": [{"id": "A1"}, {"id": "A2"}] }
```
Becomes:
```csv
userId,orders.id
1,A1
1,A2
```
This produces more rows but is often the right shape for database imports.
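A minimal sketch of this explode step (the function name `explodeRows` and its `arrayKey` parameter are my own naming, not a standard API):

```javascript
// For each input object, emit one output row per element of the named array,
// copying the scalar fields onto every row. Objects with an empty or missing
// array produce a single row with the scalars only.
function explodeRows(data, arrayKey) {
  return data.flatMap(({ [arrayKey]: items, ...rest }) => {
    if (!Array.isArray(items) || items.length === 0) return [rest];
    return items.map(item =>
      Object.fromEntries([
        ...Object.entries(rest),
        ...Object.entries(item).map(([k, v]) => [`${arrayKey}.${k}`, v])
      ])
    );
  });
}

const rows = explodeRows(
  [{ userId: 1, orders: [{ id: "A1" }, { id: "A2" }] }],
  "orders"
);
// rows: [{ userId: 1, "orders.id": "A1" }, { userId: 1, "orders.id": "A2" }]
```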
## Handling Edge Cases
**Different objects with different keys.** Real APIs return inconsistent shapes. Row 1 might have `firstName`, row 2 might have `first_name`. Your header extraction needs to union all keys, not just read the first row:
```javascript
const headers = [...new Set(data.flatMap(Object.keys))];
```
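For example, with inconsistent rows:

```javascript
const data = [
  { firstName: "Ada" },
  { first_name: "Grace", email: "g@example.com" }
];

// flatMap collects every key; Set deduplicates while preserving order.
const headers = [...new Set(data.flatMap(Object.keys))];
console.log(headers); // ['firstName', 'first_name', 'email']
```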
**Null and undefined values.** Use `?? ''` (nullish coalescing) rather than `|| ''` — a value of `0` or `false` is falsy but valid and should not be replaced with an empty string.
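The difference is easy to see in isolation:

```javascript
const record = { count: 0, active: false, note: null };

// || treats 0 and false as "missing" and wipes them out:
console.log(record.count || '');   // '' (data lost)

// ?? only falls back on null/undefined, keeping valid falsy values:
console.log(record.count ?? '');   // 0
console.log(record.active ?? '');  // false
console.log(record.note ?? '');    // '' (null correctly replaced)
```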
**Unicode and special characters.** Emoji, accented characters, and right-to-left text all survive in CSV as long as you save the file as UTF-8. If your downstream tool opens it in Excel, Excel may interpret the encoding wrong. Adding a UTF-8 BOM (`\uFEFF`) at the start of the file fixes this for most Excel users.
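Prepending the BOM is a one-line change before building the download blob (a sketch; `csvString` stands in for the output of the conversion above):

```javascript
// \uFEFF is the byte order mark. Excel uses it to detect UTF-8;
// most other tools simply ignore it.
const csvString = 'name,city\nZoë,Kraków';
const csvWithBom = '\uFEFF' + csvString;

console.log(csvWithBom.charCodeAt(0) === 0xFEFF); // true
```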
**Number formats.** A zip code like `00123` is a string, not the number `123`. If you stringify everything, it survives. If you try to type-detect values, be careful — leading zeros mean the field is a string.
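A quick illustration of why naive type detection loses leading zeros (the round-trip check is one possible heuristic, not a standard rule):

```javascript
const zip = "00123";

// Coercing to a number silently drops the leading zeros:
console.log(Number(zip)); // 123

// Safer heuristic: only treat a field as numeric if the round-trip is lossless.
const isLosslessNumber = String(Number(zip)) === zip;
console.log(isLosslessNumber); // false, so keep "00123" as a string
```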
## Triggering a File Download
Once you have the CSV string, triggering a download from the browser takes only a few lines:
```javascript
function downloadCSV(csvString, filename = 'export.csv') {
  const blob = new Blob([csvString], { type: 'text/csv;charset=utf-8;' });
  const url = URL.createObjectURL(blob);
  const a = document.createElement('a');
  a.href = url;
  a.download = filename;
  document.body.appendChild(a); // some browsers ignore clicks on detached elements
  a.click();
  a.remove();
  URL.revokeObjectURL(url);
}
```
No libraries needed. Works in every modern browser.
## When to Use a Library
For production use with complex data, consider Papa Parse — it handles edge cases like custom delimiters, streaming, and malformed input, and is one of the most widely used CSV libraries in the JavaScript ecosystem.
For quick one-off conversions — pasting an API response and getting a CSV for a spreadsheet — you do not need a library at all.
The SnappyTools JSON to CSV Converter handles nested objects with dot-notation flattening, stringify mode for complex values, live CSV preview, and file download. It runs entirely in your browser with nothing uploaded.
## Summary
- Flat JSON arrays: trivial to convert
- Nested objects: flatten with dot notation or stringify
- Arrays of values: join with separator or explode to rows
- Always escape commas, quotes, and newlines in cell values
- Union all keys across all rows, not just the first row
- For browser downloads, use `Blob` + `URL.createObjectURL` — no dependencies needed