I was asked to export a user database as a CSV for the finance team. Simple enough: JSON in, CSV out. The script took twenty minutes to write and three days to debug. The problems were not with the conversion logic -- they were with the data. Commas in address fields. Newlines in description fields. Nested objects that had no obvious flat representation. Unicode characters that Excel mangled on import. Every one of these is a known problem, and every one has a known solution.
What CSV actually is
CSV (Comma-Separated Values) is deceptively simple. Each line is a record. Each field is separated by a comma. The first line is optionally a header row.
name,email,city
Michael,mike@example.com,New York
Jane,jane@example.com,San Francisco
That is the entire format for simple data. The complexity comes from the escaping rules, which are defined in RFC 4180:
If a field contains a comma, newline, or double quote, the entire field must be enclosed in double quotes.
name,address,note
Michael,"123 Main St, Apt 4",Simple
Jane,"456 Oak Ave","She said ""hello"""
A double quote inside a quoted field is escaped by doubling it: "" represents a literal ". This is the rule that breaks most naive CSV implementations.
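To make the doubling rule concrete, here is a minimal sketch of a splitter for a single CSV line that honors these quoting rules. `splitCSVLine` is a hypothetical helper written for illustration, not part of any library, and it deliberately ignores multi-line records:

```javascript
// Minimal sketch: split one CSV line into fields per RFC 4180 quoting.
// Does not handle records that span multiple lines.
function splitCSVLine(line) {
  const fields = [];
  let current = "";
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"') {
        if (line[i + 1] === '"') {
          current += '"'; // doubled quote -> literal quote
          i++;
        } else {
          inQuotes = false; // closing quote of the field
        }
      } else {
        current += ch;
      }
    } else if (ch === '"') {
      inQuotes = true; // opening quote of a quoted field
    } else if (ch === ",") {
      fields.push(current); // unquoted comma ends the field
      current = "";
    } else {
      current += ch;
    }
  }
  fields.push(current);
  return fields;
}
```

Run it against the example rows above and the doubled quotes come back out as single literal quotes.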
The flat problem
JSON is hierarchical. CSV is flat. This mismatch is the fundamental challenge of conversion.
{
  "name": "Michael",
  "address": {
    "street": "123 Main St",
    "city": "New York",
    "zip": "10001"
  },
  "tags": ["developer", "writer"]
}

How do you represent this in CSV? There are several strategies:
Dot notation flattening. Nested objects become columns with dotted names:
name,address.street,address.city,address.zip,tags
Michael,123 Main St,New York,10001,"developer,writer"
This works well for one level of nesting. For deeply nested objects, the column names become unwieldy: company.headquarters.address.street.line1.
JSON-in-cell. Keep nested values as JSON strings:
name,address,tags
Michael,"{""street"":""123 Main St"",""city"":""New York""}","[""developer"",""writer""]"
This preserves the data but makes the CSV less useful for humans and for tools like Excel that expect scalar values.
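A minimal sketch of this strategy, using a hypothetical `jsonCell` helper: serialize any non-scalar value with `JSON.stringify` and pass scalars through, then let the normal CSV escaping quote the resulting string:

```javascript
// Sketch: JSON-in-cell strategy. Non-scalar values become JSON strings;
// scalars pass through unchanged. The CSV writer's escaping step must
// still quote the result, since JSON strings contain commas and quotes.
function jsonCell(value) {
  return value !== null && typeof value === "object"
    ? JSON.stringify(value)
    : value;
}
```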
Array explosion. Each array element gets its own row:
name,tag
Michael,developer
Michael,writer
This is the most relational approach and works well with SQL imports, but it duplicates the non-array fields.
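A sketch of the explosion step, assuming a hypothetical `explodeRows` helper that takes the array column's name and the name to use for the new scalar column:

```javascript
// Sketch: explode one array field into multiple rows, duplicating the
// scalar fields on each row. Hypothetical helper, for illustration.
function explodeRows(data, arrayKey, newKey = arrayKey) {
  const rows = [];
  for (const obj of data) {
    // Pull the array field out; keep the scalar fields in `rest`
    const { [arrayKey]: items, ...rest } = obj;
    if (Array.isArray(items) && items.length > 0) {
      for (const item of items) {
        rows.push({ ...rest, [newKey]: item }); // one row per element
      }
    } else {
      rows.push(rest); // empty or missing array: emit the scalars once
    }
  }
  return rows;
}
```

Calling `explodeRows(data, "tags", "tag")` on the example object produces the two-row output shown above.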
There is no universally correct answer. The right approach depends on who is consuming the CSV and what they need to do with it.
Writing the conversion in JavaScript
For a flat array of objects:
function jsonToCSV(data) {
  if (data.length === 0) return "";
  const headers = Object.keys(data[0]);
  const rows = data.map((obj) =>
    headers.map((header) => escapeCSVField(obj[header])).join(",")
  );
  // Trailing newline for maximum parser compatibility (RFC 4180 allows either)
  return [headers.join(","), ...rows].join("\n") + "\n";
}

function escapeCSVField(value) {
  // Map null/undefined to empty fields rather than "null"/"undefined"
  if (value === null || value === undefined) return "";
  const str = String(value);
  // RFC 4180: quote any field containing a comma, quote, or line break (\n or \r)
  if (/[",\n\r]/.test(str)) {
    return '"' + str.replace(/"/g, '""') + '"';
  }
  return str;
}
This handles the basic cases. But the moment your data has nested objects or inconsistent keys across rows, you need more:
function flattenObject(obj, prefix = "") {
  const result = {};
  for (const [key, value] of Object.entries(obj)) {
    const newKey = prefix ? `${prefix}.${key}` : key;
    if (value && typeof value === "object" && !Array.isArray(value)) {
      Object.assign(result, flattenObject(value, newKey));
    } else if (Array.isArray(value)) {
      result[newKey] = value.join("; ");
    } else {
      result[newKey] = value;
    }
  }
  return result;
}
The Excel problem
Excel has its own interpretation of CSV that deviates from RFC 4180 in frustrating ways.
Encoding. Excel defaults to the system's locale encoding, not UTF-8. If your CSV contains non-ASCII characters and you open it in Excel by double-clicking, the characters will be garbled. The fix is to add a BOM (Byte Order Mark) at the beginning of the file: \uFEFF. Excel sees this and switches to UTF-8.
const bom = "\uFEFF";
const csv = bom + jsonToCSV(data);
Leading zeros. A field containing 00123 gets interpreted by Excel as the number 123, dropping the leading zeros. Zip codes, product codes, and phone numbers are common victims. The workaround is to prefix the value with an equals sign and wrap it in quotes: ="00123". This forces Excel to treat the cell as a formula that evaluates to the string 00123.
Long numbers. Numbers longer than 15 digits get truncated because Excel stores numbers in IEEE 754 double precision. A credit card number like 4111111111111111 becomes 4111111111111110. Again, the fix is to force string interpretation.
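Both workarounds amount to the same trick, sketched here as a hypothetical `forceExcelText` helper. Note that its output is already a fully quoted CSV field (the raw cell reads ="00123" after parsing), so it must bypass the normal escaping step rather than be passed through it:

```javascript
// Sketch: force Excel to treat a value as text by emitting a formula
// cell that evaluates to a string. Returns an already-quoted CSV field,
// so do NOT run the result through escapeCSVField again.
function forceExcelText(value) {
  const str = String(value);
  // Inner quotes are doubled per RFC 4180; the parsed field is ="<str>"
  return '"=""' + str.replace(/"/g, '""') + '"""';
}
```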
Date interpretation. A field containing 1-2 or 3/4 gets interpreted as a date. There is no clean CSV-side fix for this. It is an Excel behavior that has been reported as a bug for decades.
Common mistakes
Assuming all rows have the same keys. API responses often have optional fields. If row 1 has {name, email} and row 2 has {name, email, phone}, you need to scan all rows to collect the complete set of headers before writing any output.
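That scan can be sketched with a Set, which preserves first-seen column order (`collectHeaders` is a hypothetical helper name):

```javascript
// Sketch: build the union of keys across all rows, so optional fields
// that appear in later rows still get a column in the header.
function collectHeaders(data) {
  const headers = new Set();
  for (const obj of data) {
    for (const key of Object.keys(obj)) headers.add(key); // Set keeps insertion order
  }
  return [...headers];
}
```

Substituting this for `Object.keys(data[0])` in the converter makes rows with missing fields come out as empty cells instead of disappearing columns.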
Not handling null and undefined. JavaScript's String(null) produces "null" and String(undefined) produces "undefined", which are almost never what you want in a CSV. Map these to empty strings.
Using tab-separated values without saying so. If you use tabs instead of commas, the file is TSV, not CSV. Some systems accept both, some do not. Be explicit about the delimiter.
Forgetting the trailing newline. Some parsers expect a newline after the last row. Some do not. RFC 4180 says the last record "may or may not" have a trailing newline. For maximum compatibility, include it.
For quick conversions during development -- turning API responses into spreadsheet-friendly formats or generating test data from JSON fixtures -- I use a JSON to CSV converter at zovo.one/free-tools/json-to-csv-converter that handles nested objects, escaping, and Unicode correctly.
JSON to CSV is one of those tasks that sounds trivial and is not. Handle the escaping. Handle the nesting. Handle the encoding. Test with Excel. Then test again with data that has commas in it.
I'm Michael Lip. I build free developer tools at zovo.one. 350+ tools, all private, all free.