Read/Write JSON Files with NodeJS
Written by Jon Church and Joe Shindelar. Originally published on Medium.
When you want to store data between server restarts with Node, JSON files are a simple and convenient choice. Whether you are reading a config file or persisting data for your application, Node has some built in utilities that make it easy to read and write JSON files.
Using JSON files in your app can be a useful way to persist data. We will look at a few different methods for working with JSON files.
In this tutorial we will:
- Read JSON data from disk
- Learn to use the `fs` module to interact with the filesystem
- Persist data to a JSON file
- Use `JSON.parse` and `JSON.stringify` to convert data to and from JSON format

By the end of this tutorial you should be able to work with JSON files using Node's built-in `fs` module.
Goal
Say you have a `customer.json` file saved to disk that holds a record for a customer in your store.
As part of your store app, you want to access the customer's address, and then update the order count after an order is placed.
In this tutorial, we are going to look at how to read and write to our `customer.json` file.
```js
// customer.json
{
  "name": "Mega Corp.",
  "order_count": 83,
  "address": "Infinity Loop Drive"
}
```
Work with files with fs
Accessing files in Node is done with the native module [`fs`](https://nodejs.org/api/fs.html), which gives you functions to watch, read, and write files, along with many other tools to work with the filesystem. Because it's a native module, we can require it in our code without installing it. Just call `const fs = require('fs')`.
The `fs` module gives us the option of synchronous or asynchronous versions of many of its functions. The synchronous versions block execution of other code until they are done accessing the filesystem, reading, or writing data. An async function will run without blocking other code. You can learn more about sync/async behavior here.

This synchronous behavior can be useful in some places, like at startup when reading a config file before any other code is run, but it becomes a big issue when used in a web server, where all incoming requests would be blocked while a synchronous file read is running. For this reason, you generally want to use the async versions of `fs` functions in your code. We will focus on async operations, but will also show the synchronous equivalents.
To read and write files asynchronously with `fs` we will use `fs.readFile` and `fs.writeFile`. We also will use the global `JSON` helper to convert objects to JSON strings, and JSON strings to objects.
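As a quick illustration of these two helpers before we use them with files:

```js
// JSON.stringify turns an object into a JSON string...
const customer = { name: "Mega Corp.", order_count: 83 }
const jsonString = JSON.stringify(customer)
console.log(jsonString) // => {"name":"Mega Corp.","order_count":83}

// ...and JSON.parse turns a JSON string back into an object
const parsed = JSON.parse(jsonString)
console.log(parsed.order_count) // => 83
```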
Reading a JSON file
The simplest way to read a JSON file is to require it. Calling `require()` with the path to a JSON file will synchronously read and parse the data into a JavaScript object.

```js
const config = require('./config.json')
```

But reading JSON files with `require` has its downsides. The file will only be read once; requiring it again returns the cached data from the first time `require` was run. This is fine for loading static data on startup (like config data). But for reading a file that changes on disk, like our `customer.json` might, we need to manually read the file using the asynchronous `fs.readFile`.
Reading a file with **fs.readFile**
To access the customer's address, we need to:
- read the JSON data from the file
- parse the JSON string into a JavaScript object
To load the data from the `customer.json` file, we will use `fs.readFile`, passing it the path to our file, an optional encoding type, and a callback to receive the file data. If the file is successfully read, the contents will be passed to the callback.
```js
const fs = require('fs')

fs.readFile('./customer.json', 'utf8', (err, jsonString) => {
  if (err) {
    console.log("File read failed:", err)
    return
  }
  console.log('File data:', jsonString)
})
```
- `'./customer.json'` is the relative path to the file
- `'utf8'` is an optional parameter for the encoding of the file we are reading; this can be left out
- `(err, jsonString) => {}` is the callback function that runs after the file has been read
Now we have the contents of the file as a JSON string, but we need to turn the string into an object.
Before we can use the data from the callback in our code, we must turn it into an object. `JSON.parse` takes JSON data as input and returns a new JavaScript object. Otherwise, we would just have a string of data with properties we can't access.

`JSON.parse` can throw an exception and crash our program if passed an invalid JSON string. To prevent crashing, we wrap `JSON.parse` in a `try...catch` statement to gracefully catch any errors.
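For example, parsing a malformed string throws a `SyntaxError`, which we can catch and recover from:

```js
let result = null
try {
  // the trailing comma makes this string invalid JSON
  result = JSON.parse('{"name": "Mega Corp.",}')
} catch (err) {
  console.log('Could not parse JSON:', err.message)
}
console.log(result) // => null, and the program keeps running
```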
This example shows reading and parsing a JSON file:
```js
const fs = require('fs')

fs.readFile('./customer.json', 'utf8', (err, jsonString) => {
  if (err) {
    console.log("Error reading file from disk:", err)
    return
  }
  try {
    const customer = JSON.parse(jsonString)
    console.log("Customer address is:", customer.address)
    // => "Customer address is: Infinity Loop Drive"
  } catch (err) {
    console.log('Error parsing JSON string:', err)
  }
})
```
Using the `jsonString` from reading `customer.json`, we create an object, and can access the address property. If `JSON.parse` throws an error, we handle it in the `catch` block.
Now we have an object representation of the data in our customer.json file!
We can also read the file synchronously using `fs.readFileSync`. Instead of taking a callback, `readFileSync` returns the file content after reading the file.
```js
try {
  const jsonString = fs.readFileSync('./customer.json', 'utf8')
  const customer = JSON.parse(jsonString)
  console.log(customer.address) // => "Infinity Loop Drive"
} catch (err) {
  console.log(err)
}
```
We can use this knowledge to create a reusable helper function to read and parse a JSON file.
Here we create a function called `jsonReader` that will read and parse a JSON file for us. It takes the path to the file and a callback to receive the parsed object and any errors. It will catch any errors thrown by `JSON.parse` for us.
```js
const fs = require('fs')

function jsonReader(filePath, cb) {
  fs.readFile(filePath, (err, fileData) => {
    if (err) {
      return cb && cb(err)
    }
    try {
      const object = JSON.parse(fileData)
      return cb && cb(null, object)
    } catch (err) {
      return cb && cb(err)
    }
  })
}

jsonReader('./customer.json', (err, customer) => {
  if (err) {
    console.log(err)
    return
  }
  console.log(customer.address) // => "Infinity Loop Drive"
})
```
Writing to a file with fs.writeFile
Writing JSON to the filesystem is similar to reading it. We will use `fs.writeFile` to asynchronously write data to a `newCustomer.json` file.
First, to write data to a JSON file, we must create a JSON string of the data with `JSON.stringify`. This returns a JSON string representation of a JavaScript object, which can be written to a file. Similar to parsing data into an object when reading a file, we must turn our data into a string to be able to write it to a file.
So we create a customer object with our data below, and turn it into a string.
```js
const customer = {
  name: "Newbie Corp.",
  order_count: 0,
  address: "Po Box City",
}

const jsonString = JSON.stringify(customer)
console.log(jsonString)
// => "{"name":"Newbie Corp.","order_count":0,"address":"Po Box City"}"
```
If you try to write an object to a file without stringifying it, modern versions of Node will throw a `TypeError` (older versions would coerce the object to a string and write `[object Object]` to the file instead).
Once the data is stringified, we can use `fs.writeFile` to create a new customer file.

We pass `fs.writeFile` the filepath, our customer data to write, and a callback that will be executed after the file is written. If the `newCustomer.json` file doesn't already exist, it will be created; if it does exist, it will be overwritten!

Here is an example of writing a JSON file with `fs.writeFile`:
```js
const fs = require('fs')

const customer = {
  name: "Newbie Co.",
  order_count: 0,
  address: "Po Box City",
}

const jsonString = JSON.stringify(customer)

fs.writeFile('./newCustomer.json', jsonString, err => {
  if (err) {
    console.log('Error writing file', err)
  } else {
    console.log('Successfully wrote file')
  }
})
```
And that's it! Once the callback runs, the file has been written to disk. Note: we are only passed an error object; the file data we wrote isn't passed to the callback.
We can also write a file synchronously in the same way using `fs.writeFileSync`:

```js
const jsonString = JSON.stringify(customer)
fs.writeFileSync('./newCustomer.json', jsonString)
```
After your file is finished writing, it will look something like this:
```json
{"name":"Newbie Co.","order_count":0,"address":"Po Box City"}
```
Stringifying by default puts your data all on a single line. Optionally, you can make the output file human-readable by passing the number of spaces to indent by to `JSON.stringify`:

```js
const jsonString = JSON.stringify(customer, null, 2)
```

Above, we told stringify to indent the data with 2 spaces.
Now your output file should look like this:
```json
{
  "name": "Newbie Co.",
  "order_count": 0,
  "address": "Po Box City"
}
```
Updating JSON files
Now that we are able to read and write our customer files, we can use them as a simple kind of database. If we want to update the data in the JSON file, we can read the contents, change the data, and then write the new data back to the file:
```js
jsonReader('./customer.json', (err, customer) => {
  if (err) {
    console.log('Error reading file:', err)
    return
  }
  // increase customer order count by 1
  customer.order_count += 1
  fs.writeFile('./customer.json', JSON.stringify(customer), err => {
    if (err) console.log('Error writing file:', err)
  })
})
```
Definitely not the most efficient database you could choose, but working with JSON files like this is a simple way to persist data in your project.
Wrapping up
JSON is one of the most common types of data you'll work with in Node, and being able to read and write JSON files is very useful. You've learned how to use `fs.readFile` and `fs.writeFile` to asynchronously work with the filesystem, as well as how to parse data to and from JSON format, and catch errors from `JSON.parse`.

You can use `require` to read and parse a JSON file at startup in a single synchronous line. And now you can use a simple JSON file as a data store.
If you want to learn more, you can read up on what JSON actually is, and find out more about synchronous vs asynchronous code.
Lullabot Education is ramping up our Node.js training. Sign up for our newsletter and learn about upcoming guides and tutorials — and help shape the future of Node.js education.