
Async/Await with the Node.js File System

starpebble ・ 2 min read

It's still not possible to keep everything in memory for all our computing. A single S3 object can grow from a humble 1 byte up to 5 TB.

fs Promises API

Let me show you how to read a tiny file in /tmp named data.json.

async function f() {
  const fsPromises = require('fs').promises;
  // If readFile() rejects, the error propagates to whoever awaits f().
  // (Catching it here would leave data undefined and crash JSON.parse.)
  const data = await fsPromises.readFile('/tmp/data.json');

  return JSON.parse(data.toString());
}

This require isn't like Webpack's code splitting with dynamic imports. I promise it's static. It just kind of looks similar.
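For contrast, here is what an actually dynamic load looks like. This is a small sketch, not part of the original example: ECMAScript's import() is evaluated at runtime and returns a promise, so it too can be awaited, even from a CommonJS file on a modern Node.js.

```javascript
// Dynamic import() resolves the module at runtime and yields a promise,
// unlike the static, synchronous require() used above.
async function loadFs() {
  const { promises: fsPromises } = await import('fs');
  return fsPromises;
}
```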

readFile() returns a promise to await in an async function. The fulfilled promise provides a single Node.js Buffer. This is a limitation: the entire Buffer must fit within the Lambda's memory limit. Keep a Lambda safe and read small files, in the KB range.

thenable

f() returns a promise, so the result is thenable. Error handling is left for you to figure out; this is just an example.

f()
 .then((d) => console.log(d))
 .catch((err) => console.error('f() failed', err));

readFile() is pretty simple.

File System (fs)

A Lambda can also write files. Its /tmp space is safe to use for scratch data.

The Node.js File System Promises API is very nice. fs is an extremely common module, and its promise-based API works naturally with await. It's somewhat exciting that Node.js is such a good fit for cloud bursting. fs knows how to read and write.

Keep bursting! This is a sign I am comfortable holding up.

Posted on Jun 7 by starpebble (@starpebble)
