Node.js offers some powerful primitives when it comes to building HTTP servers. By default, you get a function that runs every time an HTTP request has been received by the server. The proverbial server example that parses an incoming POST request containing a JSON body looks a bit like this:
const http = require('http');

const server = http.createServer((req, res) => {
  // This function is called once the headers have been received
  res.setHeader('Content-Type', 'application/json');

  if (req.method !== 'POST' || req.url !== '/user') {
    res.statusCode = 405;
    res.end('{"error":"METHOD_NOT_ALLOWED"}');
    return;
  }

  let body = '';

  req.on('data', (data) => {
    // This function is called as chunks of body are received
    body += data;
  });

  req.on('end', () => {
    // This function is called once the body has been fully received
    let parsed;

    try {
      parsed = JSON.parse(body);
    } catch (e) {
      res.statusCode = 400;
      res.end('{"error":"CANNOT_PARSE"}');
      return; // Without this return we would attempt to respond a second time
    }

    res.end(JSON.stringify({
      error: false,
      username: parsed.username
    }));
  });
});

server.listen(3000, () => {
  console.log('Server running at http://localhost:3000/');
});
By default, Node.js allows us to run a function whenever any request is received; there is no built-in router based on paths. Node.js does perform some basic parsing for us, such as parsing the incoming HTTP message and extracting components like the method, path, and header pairs.
However, the need for higher-level functionality means that we usually have to reach for a web framework. For example, if a multipart/form-data or application/x-www-form-urlencoded request is received, we need to use a module to handle decoding the content for us. If we want to route requests based on pattern matching and HTTP methods, we need either a module or, often, a full web framework to handle this for us.
That’s where tools like Express.js come into play.
Meet Express.js
Fairly early on, Express.js became the go-to framework for building web applications with Node.js. It scratched an itch that many developers had: it provided a nice syntax for routing HTTP requests, it provided a standardized interface for building out middleware, and it did so using the familiar callback pattern embraced by the core Node.js APIs and most of the npm ecosystem.
Express.js became so popular that it is almost ubiquitously associated with Node.js, much like reading about the Ruby language immediately conjures up thoughts of the Rails framework. In fact, Express.js and Node.js are members of the popular MEAN and MERN stack acronyms.
Let’s take a look at what our previous example might look like when we bring Express.js into the picture:
const express = require('express');

const app = express();

app.post('/user', (req, res) => {
  // This function is called once the headers have been received
  let body = '';

  req.on('data', (data) => {
    // This function is called as chunks of body are received
    body += data;
  });

  req.on('end', () => {
    // This function is called once the body has been fully received
    let parsed;

    try {
      parsed = JSON.parse(body);
    } catch (e) {
      res.statusCode = 400;
      res.json({
        error: 'CANNOT_PARSE'
      });
      return; // Without this return we would attempt to respond a second time
    }

    res.json({
      error: false,
      username: parsed.username
    });
  });
});

app.listen(3000, () => {
  console.log('Server running at http://localhost:3000/');
});
In this example, we see that things get a little nicer. We're able to specifically state the method and path we want to match by using app.post('/user'). This is much simpler than writing a big branching statement within the handler.
We're also given some other niceties. Consider the res.json({}) method: it not only serializes an object into its JSON equivalent, but it also sets the appropriate Content-Type header for us!
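For comparison, this is roughly what res.json() spares us from writing by hand with the plain http module (a simplified sketch, not Express's actual implementation):
function sendJson(res, obj) {
  // Set the header that res.json() would otherwise set for us,
  // then serialize and send the body
  res.setHeader('Content-Type', 'application/json');
  res.end(JSON.stringify(obj));
}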
However, Express.js still gives us the same paradigm that we get when using the built-in http module; we're still calling methods on req and res objects, for example.
An ideal example
Let’s take a step back and look at what an ideal example of an HTTP server might look like. Routing is desirable, and Express.js has a powerful routing syntax (it supports dynamic routing patterns, for instance). However, the code that runs within the controller function is where we really want to clean things up.
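As a quick illustration of those dynamic routing patterns, Express.js captures named path segments onto req.params (a small, purely illustrative snippet):
const express = require('express');
const app = express();

// ':id' captures a path segment, e.g. GET /user/42 -> req.params.id === '42'
app.get('/user/:id', (req, res) => {
  res.json({ id: req.params.id });
});

app.listen(3000);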
In the earlier Express.js example, we're doing a lot of work with asynchronous code. The request object is an EventEmitter that emits two events we care about, namely data and end. But, really, we often just want the ability to convert an HTTP request into a JSON object that we can easily extract values from.
Also, we're given both a request (req) and a response (res) object. The req object makes sense: it contains information about the request we're receiving. But does res really make all that much sense? We only want to provide a result from our controller function as a reply.
With synchronous functions, it's simple to receive a result from a function call: just return the value. We can do the same thing if we make use of async functions. Because an async function returns a promise, the controller function can simply return a value that ultimately represents the response we intend for the consumer to receive.
Let’s look at an example of this:
const server = someCoolFramework();

server.post('/user', async (req) => {
  let parsed;

  try {
    parsed = await req.requestBodyJson();
  } catch (e) {
    return [400, {
      error: 'CANNOT_PARSE'
    }];
  }

  return {
    error: false,
    username: parsed.username
  };
});

server.listen(3000, () => {
  console.log('Server running at http://localhost:3000/');
});
There are a few concepts going on in this idealized example of ours. First, we're maintaining the existing router syntax used by Express.js because it's pretty solid. Second, our req object provides a helper for converting an incoming request into JSON.
The third feature is that we're able to provide a representation of the response by simply returning a result. Since JavaScript doesn't support tuples, we're essentially recreating one by using an array. So, with this fictional example, a returned string could be sent directly to the client as the body, a returned array could represent the status code and the body (and perhaps a third element for metadata like headers), and a returned object would be converted into its JSON representation.
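To make that concrete, here is one way such a framework could translate those return values into a response using Node's plain response object. This is purely a hypothetical sketch; writeResult is not part of any real library:
// Hypothetical sketch: how a framework might interpret a controller's
// return value and write it to an http.ServerResponse
function writeResult(res, result) {
  if (typeof result === 'string') {
    // A returned string is sent directly as the response body
    res.end(result);
  } else if (Array.isArray(result)) {
    // A returned array acts like a tuple: [statusCode, body, headers]
    const [statusCode, body, headers = {}] = result;
    res.statusCode = statusCode;
    res.setHeader('Content-Type', 'application/json');
    // Caller-supplied headers win over the defaults above
    for (const [name, value] of Object.entries(headers)) {
      res.setHeader(name, value);
    }
    res.end(JSON.stringify(body));
  } else {
    // Any other returned object is serialized to JSON
    res.setHeader('Content-Type', 'application/json');
    res.end(JSON.stringify(result));
  }
}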
Adapting Express.js
Now, it actually is possible to recreate some of this behavior with Express.js using a set of middleware.
The express-async-handler npm module provides a wrapper function that can interpose itself and allow an async controller function to interact nicely with the Express.js app.use API. Unfortunately, this requires the developer to manually wrap each controller function:
const asyncHandler = require('express-async-handler');

app.post('/user', asyncHandler(async (req, res, next) => {
  // 'foo' here stands in for some data source, such as an ORM model
  const bar = await foo.findAll();
  res.send(bar);
}));
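For context, the heart of such a wrapper is quite small. A simplified version (not the actual express-async-handler source) might look like this:
// Simplified sketch of an async wrapper: run the async controller and
// forward any rejection to Express's error-handling middleware via next()
const asyncHandler = (fn) => (req, res, next) => {
  Promise.resolve(fn(req, res, next)).catch(next);
};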
The response tuple unwrapping can also be handled by middleware. Such a middleware would need to run after the controller code has run and would replace the array with a representation Express.js is expecting.
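A minimal sketch of that idea, assuming controllers stash their result on res.locals and call next() so a later middleware can translate it (hypothetical helper code, not an existing module):
// Controllers (wrapped with asyncHandler from above) store their result
// and defer to a later middleware instead of writing the response themselves
app.post('/user', asyncHandler(async (req, res, next) => {
  res.locals.result = [201, { error: false }];
  next();
}));

// Registered after the routes, so it runs once a controller has finished
app.use((req, res, next) => {
  const result = res.locals.result;

  if (result === undefined) {
    return next();
  }

  if (Array.isArray(result)) {
    // Unwrap our [statusCode, body] tuple
    const [statusCode, body] = result;
    return res.status(statusCode).json(body);
  }

  // Anything else is sent as JSON with the default status code
  res.json(result);
});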
The ability to promisify the request body stream parsing can also be built in a generic manner:
app.use((req, res, next) => {
  // Expose a lazy helper so controllers can await the parsed body.
  // Assigning a function (rather than the promise itself) leaves the
  // stream untouched until a controller actually asks for the body.
  req.bodyToJson = () => requestBodyJson(req);
  next();
});

function requestBodyJson(req) {
  return new Promise((resolve, reject) => {
    let body = '';

    req.on('data', (data) => {
      // This function is called as chunks of body are received
      body += data;
    });

    req.on('end', () => {
      // This function is called once the body has been fully received
      let parsed;

      try {
        parsed = JSON.parse(body);
      } catch (e) {
        reject(e);
        return;
      }

      resolve(parsed);
    });
  });
}
With the above code in place, we can then await the parsing when using Express.js (and really in any other situation where we're given an instance of an HTTP request object):
// When using the Express.js middleware:
const parsed = await req.bodyToJson();

// Using the function generically:
const parsed = await requestBodyJson(req);
Using another framework
It is true that we can reproduce some of these desired patterns using Express.js, but there are frameworks that have been built from the ground up with support for promises and the async/await paradigm. Let’s see what our example controller might look like when written using different web server frameworks.
Fastify
Fastify, as its name implies, was built with the intention of being a very fast Node.js web framework. Despite its main goal of speed, it actually does a very nice job of achieving our ideal controller syntax.
This example is so terse that it almost feels like cheating:
const fastify = require('fastify');

const app = fastify();

app.post('/user', async (req, reply) => {
  return {
    error: false,
    username: req.body.username
  };
});

app.listen(3000).then(() => {
  console.log('Server running at http://localhost:3000/');
});
Fastify not only supports async functions for use as controller code, but it also automatically parses incoming requests into JSON if the Content-Type header suggests the body is JSON. This is why the example code ends up being so tiny.
This also means that we can rely on Fastify to respond with a sane error when parsing fails. For example, when the client sends invalid JSON to Fastify, the response will look something like this:
{
  "statusCode": 400,
  "error": "Bad Request",
  "message": "Unexpected string in JSON at position 19"
}
Koa
Koa is a sort of spiritual successor to Express.js, having been written by some of the original Express.js authors. It supports async functions out of the box, but it doesn't ship with a router of its own. We can make use of koa-router to provide routing.
Here’s what our example controller might look like with Koa:
const Koa = require('koa');
const Router = require('koa-router');

const app = new Koa();
const router = new Router();

router.post('/user', async (ctx) => {
  try {
    const parsed = await requestBodyJson(ctx.req);

    ctx.body = {
      error: false,
      username: parsed.username
    };
  } catch (e) {
    ctx.status = 400;
    ctx.body = {
      error: 'CANNOT_PARSE'
    };
  }
});

app.use(router.routes());

app.listen(3000);
This Koa example isn't as succinct as the Fastify version. It doesn't perform the automatic JSON parsing, but we're able to reuse the requestBodyJson() function we created earlier. It also doesn't use the returned/resolved value from our controller; instead, it works by consuming data attached to the ctx argument.
Takeaways
When Node.js was still in its infancy, Express.js became the obvious choice for building web applications. Express.js set out to be a convenient web framework built around the callback paradigm. It achieved that goal, and the project is now essentially complete.
However, as the JavaScript ecosystem has matured, we’ve gained new language tools and syntax. Dozens if not hundreds of frameworks have arisen since then, many of which have embraced these new language features.
If you find yourself working on a new project written in Node.js that acts as a web server, I encourage you to consider newer contenders such as Koa and Fastify instead of defaulting to the familiar Express.js.