Tobias Nickel

Originally published at tnickel.de


Parallel programming in NodeJs with Piscina

While developing my fast XML parser and comparing it with other libraries, I came across a new project that makes using worker threads very easy and effective.

Piscina was created by some of the Node.js developers, and it has absolutely surprised me.

Also: recently I had an argument with a colleague about a task that ran so much faster after being implemented in Go instead of JavaScript. I told him that we could also use workers and inter-process communication (IPC) to be much faster in Node.js, but implementing that was too complex and out of scope at the time.

Now, with Piscina, implementing functionality in a worker has become a piece of cake, and the process runs just as fast directly in JS, because the real limitation is not the CPU but the network.

Using Piscina

First, we need a worker.js file. It exports a single function that you want to execute in a separate thread.

const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

module.exports = async ({ a, b }) => {
  await sleep(100);
  return a + b;
};

Of course, you will not sleep, but do some processing work and return the result. Piscina is made for better utilization of the CPU. For tasks that a single Node.js process can already handle concurrently, such as DB queries and API calls, you don't need this module.

Such CPU-bound tasks could include image processing, encryption and decryption, or parsing data. The worker can be implemented as a sync or async function. The result can be returned to the main thread or, for example, uploaded to the cloud, whatever is needed.
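
For example, a worker that hashes text with Node's built-in crypto module could look like this (the hashing task and the file name are my own illustration, not part of the original post):

const crypto = require('crypto');

// A synchronous export works as well; Piscina resolves whatever the function returns.
module.exports = ({ text }) => {
  return crypto.createHash('sha256').update(text).digest('hex');
};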

In the main thread, to use the worker module, you do the following:

const Piscina = require('piscina');

const workerPool = new Piscina({
  filename: __dirname + '/worker.js'
});

(async function() {
  const result = await workerPool.runTask({ a: 4, b: 6 });
  console.log(result);  // Prints 10
})();

This is basically it. The worker pool variable can be named anything, and .runTask can be called, say, in an API handler of Express or GraphQL. The argument has to be a single object, but it can have any number of props and any depth.
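
As a quick sketch, assuming an Express app and a made-up /add route (neither is from the original post), calling the pool from a handler could look like this:

const express = require('express');
const Piscina = require('piscina');

const app = express();
const workerPool = new Piscina({ filename: __dirname + '/worker.js' });

app.get('/add', async (req, res) => {
  // The CPU work runs in the pool, so the event loop stays free for other requests.
  const result = await workerPool.runTask({ a: Number(req.query.a), b: Number(req.query.b) });
  res.json({ result });
});

app.listen(3000);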

For configuration, you can pass more options into the Piscina constructor. And the options did not let me down: they let you pick the number of threads, tune pooling behavior to save some memory, and limit the workers' memory and processing time. Really, everything I could think of that I might want the library to do differently had a reasonable configuration option available.
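
Here is a minimal sketch of some of those options (the exact names and defaults may differ between Piscina versions, so check the documentation before relying on this):

const Piscina = require('piscina');

const workerPool = new Piscina({
  filename: __dirname + '/worker.js',
  minThreads: 2,        // keep at least two workers warm
  maxThreads: 4,        // never spawn more than four threads
  idleTimeout: 30000,   // milliseconds an idle worker may live before it is destroyed
  maxQueue: 100,        // reject new tasks once 100 are already queued
  resourceLimits: {
    maxOldGenerationSizeMb: 64  // cap each worker's heap size
  }
});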

Future

I think this will open up many options to improve the performance of Node.js applications.

For the txml XML parser, however, I decided not to integrate the module, because when it is used by the application developer, even more CPU-heavy processing of the data can be moved from the main thread into the worker.

What do you think of Piscina? Do you have an idea what you could use it for?


Top comments (1)

starpebble:
Thanks for putting light on Piscina. Please kindly allow me to think out loud: anything that frees up the nodejs event loop to respond to interaction is for the better. πŸ”§
