
David-Micheal
UNDERSTANDING STREAMING IN NODE Js

DEFINITION

Streams are objects that allow developers to read and write data to and from a source in a continuous manner. There are four main types of streams in Node.js: readable, writable, duplex, and transform. Each stream is an EventEmitter instance that emits different events at different points in its lifecycle.

THE FOUR TYPES OF STREAMS AND WHAT THEY DO

  1. A readable stream is a stream used for read operations.
  2. A writable stream, as the name suggests, is a stream used for write operations.
  3. A duplex stream is a stream that performs both read and write operations.
  4. A transform stream is a stream that uses its input to compute an output.

Since streams are EventEmitter instances, they emit several events that can be used to track and monitor them. Some of the most commonly used events are:

  1. data - emitted when readable data is available.
  2. finish - emitted when the stream is done writing data.
  3. error - emitted when an error occurs while reading or writing data.
  4. end - emitted when the read stream has finished reading data.

If you have already worked with Node.js, you may have come across streams. For example, in a Node.js based HTTP server, the request is a readable stream and the response is a writable stream. You might have used the fs module, which lets you work with both readable and writable file streams. Whenever you use Express, you are using streams to interact with the client, and streams are used by every database connection driver you can work with, because TCP sockets, the TLS stack, and other connections are all based on Node.js streams.

Reading from a stream:-

Handling streams is quite simple: all we have to do is create the stream and handle the events it emits.

Async iterator

It’s highly recommended to use an async iterator when working with streams. According to Dr. Axel Rauschmayer, asynchronous iteration is a protocol for retrieving the contents of a data container asynchronously (meaning the current “task” may be paused before retrieving an item). It’s also worth mentioning that the stream async iterator implementation uses the 'readable' event internally. You can use an async iterator when reading from readable streams:

Two Reading Modes:-

According to the Streams API, readable streams effectively operate in one of two modes: flowing and paused. A readable stream can be in object mode or not, regardless of whether it is in flowing mode or paused mode.
• In flowing mode, data is read from the underlying system automatically and provided to an application as quickly as possible using events via the EventEmitter interface.
• In paused mode, the stream.read() method must be called explicitly to read chunks of data from the stream.

Writing to a stream:-

Now that we know how to read data from a stream, let's look at how to write to one. The procedure is similar to reading; the only difference is that this time we create a write stream instead.

How to create a writable stream:-

To write data to a writable stream you need to call write() on the stream instance, as in the following example:

```javascript
const fs = require('fs');

const readableStream = fs.createReadStream('file1.txt');
const writableStream = fs.createWriteStream('file2.txt');

readableStream.setEncoding('utf8');

readableStream.on('data', function (chunk) {
  writableStream.write(chunk);
});
```

The above code is straightforward. It simply reads chunks of data from an input stream and writes them to the destination using write(). This function returns a boolean value indicating whether it is safe to keep writing. If it returns true, the internal buffer still has room and you can keep writing more data. If it returns false, the internal buffer is full; this is backpressure, not an error, and it means you should pause writing for the moment. The writable stream will let you know when you can start writing more data by emitting a 'drain' event.
Calling the writable.end() method signals that no more data will be written to the writable stream. If provided, the optional callback function is attached as a listener for the 'finish' event. Using a writable stream you can write out data that was read from a readable stream, and you can also use async iterators to write to a writable stream, which is recommended.

Piping streams:-

Piping is a mechanism where we provide the output of one stream as the input to another. The mechanism is quite simple: initialize the read and write streams, then call the pipe method of the read stream to pipe it into the write stream. There is no limit on piping operations; in other words, piping can be used to process streamed data in multiple steps.
The Stream Module:-

The Node.js stream module provides the foundation upon which all streaming APIs are built. The stream module is a native module shipped by default with Node.js. A stream is an instance of the EventEmitter class, which handles events asynchronously in Node; because of this, streams are inherently event-based. The stream module is useful for creating new types of stream instances, but it is usually not necessary to use it directly just to consume streams.

Conclusion:-
In this article, we covered the basics of streams. Streams, pipes, and chaining are among the core and most powerful features of Node.js, and they can help you write neat and performant code to perform I/O. You should now understand how to use streams to read and write data in files, and how to handle the events streams emit. I hope this information was helpful. Thank you!
