<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Adarsh Kumar</title>
    <description>The latest articles on DEV Community by Adarsh Kumar (@adarsh12).</description>
    <link>https://dev.to/adarsh12</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F999590%2F77cadabf-93db-4bb2-a263-42aaf08412c5.jpeg</url>
      <title>DEV Community: Adarsh Kumar</title>
      <link>https://dev.to/adarsh12</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/adarsh12"/>
    <language>en</language>
    <item>
      <title>Understanding Streams in Nodejs</title>
      <dc:creator>Adarsh Kumar</dc:creator>
      <pubDate>Tue, 03 Jan 2023 15:15:22 +0000</pubDate>
      <link>https://dev.to/adarsh12/understanding-steams-in-nodejs-4mj5</link>
      <guid>https://dev.to/adarsh12/understanding-steams-in-nodejs-4mj5</guid>
      <description>&lt;p&gt;&lt;strong&gt;NodeJs&lt;/strong&gt; is known for its asynchronous non-blocking behavior. As a server it comes up with various functionality to handle-data efficiently. Streams is one of those concepts offered by Nodejs that help us to deal with memory and data effectively. &lt;/p&gt;

&lt;p&gt;Let me give you an example.&lt;/p&gt;

&lt;p&gt;Suppose a client has requested 2 GB of data from the database (say, a video file). If the whole file is sent to the server at once, it will consume 2 GB of memory on the server. With 100 GB of data, it becomes impossible for the server to hold that much in memory at once.&lt;/p&gt;

&lt;p&gt;This is where the concept of streams comes in. Streams enable the server to accept and send data part by part, which saves memory on the server and transfers data efficiently.&lt;/p&gt;

&lt;p&gt;In real life we have all experienced the streaming mechanism. On YouTube, or any other video streaming platform, we have seen this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkkl6h98jjsp6vnulwcqg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkkl6h98jjsp6vnulwcqg.png" alt="Image description" width="800" height="457"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As shown in the picture, the whole video is not loaded from the server at once; instead, it is loaded chunk by chunk.&lt;/p&gt;

&lt;p&gt;The main advantages of using streams are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Lower memory consumption&lt;/li&gt;
&lt;li&gt;Better time efficiency&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;NodeJs provides four types of streams:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Readable&lt;/li&gt;
&lt;li&gt;Writable&lt;/li&gt;
&lt;li&gt;Duplex&lt;/li&gt;
&lt;li&gt;Transform&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let us discuss each of these types in depth.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Readable&lt;/strong&gt; - a stream from which data can be read. fs.createReadStream(), provided by the file system module, allows us to read the contents of a file.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Writable&lt;/strong&gt; - a stream to which data can be written. fs.createWriteStream() allows us to write to a file.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Duplex&lt;/strong&gt; - a stream that is both readable and writable at once. net.Socket() supports such two-way operations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Transform&lt;/strong&gt; - a Duplex stream whose output is computed from its input. Using it comprises three steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Creating a read stream&lt;/li&gt;
&lt;li&gt;Performing some operation on the data read from the file (the transform)&lt;/li&gt;
&lt;li&gt;Writing the transformed data into a writable stream&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnjjye2oul5oymexpcn7b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnjjye2oul5oymexpcn7b.png" alt="Image description" width="800" height="272"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The above code example depicts how Readable and Writable streams are created and used. To understand it, you should know about events and the EventEmitter in NodeJs.&lt;br&gt;
&lt;a href="https://www.geeksforgeeks.org/node-js-eventemitter/" rel="noopener noreferrer"&gt;&lt;strong&gt;Here&lt;/strong&gt;&lt;/a&gt;&lt;br&gt;
You can check out this article for a better understanding.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7fzq1q7b1d2uwfkzxupo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7fzq1q7b1d2uwfkzxupo.png" alt="Image description" width="800" height="410"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is how a Duplex stream is implemented. &lt;strong&gt;PassThrough&lt;/strong&gt; is a basic Duplex stream that acts as a tunnel through which we pipe our Readable stream into our Writable stream.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuuayn03pxv8evh220vf2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuuayn03pxv8evh220vf2.png" alt="Image description" width="672" height="388"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It shows a transform function that will run between the read and write operations.&lt;/p&gt;

&lt;h2&gt;
  
  
  Bonus Topic - Pipeline
&lt;/h2&gt;

&lt;p&gt;A pipeline is a mechanism that guides the flow of data: it takes data from one stream and puts it into another.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu1r0i7nvw4ccwijzrek3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu1r0i7nvw4ccwijzrek3.png" alt="Image description" width="800" height="461"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the pipeline implementation, pipeline() takes the streams as arguments, followed by a callback function at the end to catch any error.&lt;/p&gt;

&lt;p&gt;That's all about NodeJs streams. Thank you for reading this blog!&lt;/p&gt;

</description>
      <category>bootstrap</category>
      <category>tailwindcss</category>
      <category>discuss</category>
    </item>
  </channel>
</rss>
