<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Ali Amjad </title>
    <description>The latest articles on DEV Community by Ali Amjad  (@ali_a_koye).</description>
    <link>https://dev.to/ali_a_koye</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F860060%2Fc33c7977-bbc4-44e4-9e77-37d2df379e08.jpeg</url>
      <title>DEV Community: Ali Amjad </title>
      <link>https://dev.to/ali_a_koye</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/ali_a_koye"/>
    <language>en</language>
    <item>
      <title>A journey of creating a package on NPM ( Fast File Converter )</title>
      <dc:creator>Ali Amjad </dc:creator>
      <pubDate>Mon, 29 Aug 2022 19:11:28 +0000</pubDate>
      <link>https://dev.to/ali_a_koye/a-journey-of-creating-a-package-on-npm-fast-file-converter--5ekp</link>
      <guid>https://dev.to/ali_a_koye/a-journey-of-creating-a-package-on-npm-fast-file-converter--5ekp</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In this article, I would love to discuss the journey I had building a package and publishing it to npm (the Node package manager).&lt;/p&gt;

&lt;p&gt;Yesterday, I published the Fast File Converter package. It's an achievement to give back some of what I have learned to the community, since I have gained so much from it (all of us have).&lt;/p&gt;

&lt;p&gt;This article focuses on the codebase, along with tips and notes that may help you when you create your next package.&lt;/p&gt;




&lt;h2&gt;
  
  
  Pdf Excel Generator vs. Fast File Converter
&lt;/h2&gt;

&lt;p&gt;Last year, I published a package under the name "Pdf Excel Generator"; you can find the &lt;a href="https://github.com/Ali-A-Koye/pdf-excel-generator"&gt;archived GitHub repository&lt;/a&gt;. As of today, the &lt;a href="https://www.npmjs.com/package/pdf-excel-generator"&gt;package has been deprecated on npm&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Fast File Converter is an upgrade of Pdf Excel Generator; PEG had several limitations:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;It was written in plain JS.&lt;/li&gt;
&lt;li&gt;The code and the architecture were poor.&lt;/li&gt;
&lt;li&gt;It supported only 2 types (PDF &amp;amp; Excel).&lt;/li&gt;
&lt;li&gt;There was no room to grow.&lt;/li&gt;
&lt;li&gt;Input validation was weak.&lt;/li&gt;
&lt;li&gt;The documentation was insufficient.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With FFC, however, I have opened a door to more possibilities:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;It's written in TS.&lt;/li&gt;
&lt;li&gt;TS, with good practices and types, makes great source code easier to achieve.&lt;/li&gt;
&lt;li&gt;It now supports 7 different types.&lt;/li&gt;
&lt;li&gt;The code &amp;amp; the architecture leave a lot of room to grow and scale up in the future.&lt;/li&gt;
&lt;li&gt;With TS declaration files, you can enjoy input validation from TS.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Overall, that is a huge boost and improvement over the previous package.&lt;/p&gt;




&lt;h2&gt;
  
  
  Fast File Converter (FFC) Codebase Architecture
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--A9oq2s1g--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/oc2bsp2szkx3mcy6n0io.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--A9oq2s1g--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/oc2bsp2szkx3mcy6n0io.png" alt="Image description" width="606" height="603"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The image above shows the folder architecture of FFC; it's open source under the MIT license and &lt;a href="https://github.com/Ali-A-Koye/fast-file"&gt;you can find it on GitHub&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Demo Folder:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If you want to create a package, you should provide examples that help developers understand how to use it.&lt;/p&gt;

&lt;p&gt;I've created a &lt;a href="https://github.com/Ali-A-Koye/fast-file/tree/master/demo"&gt;Demo Folder&lt;/a&gt; with examples suited to my package. By "suitable" I mean the examples should match your package; in my case, I built an Express application that runs a server, uses this package, and downloads the generated file when you hit the example routes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Dist Folder:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For those who don't know what a dist folder is: &lt;a href="https://github.com/Ali-A-Koye/fast-file/tree/master/dist"&gt;the dist (distribution) folder&lt;/a&gt; is the directory I specified for the TypeScript build output, so basically it's the compiled build of the project.&lt;/p&gt;

&lt;p&gt;It may seem odd that I committed this folder; pushing your build to GitHub is usually not recommended, but:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;If your package runs in the browser (client side), the build must be there, since only JS runs in the browser.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;My Demo folder imports the package from the dist folder rather than from npm, so the demos double as a development tool for seeing your changes right away.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
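&lt;p&gt;For reference, a minimal tsconfig.json that sends the compiled output to a dist folder might look like this (the exact options in FFC may differ; this is only a sketch):&lt;/p&gt;

```
{
  "compilerOptions": {
    "target": "ES2018",
    "module": "commonjs",
    "declaration": true,
    "outDir": "./dist",
    "strict": true
  },
  "include": ["lib"]
}
```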

&lt;p&gt;&lt;strong&gt;Lib Folder:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://github.com/Ali-A-Koye/fast-file/tree/master/lib"&gt;lib (library) folder&lt;/a&gt; is the directory where all the core TS files implementing the package's functionality live.&lt;/p&gt;

&lt;p&gt;This is arguably the most important folder, because changing the package's source code usually means working only here.&lt;/p&gt;

&lt;p&gt;In the end, the TS files in this directory are compiled to JS, ready for production.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Script Folder:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We cannot do everything in TS code; at some point you hit limits on what it can achieve.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/Ali-A-Koye/fast-file/tree/master/script"&gt;The scripts folder&lt;/a&gt; is for my Bash files: plain text files that contain a series of commands, usually for interacting with the OS.&lt;/p&gt;

&lt;p&gt;You may ask what I use Bash for in this project. I have only 2 Bash scripts:&lt;/p&gt;

&lt;p&gt;The first copies my assets into the build folder when a build happens.&lt;br&gt;
The second copies the Types folder into the build folder when a build happens, so that the file declarations work properly.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Test Folder:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Tests are essential and important for any project regardless of what type of test you are writing.&lt;/p&gt;

&lt;p&gt;Even though this is a package and unit tests might make more sense, I thought integration tests would be more beneficial in my case, since I need to respond back to the user.&lt;/p&gt;

&lt;p&gt;So I have written integration tests inside the &lt;a href="https://github.com/Ali-A-Koye/fast-file/tree/master/test"&gt;Test Folder&lt;/a&gt; to exercise the demo routes and make sure that all of the supported types work properly without any errors.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Types Folder:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/Ali-A-Koye/fast-file/tree/master/types"&gt;The types folder&lt;/a&gt; is where my custom types live. Any TS project should have one, so that your types are not mixed with the application logic; I can tell you, types mixed in with the code become a mess very quickly.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Rest of the Architecture&lt;/strong&gt;&lt;br&gt;
There is not much left to discuss; the util folder is for utility functions, but this project has none worth mentioning.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;.npmignore specifies which parts of your project are delivered when someone runs npm install. It's a critical file, because you don't want your users to end up with the whole repository in their node modules.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;LICENSE specifies which license you are using. There are many options, but most open-source projects go with MIT because it grants broad rights to use the package.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;readme.md is where you should carefully describe your package. It's the entry point when someone finds your package, so explain &amp;amp; document it carefully and completely.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
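&lt;p&gt;As an illustration only (the actual FFC file may differ), an .npmignore that keeps demos, tests, and TS sources out of the published package could look like:&lt;/p&gt;

```
# hypothetical .npmignore - ship the build, skip everything else
demo/
lib/
script/
test/
tsconfig.json
```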




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Thanks for reading this article :)&lt;/p&gt;

</description>
      <category>node</category>
      <category>opensource</category>
      <category>npm</category>
      <category>architecture</category>
    </item>
    <item>
      <title>MySQL's Data Streaming: What is it &amp; How does it work?</title>
      <dc:creator>Ali Amjad </dc:creator>
      <pubDate>Sat, 11 Jun 2022 09:27:35 +0000</pubDate>
      <link>https://dev.to/ali_a_koye/mysqls-data-streaming-what-is-it-how-it-works--654</link>
      <guid>https://dev.to/ali_a_koye/mysqls-data-streaming-what-is-it-how-it-works--654</guid>
      <description>&lt;p&gt;In the article, I will be discussing one of the most interesting features of MySQL that you gonna need all the time and discuss how it works in the background. So let's begin 😋&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;I - Introduction to Data Streaming&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;So let's begin with: what is data? It's information that has been translated into a form that is efficient for movement or processing. Relative to today's computers and transmission media, data is information converted into binary digital form.&lt;br&gt;
According to some statistics, more than 18 billion text messages are created across the globe every day, so data is constantly being created and transferred between computers all over the world.&lt;br&gt;
Data can be transferred by multiple methods; to understand the streaming method of transfer, we first need to understand the traditional way and learn why we need something like streaming.&lt;/p&gt;

&lt;p&gt;A common way to transfer a file requires all of its packets to arrive at the destination before they can be reassembled; for example, when you send an image, every bit of it must be delivered before it can be displayed.&lt;/p&gt;

&lt;p&gt;However, when you watch a video, do you wait for the full file to download before it plays? Not all of us do; that's why we have streaming services like YouTube, Netflix, and others that start playing right away, and that's where the idea of streaming comes into play.&lt;/p&gt;

&lt;p&gt;What makes streams unique is that instead of a program reading a file into memory all at once, as in the traditional way, streams read chunks of data piece by piece, processing the content without keeping it all in memory.&lt;br&gt;
Instead of waiting for a 100 MB video to load, we can get it chunk by chunk, loading each 1 MB as it is consumed from the stream and displaying it right away.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;II - What do we mean by Database's Data Streaming?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now that we have a clear understanding of what data streaming means, our next topic is how it helps us in the database world.&lt;/p&gt;

&lt;p&gt;Let's say we have a SQL table with some data in it (roughly 1000 rows), and you run a select statement to retrieve it for some O(N) calculation; it's fast enough that you don't feel any delay in your report.&lt;/p&gt;

&lt;p&gt;However, let's say we have 1B rows: your select statement takes considerable time, and your O(N) calculation takes extra time on top, because it has to wait for all the rows to be retrieved before it can start going through them.&lt;/p&gt;

&lt;p&gt;Now the fun part: how can we improve this? Yes, you got it right; let's see how streams help with this select statement.&lt;br&gt;
Instead of waiting for 1 billion rows, we fetch them one by one: as each row arrives from the DB, we run our calculation on it right away, processing the data chunk by chunk, and send the result back to the user before receiving the next chunk.&lt;/p&gt;

&lt;p&gt;By the end of the fetching, 1B calculated rows have been sent back to the user without the user waiting at all, and we only optimized an O(N) calculation here; the improvement is even bigger for more complex calculations.&lt;/p&gt;

&lt;p&gt;Database streaming reduces the user's waiting time and optimizes your calculations over huge amounts of data.&lt;/p&gt;

&lt;p&gt;Note: this explanation is for our article only, and we go deeper in the next sections. Database streaming also has a huge impact in data science, for building pipelines and data lakes, but that's for another article.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;III - Cases you can solve with MySQL's Data Streaming&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Streaming your database's data has many applications. I usually use it for business-required calculations and reports that run over billions of rows and must complete quickly.&lt;/p&gt;

&lt;p&gt;Maybe you are creating an Excel file or a PDF report from big data; inserting rows into them as they are fetched is much faster.&lt;/p&gt;

&lt;p&gt;You may want a video player that stores binary data in a database and streams it back to the user, or a gallery whose images are fetched from the DB and displayed one by one; there are many more applications.&lt;/p&gt;

&lt;p&gt;If you are a data scientist building a pipeline yourself to migrate data between two databases, you can stream the data daily to keep them in sync; or, if you have a data lake and need to transform the data, you can modify it chunk by chunk while streaming.&lt;/p&gt;

&lt;p&gt;In short, you can improve your current structure to be much faster for any case.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;IV - Introduction to Knex.js (NPM package for Node.js)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before coding our streamer, we will be using the Knex.js library as our query builder.&lt;/p&gt;

&lt;p&gt;Knex.js (You know they also pronounce the "K" in the front? lol) is a "batteries included" SQL query builder for PostgreSQL, CockroachDB, MSSQL, MySQL, MariaDB, SQLite3, Better-SQLite3, Oracle, and Amazon Redshift designed to be flexible, portable, and fun to use.&lt;/p&gt;

&lt;p&gt;It provides a beautiful way to use SQL in Node.js; you can refer to the official documentation to learn more about this amazing product.&lt;/p&gt;

&lt;p&gt;You can check the documentation Here: &lt;a href="https://knexjs.org/" rel="noopener noreferrer"&gt;Knex.js Documentation&lt;/a&gt;&lt;br&gt;
and You can check the Streaming Documentation: &lt;a href="https://knexjs.org/guide/interfaces.html#streams" rel="noopener noreferrer"&gt;Knex.js Stream Documentation&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;One of Knex.js's features is streams for MySQL: you can stream your query results and modify them with JavaScript in a very easy way.&lt;/p&gt;

&lt;p&gt;However, Knex uses the streams provided by the original mysql npm package for Node.js, made by Felix Geisendörfer and his amazing team, which made it easy for Knex to adopt streaming in their library. We will discuss how the mysql package achieves this in later sections.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;V - Introduction to Streaming in Node.js&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One last thing worth mentioning is the Node.js stream module. It plays its own role in the implementation we will do in the next section, alongside the functionality MySQL provides, so let's briefly explain what a Node.js stream is.&lt;/p&gt;

&lt;p&gt;The stream module is a native module shipped by default with Node.js. A stream is an instance of the EventEmitter class, which handles events asynchronously in Node.js; due to this superclass, streams are inherently event-based.&lt;/p&gt;

&lt;p&gt;There are 4 types of streams in Node.js:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Writable&lt;/strong&gt;: used to write data sequentially&lt;br&gt;
&lt;strong&gt;Readable&lt;/strong&gt;: used to read data sequentially&lt;br&gt;
&lt;strong&gt;Duplex&lt;/strong&gt;: used to both read and write data sequentially&lt;br&gt;
&lt;strong&gt;Transform&lt;/strong&gt;: where data can be modified while being written or read. Take compression as an example: with a stream like this, you can write compressed data and read decompressed data.&lt;/p&gt;

&lt;p&gt;This is briefly about the node.js streaming module, for more information you can read all about streams at &lt;a href="https://nodejs.org/api/stream.html" rel="noopener noreferrer"&gt;Node.js Official documentation.&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;VI - Implementation of MySQL's Data Streaming with Node.js&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this section, we will be coding our streamer with the Knex.js Package, so let's begin right away.&lt;/p&gt;

&lt;p&gt;As a side note, basic knowledge of Node.js and the Knex.js package is required, because I will focus only on the stream throughout the coding.&lt;/p&gt;

&lt;p&gt;First, I will create a file called "stream.js" with an async function called "sample" that will serve as our sample in this article.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;database&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;./database/connection&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;


 &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;sample&lt;/span&gt;&lt;span class="p"&gt;(){&lt;/span&gt;

    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Hi , This is a sample function&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

 &lt;span class="p"&gt;}&lt;/span&gt;

 &lt;span class="nf"&gt;sample&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;



&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;We have the Knex MySQL connection at the top, and I can run this file with "node stream.js".&lt;/p&gt;

&lt;p&gt;Then I will create a SQL table that we can write queries against; I will quickly write a migration for it and add some test data.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;

&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="nv"&gt;`sample`&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="nv"&gt;`id`&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt; &lt;span class="nb"&gt;unsigned&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt; &lt;span class="n"&gt;AUTO_INCREMENT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="nv"&gt;`name`&lt;/span&gt; &lt;span class="nb"&gt;varchar&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt; &lt;span class="k"&gt;DEFAULT&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;`id`&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt; 


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;I added some dummy data to this table, around 3000 records:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;

&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;sample&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="o"&gt;//&lt;/span&gt;&lt;span class="k"&gt;returns&lt;/span&gt; &lt;span class="mi"&gt;3000&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now I will use the Knex.js stream function to process each row as it is fetched:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;database&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;./database/connection&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;sample&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Started At :&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;toISOString&lt;/span&gt;&lt;span class="p"&gt;());&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;database&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sample&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;select&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stream&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;stream&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nx"&gt;stream&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;data&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;row&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;row&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;`At : &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;toISOString&lt;/span&gt;&lt;span class="p"&gt;()}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Ended At :&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;toISOString&lt;/span&gt;&lt;span class="p"&gt;());&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="nf"&gt;sample&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;



&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Running this code prints the rows as they are fetched from the DB, along with the exact start and end dates, so you can see the difference between them.&lt;/p&gt;

&lt;p&gt;What did we do? We simply wrote a select for this table, used the .stream function provided by Knex.js, and then listened for "data", which fires as each row arrives; there are other events, such as "error", to handle errors occurring in the stream.&lt;/p&gt;

&lt;p&gt;This is an example of the output :&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmixg2dv0tuifq5uo8zdu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmixg2dv0tuifq5uo8zdu.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;VII - How MySQL's Data Streaming works and How it's implemented with Node.js?&lt;/strong&gt; 😱&lt;/p&gt;

&lt;p&gt;Finally, let's discuss how this works in the background and how this stream works behind the scenes.&lt;/p&gt;

&lt;p&gt;First, Knex.js is a query builder that only provides a stream interface; in other words, it adds another layer on top of the feature to make it easier for programmers to use, which is why it's somewhat difficult to learn how it works behind the scenes from the Knex.js documentation alone.&lt;/p&gt;

&lt;p&gt;The stream feature originally comes from the mysql Node.js client package, which Knex.js depends on for MySQL. The mysql package gives a brief account of how it works in its documentation; you can read about it under &lt;a href="https://www.npmjs.com/package/mysql#streaming-query-rows" rel="noopener noreferrer"&gt;MySQL stream&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;However, up to this point it remains unclear whether this is native MySQL functionality or something made possible only by Node.js.&lt;/p&gt;

&lt;p&gt;So let's dive deeper into how this part is coded; we may find a lead there.&lt;/p&gt;

&lt;p&gt;mysql for Node.js is an open-source package, so you can see how it's built in its &lt;a href="https://github.com/mysqljs/mysql" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt;; after wandering around for some time, you will find some leads in the implementation.&lt;/p&gt;

&lt;p&gt;If you look into the &lt;a href="https://github.com/mysqljs/mysql/blob/master/lib/protocol/sequences/Query.js" rel="noopener noreferrer"&gt;file where this code lives&lt;/a&gt;, you will see that they use a MySQL text-protocol command called "COM_QUERY" to make this work. So let's dive deeper into what this protocol does.&lt;/p&gt;

&lt;p&gt;COM_QUERY (SELECT statement message parsing) is one of MySQL's text-protocol commands. Let's focus on what we mean by a text protocol, comparing it to a binary protocol:&lt;br&gt;
The difference is really whether the protocol is oriented around data structures or text strings; for example, HTTP is a text protocol, even though when it sends a JPEG image it just sends the raw bytes, not a text encoding of them.&lt;br&gt;
So basically, with text protocols in MySQL, we can send and receive data without any special encodings, and the benefit of COM_QUERY is that we can parse the text to extract what we need.&lt;br&gt;
You can find a list of MySQL's &lt;a href="https://dev.mysql.com/doc/internals/en/client-server-protocol.html" rel="noopener noreferrer"&gt;Communication Protocols&lt;/a&gt; and a list of &lt;a href="https://dev.mysql.com/doc/internals/en/text-protocol.html" rel="noopener noreferrer"&gt;MySQL Text Protocols&lt;/a&gt; in the official documentation.&lt;/p&gt;

&lt;p&gt;Back to COM_QUERY; let's get into more advanced detail on how it works:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1) Client Command (Client Side):&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A COM_QUERY packet sends the server a text-based query that is executed immediately; in other words, when you provide that "SELECT *" and chain it to the stream function, the query is sent to the server and execution starts right away.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2) The MySQL server responds with one of 4 possible packets&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;1- If there is an execution error, such as a SQL syntax error, the ERR packet is returned.&lt;/p&gt;

&lt;p&gt;2- If the execution succeeds but no data is found, the OK packet is returned.&lt;/p&gt;

&lt;p&gt;3- If the client executes &lt;code&gt;LOAD DATA LOCAL INFILE 'filename' INTO TABLE &amp;lt;table&amp;gt;&lt;/code&gt;, LOCAL_INFILE_REQUEST is returned.&lt;/p&gt;

&lt;p&gt;4- If a result set is returned (meaning some data was found), the delivered packet is a Resultset.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5n4qf0r1udtkn6mqe0u5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5n4qf0r1udtkn6mqe0u5.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3) Let's focus on the Result Set&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A result set means data was found, and it means we will receive a sequence of packets.&lt;/p&gt;

&lt;p&gt;The result set we receive is the combination of two parts:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;First come the column definitions&lt;/strong&gt;, which contain information about the columns, their data types, and schema details in general.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Then come the rows&lt;/strong&gt;; each row arrives as its own packet.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The structure of the ResultSet response packet is as follows:&lt;/p&gt;

&lt;p&gt;1 - The first packet is the column-count packet.&lt;/p&gt;

&lt;p&gt;2 - It is followed by n field description packets, one packet per field. After all the field descriptions, an EOF packet or OK packet is sent as the separator between the field definitions and the data (rows).&lt;/p&gt;

&lt;p&gt;3 - Next come the row data packets, one packet per row, each consisting of a packet header and the message body.&lt;/p&gt;

&lt;p&gt;4 - Finally, a terminating packet, which again may be an EOF or OK packet.&lt;/p&gt;

&lt;p&gt;In short, after the COM_QUERY has been executed, we receive a sequence of data packets, each of which is a row.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fppt057g83mnwip74oo6o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fppt057g83mnwip74oo6o.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So let's focus on how the MySQL package uses this: after you provide a query string, it sends a COM_QUERY to the server, receives the row packets, parses each packet to extract the fetched row, and uses Node.js streams to emit the rows as events right away, so you can listen on "data" to receive them. &lt;/p&gt;

&lt;p&gt;Well, that was a lot of theoretical explanation xD. But now we have a better understanding that this works because of the MySQL protocol.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;VIII - Database Data Streaming with Other Databases and Languages&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;As for databases, I am not sure how many others support this; my research and playground covered only MySQL.&lt;/p&gt;

&lt;p&gt;As for languages other than Node.js: Node's streams module simply makes this friendlier. It is achievable in other languages as well; I have seen Java examples doing the same thing.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;IX - Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this article we discussed a lot about streams; now you should have some level of understanding of what streams are in databases. We have provided some examples and a deep explanation of what happens in the background.&lt;br&gt;
Please refer back to the MySQL documentation or any of the links provided above to learn more.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;X - Thank You&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Thanks for reading this article; it is one of the articles I most enjoyed writing.&lt;br&gt;
Hopefully, you gained new information reading through it.&lt;/p&gt;

&lt;p&gt;You can go through my account for more articles.&lt;/p&gt;



&lt;br&gt;

</description>
      <category>mysql</category>
      <category>node</category>
      <category>database</category>
      <category>datascience</category>
    </item>
    <item>
      <title>A Logger : What to consider when creating a Logger with Node.js</title>
      <dc:creator>Ali Amjad </dc:creator>
      <pubDate>Mon, 06 Jun 2022 14:28:07 +0000</pubDate>
      <link>https://dev.to/ali_a_koye/a-logger-what-to-consider-when-creating-a-logger-with-nodejs-25ed</link>
      <guid>https://dev.to/ali_a_koye/a-logger-what-to-consider-when-creating-a-logger-with-nodejs-25ed</guid>
<description>&lt;p&gt;Lately at work, we were facing an issue: the systems interacting with each other had grown more complicated, and it was getting more and more difficult to track the flow between them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;We needed more visibility into these systems, which was planned to help us in areas such as:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;1- Easier debugging when something fails&lt;br&gt;
2- Easier identification of issues occurring at run time&lt;br&gt;
3- Reduced debugging time&lt;/p&gt;

&lt;p&gt;So that's how we decided on a logger: we needed a general and centralized logger that would give us more visibility.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What can you log?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To be honest, you can log as many pieces of information as you think are useful.&lt;br&gt;
The way to design your own logger is to store what you need; in our case:&lt;/p&gt;

&lt;p&gt;1 - We are keeping track of a record that changes between multiple systems, storing its status changes. &lt;/p&gt;

&lt;p&gt;2 - We are storing the execution flow of the code and the parameters passed between systems.&lt;/p&gt;

&lt;p&gt;3 - We are doing some heavy calculations and storing the results, so it will be easier to build more complicated reports from this logger.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;In your case:&lt;/strong&gt;&lt;br&gt;
You can track your users across your systems, track requests, track system behavior and activities, and much more.&lt;/p&gt;

&lt;p&gt;A logger can also inform your business decisions; you can feed the logs into BI tools or data science pipelines.&lt;/p&gt;

&lt;p&gt;However, there are some things to consider while creating a logger in Node.js:&lt;br&gt;
you may end up storing too much information and generating too much I/O against a device or database.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1 - Don't Block your Request-Response Cycle and Main Thread ( Node.js )&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;A) Don't "await" each of your logger inserts one by one; execution will stop at each line until it succeeds before moving to the next. &lt;br&gt;
Instead, parallelize them with Promise.all() and use a single await; the tasks will run concurrently.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;Await&lt;/span&gt; &lt;span class="nb"&gt;Promise&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;all&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="nx"&gt;promise1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;promise2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;promise3&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;B) Let's make this better. Did you know the code won't proceed past that line until all of the parallel tasks have succeeded?  &lt;/p&gt;

&lt;p&gt;Node.js is single-threaded, but it hands I/O off to the background so that it runs without blocking your code. Hmm, let's turn this to our benefit, but how?&lt;br&gt;
In Node.js we have .then() as well as await to resolve promises. So instead of awaiting and blocking until the task completes, just attach a .then() to it; the promise still resolves, but the cool part is that code execution continues with other things while this specific task runs in the background.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nb"&gt;Promise&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;all&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="nx"&gt;promise1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;promise2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;promise3&lt;/span&gt;&lt;span class="p"&gt;]).&lt;/span&gt;&lt;span class="nx"&gt;then&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;values&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;values&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;C) That's really cool: now you have a non-blocking logger that runs in the background. But the request still won't finish quickly if the logging work is kicked off before the response is sent and takes too much time.&lt;/p&gt;

&lt;p&gt;There is a solution for that: run your loggers after the response has been sent back to the client. In Node.js, the code continues executing even after the request has been ended, which is exactly what we want so users don't wait.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;send&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Hello World!&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;//DO SOMETHING HERE &lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;2 - If you are storing logs in a database, don't overwhelm it with too many individual inserts; best practice is to use batch inserts or batch updates.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Alternatively, you can use queues to smooth the load and reduce database CPU usage ( if it easily hits its maximum ).&lt;/p&gt;

&lt;p&gt;Example: AWS SQS service&lt;/p&gt;
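&lt;p&gt;As a sketch of the batching idea: collect log rows in memory and flush them in one batch insert instead of one INSERT per log line. The persistBatch callback here is a hypothetical stand-in for your DB client's bulk insert ( e.g. a knex insert with an array ):&lt;/p&gt;

```javascript
// Minimal in-memory log batcher: buffers entries and flushes them
// as one batch once maxSize is reached, or on demand via flush().
function createLogBatcher(persistBatch, maxSize = 100) {
  let buffer = [];
  return {
    log(entry) {
      buffer.push(entry);
      if (buffer.length >= maxSize) this.flush();
    },
    flush() {
      if (buffer.length === 0) return;
      const batch = buffer;
      buffer = [];
      persistBatch(batch); // e.g. knex('logs').insert(batch)
    },
  };
}

// Demo: with maxSize 3, four entries produce two batches.
const flushed = [];
const batcher = createLogBatcher((batch) => flushed.push(batch), 3);
['a', 'b', 'c', 'd'].forEach((msg) => batcher.log({ msg }));
batcher.flush(); // flush the remainder
console.log(flushed.length); // 2
```

&lt;p&gt;In a real service you would also flush on a timer and on process shutdown, so buffered entries are never lost.&lt;/p&gt;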

&lt;p&gt;&lt;strong&gt;3- Don't overwhelm your Server with too Many HTTP Requests&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If you are storing a user's location, it's a bad idea to ping the server every second to store a log entry.&lt;br&gt;
You can hit the server every 20 seconds, for example, or open a connection other than HTTP, like a socket channel, to send the logs and reduce the load.&lt;/p&gt;
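&lt;p&gt;A minimal client-side throttle illustrates the "every 20 seconds" idea; the sendLocation function below is hypothetical:&lt;/p&gt;

```javascript
// Time-based throttle: forwards at most one call per intervalMs,
// dropping the calls in between.
function throttle(fn, intervalMs) {
  let last = 0;
  return (...args) => {
    const now = Date.now();
    if (now - last >= intervalMs) {
      last = now;
      fn(...args);
    }
  };
}

// Three rapid calls in a row: only the first one goes through.
let sent = 0;
const sendLocation = throttle(() => { sent += 1; }, 20000);
sendLocation();
sendLocation();
sendLocation();
console.log(sent); // 1
```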

&lt;p&gt;&lt;strong&gt;4 - Try to save meaningful data, and think long term about where you may use these logs more effectively.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In conclusion, this is how I approached the problem of logging the things I want effectively: the logging itself ( first priority ) without slowing anything down ( second priority ). &lt;/p&gt;

</description>
      <category>node</category>
      <category>architecture</category>
      <category>tutorial</category>
      <category>logger</category>
    </item>
    <item>
      <title>SQL: One to Many Join Query in an Optimal way with Node.js</title>
      <dc:creator>Ali Amjad </dc:creator>
      <pubDate>Mon, 16 May 2022 21:15:43 +0000</pubDate>
      <link>https://dev.to/ali_a_koye/sql-one-to-many-join-query-in-an-optimal-way-with-nodejs-2da2</link>
      <guid>https://dev.to/ali_a_koye/sql-one-to-many-join-query-in-an-optimal-way-with-nodejs-2da2</guid>
<description>&lt;p&gt;For the past few years, one of the queries I have needed most is the one-to-many query, used to return a more informative response to the client.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;So what is One to Many Relationships in SQL?&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;Formally, a one-to-many relationship is a type of cardinality that refers to the relationship between two entities A and B, in which an element of A may be linked to many elements of B, but an element of B is linked to only one element of A.&lt;/p&gt;

&lt;p&gt;Along with one-to-many, we have &lt;strong&gt;one-to-one&lt;/strong&gt; and &lt;strong&gt;many-to-many&lt;/strong&gt; relationships as well, but we are focusing on one-to-many in this article.&lt;br&gt;
&lt;strong&gt;One-to-one relations&lt;/strong&gt; are easier: one record connects to exactly one other, so you can retrieve the data you want from both tables with a single join. &lt;br&gt;
&lt;strong&gt;A many-to-many relation&lt;/strong&gt; is made of two one-to-many relationships, so the idea we discuss here can be applied to it as well.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;So what are we discussing here?&lt;/strong&gt;&lt;br&gt;
Imagine you have a User table and a User Hobbies table: each user has many hobbies, and each hobby belongs to only one user.&lt;br&gt;
If you query the hobbies table, you could easily join it with users to retrieve the user's information for each hobby.&lt;br&gt;
But is that helpful? A flat list of hobbies? In most cases, no. You want a list of users with their hobbies: preferably an array of user objects, each with a property that is an array of that user's hobbies embedded in the user object.&lt;/p&gt;

&lt;p&gt;OK, but can't we just join the users table with the hobbies table to fetch the hobbies nested under each user? Again, no: SQL does not work that way; unlike NoSQL databases, a plain join does not return this kind of nested result.&lt;/p&gt;

&lt;p&gt;Hmm, what about using a backend language like Node.js to fetch the list of users in one query, then looping over them and querying the hobbies table on each iteration to embed the result? This actually works and gives the correct result, but it is one of the worst approaches, because you are hitting the database inside a loop ( the classic N+1 problem ).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;So let's do this in an optimal way that does not hurt our DB, makes only a couple of queries, and still achieves the same output.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;First, I am going to create these two tables with the schema below; they are in their simplest form.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;

&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="nv"&gt;`user`&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="nv"&gt;`id`&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt; &lt;span class="n"&gt;AUTO_INCREMENT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="nv"&gt;`name`&lt;/span&gt; &lt;span class="nb"&gt;varchar&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;DEFAULT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;`id`&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="nv"&gt;`user_hobbies`&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="nv"&gt;`id`&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt; &lt;span class="n"&gt;AUTO_INCREMENT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="nv"&gt;`user_id`&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt; &lt;span class="k"&gt;DEFAULT&lt;/span&gt; &lt;span class="s1"&gt;'0'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="nv"&gt;`hobby_name`&lt;/span&gt; &lt;span class="nb"&gt;varchar&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;DEFAULT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;`id`&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now that the tables are created, I will put some dummy data into them so that we have something to work with.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzz4nooqkq00a11buhauv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzz4nooqkq00a11buhauv.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmdpc3he3pr4xbl60w87o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmdpc3he3pr4xbl60w87o.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You may have noticed that I am adding user_id without declaring a foreign-key constraint; it works that way, it is often preferable, and we will discuss why in another article.  &lt;/p&gt;

&lt;p&gt;OK, let's start coding:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Init a new Node.js project&lt;/li&gt;
&lt;li&gt;I will be using two packages 

&lt;ol&gt;
&lt;li&gt;
&lt;a href="https://knexjs.org/" rel="noopener noreferrer"&gt;Knex.js&lt;/a&gt; as Query Builder&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://lodash.com/" rel="noopener noreferrer"&gt;Lodash&lt;/a&gt; for Computation &lt;/li&gt;
&lt;/ol&gt;
&lt;/li&gt;
&lt;li&gt;Start&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;First, I will create a function to fetch all of the users. Please refer to the Knex.js documentation for how to set up connections and start querying.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;_&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;lodash&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;lodash&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;lodash&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;database&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/database/connection&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="c1"&gt;//DB Connection&lt;/span&gt;

&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;getAllUsersWithHobbies&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;

&lt;span class="nf"&gt;getAllUsersWithHobbies&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Let's fetch all of the users first.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;getAllUsersWithHobbies&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;users&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;database&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;user&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;select&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;    
&lt;span class="p"&gt;}&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now you have an array of users, so the next step is to extract their ids.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;getAllUsersWithHobbies&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;users&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;database&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;user&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;select&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;  &lt;span class="c1"&gt;//[ { id: 1, name: 'Ali' } ]&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;usersIds&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;_&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;users&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;el&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;el&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;  &lt;span class="c1"&gt;//[ 1 ]&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Perfect. Now we query the hobbies table to find all the hobbies whose user_id is in our usersIds array.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;getAllUsersWithHobbies&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;users&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;database&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;user&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;select&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;  
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;usersIds&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;_&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;users&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;el&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;el&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;  
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;Hobbies&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;database&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;user_hobbies&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;select&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;whereIn&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;user_id&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;usersIds&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="c1"&gt;//returns 2&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The next step is to group the hobbies by user_id using lodash's groupBy function; it returns an object with user_ids as keys and, as each value, the array of hobbies belonging to that user.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;


&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;getAllUsersWithHobbies&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;users&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;database&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;user&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;select&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;  
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;usersIds&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;_&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;users&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;el&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;el&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;  
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;Hobbies&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;database&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;user_hobbies&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;select&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;whereIn&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;user_id&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;usersIds&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; 
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;groupedHobbies&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;_&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;groupBy&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;Hobbies&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;user_id&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="c1"&gt;// groupedHobbies: {&lt;/span&gt;
    &lt;span class="c1"&gt;//     '1': [&lt;/span&gt;
    &lt;span class="c1"&gt;//        { id: 1, user_id: 1, hobby_name: 'First Hobby' },&lt;/span&gt;
    &lt;span class="c1"&gt;//        { id: 2, user_id: 1, hobby_name: 'Second Hobby' }&lt;/span&gt;
    &lt;span class="c1"&gt;//     ]&lt;/span&gt;
    &lt;span class="c1"&gt;//   }&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;



&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Finally, we loop through our users and look up each user's hobbies in the grouped object by user.id.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;getAllUsersWithHobbies&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;users&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;database&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;user&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;select&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;  
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;usersIds&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;_&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;users&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;el&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;el&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;  
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;Hobbies&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;database&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;user_hobbies&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;select&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;whereIn&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;user_id&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;usersIds&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; 
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;groupedHobbies&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;_&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;groupBy&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;Hobbies&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;user_id&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;usersEmbedded&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;_&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;users&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;record&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="p"&gt;...&lt;/span&gt;&lt;span class="nx"&gt;record&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="na"&gt;hobbies&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;groupedHobbies&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;record&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="p"&gt;};&lt;/span&gt;
        &lt;span class="p"&gt;});&lt;/span&gt; 

       &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;usersEmbedded&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;An example of the final output: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxn25qdakriiod0pr0s7s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxn25qdakriiod0pr0s7s.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With that, we have achieved exactly the output we wanted while hitting the database only twice: once for the users and once for all of their hobbies, which we then embed in memory. &lt;/p&gt;

&lt;p&gt;This technique is simple yet powerful enough that you may want to handle all of your embedded relations this way.&lt;/p&gt;
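&lt;p&gt;To generalize it, the grouping step can be wrapped in a small helper. The sketch below is a plain-JavaScript illustration of the same idea; the helper name and the in-memory sample rows are mine, not part of the original code:&lt;/p&gt;

```javascript
// Embed "many" child rows into their parent rows with a single grouping pass,
// so only one query per child table is ever needed.
function embedMany(parents, children, foreignKey, as) {
  // Group children by their foreign key (what _.groupBy does for us above).
  const grouped = {};
  for (const child of children) {
    const key = child[foreignKey];
    (grouped[key] = grouped[key] || []).push(child);
  }
  // Attach each parent's group (or an empty array) under the given property.
  return parents.map((parent) => ({
    ...parent,
    [as]: grouped[parent.id] || [],
  }));
}

// Example with in-memory rows standing in for the two query results:
const users = [{ id: 1, name: "Ali" }, { id: 2, name: "Sara" }];
const hobbies = [
  { user_id: 1, title: "chess" },
  { user_id: 1, title: "hiking" },
];
const embedded = embedMany(users, hobbies, "user_id", "hobbies");
// embedded[0].hobbies holds both rows; embedded[1].hobbies is empty.
```

&lt;p&gt;The same helper then works for any parent/child pair, always at a fixed two queries per relation.&lt;/p&gt;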

&lt;p&gt;Thanks for reading this article! &lt;/p&gt;

</description>
      <category>sql</category>
      <category>node</category>
      <category>javascript</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Simplest Uploader? Creating Powerful Node.js Object Uploader</title>
      <dc:creator>Ali Amjad </dc:creator>
      <pubDate>Thu, 12 May 2022 19:21:21 +0000</pubDate>
      <link>https://dev.to/ali_a_koye/simplest-uploader-creating-powerful-nodejs-object-uploader-359o</link>
      <guid>https://dev.to/ali_a_koye/simplest-uploader-creating-powerful-nodejs-object-uploader-359o</guid>
      <description>&lt;p&gt;Uploading Objects to the server is one of the Key Concepts of backend development and web development in general.&lt;/p&gt;

&lt;p&gt;It's quite rare to see a website without images. Most of the time these websites are not static: the images, along with the other details, are managed from an admin panel dashboard. You have also seen forms on websites that let you attach a file along with the details.&lt;/p&gt;

&lt;p&gt;An uploader is how you let the server handle incoming files from the client side.&lt;/p&gt;

&lt;p&gt;In this article, we will discuss creating a file uploader that is not limited to one file type but works for almost all file types, and does so without &lt;strong&gt;using any extra package&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Common approaches for creating an uploader versus what we are building&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;There are two common methods for sending a file to the server as a whole (no streaming here):&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Sending the file as form data: for example, submitting an HTML form with files attached to it will be labeled as multipart.&lt;br&gt;
This usually requires a body parser that can handle this more complex body format, such as &lt;a href="https://www.npmjs.com/package/multer"&gt;Multer&lt;/a&gt;. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Sending the file as a string: for example, converting the file to base64 encoding and sending it as JSON in the body.&lt;br&gt;
One thing you should be aware of: base64 encoding writes every 3 bytes as 4 characters, which increases the file size by roughly 33%.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
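&lt;p&gt;You can verify that overhead yourself with Node.js's built-in Buffer; a quick sketch (the 3,000-byte payload size is arbitrary):&lt;/p&gt;

```javascript
// Base64 encodes every 3 bytes as 4 ASCII characters,
// so the encoded form is about a third larger than the raw bytes.
const payload = Buffer.alloc(3000); // 3,000 raw bytes
const encoded = payload.toString("base64");
console.log(payload.length, encoded.length); // 3000 4000
```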

&lt;p&gt;What we are doing :&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;We won't be using any multipart parser, to keep things simple; instead we accept the file as a base64-encoded string.&lt;/li&gt;
&lt;li&gt;Even though base64 increases the size, the overhead doesn't make much difference for small files (up to ~100 MB); you shouldn't use this method for very large files. &lt;/li&gt;
&lt;li&gt;We only use functions from Node.js's built-in fs module.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;There are a lot of packages out there, but sometimes all you need is something simple that accepts anything with no restrictions.&lt;/p&gt;

&lt;p&gt;So let's begin.&lt;/p&gt;

&lt;p&gt;Any file that hits this API should be base64-encoded; you can find many front-end libraries that produce base64 uploads.&lt;/p&gt;

&lt;p&gt;I have a 1px-by-1px image here, just to keep the string from getting too large.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YDIdrCWn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tlfb07z17jf76u6b914z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YDIdrCWn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tlfb07z17jf76u6b914z.png" alt="Image description" width="1" height="1"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I converted it to base64 with an online converter: &lt;a href="https://www.base64-image.de/"&gt;Base64 encoder&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Let's start coding.&lt;/p&gt;

&lt;p&gt;Creating a new Node.js project:&lt;/p&gt;

&lt;p&gt;First, I will run this to create a Node.js project:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--A8kgmMAn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1s9yxpc4ts2d9yi7t6pq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--A8kgmMAn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1s9yxpc4ts2d9yi7t6pq.png" alt="Image description" width="880" height="340"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then I will create index.js as our playground for our uploader.&lt;/p&gt;

&lt;p&gt;First, I will import Node.js's fs core module (using its promise-based API) and put our base64 example in the file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;fs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nx"&gt;promises&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;exampleImage&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;data:image/png;base64,
iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAIAAACQd1PeAAAAAXNSR0IArs4c6QAAAAxJREFUGFdjcOyfCQACfgFqcbSa6QAAAABJRU5ErkJggg==&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then we will add our upload function. I hardcoded the file suffix for simplicity, but a base64 data URL usually also includes the file type, so you can extract the suffix and build the name as name.${fileSuffix}.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;upload&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;base64&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;fileSuffix&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;png&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`anyFileName.&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;fileSuffix&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;dir&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;__dirname&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/file/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;base64Data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;Buffer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;from&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="nx"&gt;base64&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;replace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sr"&gt;/^data:image&lt;/span&gt;&lt;span class="se"&gt;\/\w&lt;/span&gt;&lt;span class="sr"&gt;+;base64,/&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;""&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;base64&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;);&lt;/span&gt; 
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;fs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;writeFile&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;dir&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;base64Data&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then we build the path we want to save to, strip the data-URL prefix from the base64 string, and turn the rest into a Buffer.&lt;/p&gt;

&lt;p&gt;Lastly, we write the data to the specified path. (Make sure the file directory exists first, or fs.writeFile will throw.)&lt;/p&gt;
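&lt;p&gt;If you'd rather not hardcode the suffix, the data-URL prefix itself tells you the file type. A possible sketch (the helper name and regex are mine, and it assumes a standard "data:MIME;base64," prefix on the string):&lt;/p&gt;

```javascript
// Derive the file suffix and raw bytes from a base64 data URL
// instead of hardcoding "png".
function parseDataUrl(dataUrl) {
  const match = dataUrl.match(/^data:(\w+)\/([\w.+-]+);base64,(.+)$/s);
  if (!match) throw new Error("Not a base64 data URL");
  const [, type, subtype, data] = match;
  // e.g. type = "image", subtype = "png"
  return { suffix: subtype, buffer: Buffer.from(data, "base64") };
}

const parsed = parseDataUrl("data:image/png;base64,iVBORw0KGgo=");
console.log(parsed.suffix); // "png"
```

&lt;p&gt;You could then save the file as anyFileName.${parsed.suffix} and write parsed.buffer directly.&lt;/p&gt;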

&lt;p&gt;Then run the function:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;upload&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;exampleImage&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is the result of running the program with:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;node index.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The output shows that we have successfully saved the image.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rFjh-akX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dyhw3s292n7o97g68hxx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rFjh-akX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dyhw3s292n7o97g68hxx.png" alt="Image description" width="880" height="465"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With only this, you can save any object you want: all you need is the base64 string and the type of the uploaded file, whether it's a PDF, an image, or anything else.&lt;/p&gt;

&lt;p&gt;Thanks for reading this article.&lt;/p&gt;

</description>
      <category>node</category>
      <category>javascript</category>
      <category>webdev</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Data Migration Between Two MongoDB Atlas Instances ( Connections )</title>
      <dc:creator>Ali Amjad </dc:creator>
      <pubDate>Mon, 09 May 2022 21:33:33 +0000</pubDate>
      <link>https://dev.to/ali_a_koye/data-migration-between-two-mongodb-atlas-instances-connections--1nao</link>
      <guid>https://dev.to/ali_a_koye/data-migration-between-two-mongodb-atlas-instances-connections--1nao</guid>
      <description>&lt;p&gt;As for how easy it can be to set up a MongoDB Atlas account and database, it will just become harder when you try to move the data around.&lt;/p&gt;

&lt;p&gt;Lately, at our company, we decided to migrate our current development MongoDB instance to another instance, and in this article I will discuss how I approached and solved this task.&lt;/p&gt;

&lt;p&gt;The challenging part of this migration was our database design. Because of its complexity, the application is built on a microservice architecture: around 30+ microservices backed by 10+ databases, each handling a different group of microservices; each database contains hundreds of collections, and each collection holds thousands of documents.&lt;/p&gt;

&lt;p&gt;According to the official MongoDB documentation, you could export each collection separately, which is not helpful in our case because we are dealing with a lot of collections.&lt;/p&gt;

&lt;p&gt;Another solution might be using the mongodump command to dump a whole database at once, but again this is not that helpful, as exporting and importing each database is still a manual process.&lt;/p&gt;
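&lt;p&gt;For reference, that route looks roughly like this (the URIs and database name below are placeholders, not our real connections), and you would have to repeat the pair of commands once per database:&lt;/p&gt;

```shell
# Dump one database from the source cluster, then restore it into the target.
# Repeat for every database -- tedious with 10+ of them.
mongodump --uri="mongodb+srv://user:pass@source-cluster.mongodb.net/db_one" --out=./dump
mongorestore --uri="mongodb+srv://user:pass@target-cluster.mongodb.net" ./dump
```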

&lt;p&gt;After enough research regarding the best way to handle this situation, I have come to a tool that could do this in a matter of seconds which I will be discussing below.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;First of all, prepare your host and target MongoDB URIs, or at least a connection for each of them.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Then, We are using a tool called &lt;a href="https://nosqlbooster.com/"&gt;NoSQL Booster&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;NoSQLBooster is a cross-platform GUI tool for MongoDB v2.6-5.0, which provides a built-in MongoDB script debugger, comprehensive server monitoring tools, fluent query chaining, SQL queries, a query code generator, task scheduling, and much more.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;After you have installed the tool, you will be prompted to add connections as the first step.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Cfb_mhKe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8dfgte98w62fkstad3es.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Cfb_mhKe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8dfgte98w62fkstad3es.png" alt="Connections pop up" width="880" height="568"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this step, I am connecting via my Atlas URI; you can just as easily connect other Atlas instances or any MongoDB server.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;You will see your connections loaded in the sidebar, like mine in the picture: &lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_5yM-YcG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/irwr4jv2g3jwzxojfahq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_5yM-YcG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/irwr4jv2g3jwzxojfahq.png" alt="Loaded Schema" width="559" height="593"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The only part left is pretty simple: you copy a database and paste it into the other connection. It's as simple as that; the tool will load the database, with all of its data and indexes, into the other instance.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;A) Copy A Database from the Host Connection&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mJ87MfRe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2dphyiat13vzlj0czug1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mJ87MfRe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2dphyiat13vzlj0czug1.png" alt="Copy" width="880" height="611"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;B) Paste A Database to the Target Connection&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Fex5RLct--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vt6lizqiwml5yfrtsp35.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Fex5RLct--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vt6lizqiwml5yfrtsp35.png" alt="Paste" width="880" height="662"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In my case, I repeated the copy operation around 10+ times, and our new database was good to go. :)&lt;/p&gt;

&lt;p&gt;Thanks for reading this article; I just wanted to share the experience I had today.&lt;/p&gt;

&lt;p&gt;Can you copy all the databases together in one click? I am not sure; maybe there is a better solution, but this is the quickest one I have found and used. :)&lt;/p&gt;

</description>
      <category>mongodb</category>
      <category>database</category>
      <category>tutorial</category>
    </item>
  </channel>
</rss>
