Tanmay Agrawal

Basics of Node.js

I have started learning Node.js from YouTube tutorials, and I am making notes as I go. Here is my first set of notes, which I have tried to keep as readable as possible for general readers as well.
Please write in the comments whether these notes are helpful and what would make them more readable and useful for you.

Documentation source: https://nodejs.org/en/docs

How does Node.js differ from vanilla JS?

  1. Node runs on a server, not in the browser (so it is backend, not frontend).
  2. The console is the terminal window.
  3. It has a global object, which can be checked with a simple console.log(global), whereas in vanilla JS the global object is window.
  4. Node.js uses CommonJS modules instead of ES6 modules (CommonJS uses the require statement, whereas ES6 modules use the import statement).
//CommonJS import example
const os = require("os");
const path = require("path");

console.log(os.type()); // Windows_NT
console.log(os.version()); // Windows 10 Home Single Language
console.log(os.homedir()); // C:\Users\Tanmay
console.log(__dirname); // prints the folder in which server.js is kept
console.log(__filename); // prints the full path of server.js
console.log(path.dirname(__filename)); // prints the directory path of server.js
console.log(path.extname(__filename)); // gives the extension of the file, here .js
console.log(path.basename(__filename)); // only the base name, here server.js
console.log(path.parse(__filename)); // gets an object with root, dir, base, ext and name
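For comparison with point 4 above, here is a minimal sketch of the same imports written with ES6 module syntax. This is my own addition, not from the tutorial: it only runs if the file uses the .mjs extension or package.json sets "type": "module", and __dirname/__filename are not available in ES modules, so they have to be derived.

//esm-example.mjs - illustrative ES6 module sketch
import os from "os";
import path from "path";
import { fileURLToPath } from "url";

// __dirname and __filename do not exist in ES modules, so we derive them
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);

console.log(os.type());
console.log(path.basename(__filename));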

Exporting and importing modules

Let us see how we can export and import modules in Node.js. For this, assume we have two files: calculator.js and math.js. For simplicity, we write simple calculator functions add, subtract, multiply and divide in math.js and export them using
module.exports = {}, which exports all the functions inside the object block.

Let us see this with an example.
math.js

//math.js
const add = (a, b) => a + b;
const sub = (a, b) => a - b;
const mul = (a, b) => a * b;
const div = (a, b) => a / b;

module.exports = { add, sub, mul, div };

//another method is to attach a single export directly;
//note: add it after (or include it in) the module.exports object above,
//because reassigning module.exports replaces anything set earlier via exports.xyz
module.exports.mod = (a, b) => a % b;

Inside calculator.js now:

calculator.js

// const math = require("./math");
// console.log(math); // logs the complete object with all exported functions

//Another way is destructuring

const { add, sub, mul, div } = require("./math");
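To round this off, a small usage sketch with the destructured functions (the expected outputs follow from the definitions in math.js above):

//calculator.js (continued)
console.log(add(2, 3)); // 5
console.log(sub(7, 4)); // 3
console.log(mul(2, 6)); // 12
console.log(div(9, 3)); // 3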

Event Emitter in Node.js

In Node.js, the EventEmitter class is a core module that facilitates communication between objects in a publisher-subscriber pattern. It allows for the implementation of event-driven architectures, where certain parts of the code can emit events, and other parts can listen for and respond to those events. The EventEmitter enables decoupling of different parts of an application, making it easier to build scalable and modular systems.

Here's a brief overview of how the EventEmitter works:

  1. Event Subscription: You can create an instance of EventEmitter and use the on method to subscribe to events.
const EventEmitter = require('events');
const myEmitter = new EventEmitter();
myEmitter.on('event', () => {
    console.log('an event occurred!');
});
  2. Event Emission: You can use the emit method to trigger an event.

    myEmitter.emit('event');

  3. Passing Data with Events: You can pass data along with events to provide context or additional information.

myEmitter.on('dataEvent', (data) => {
    console.log('Received data:', data);
});

myEmitter.emit('dataEvent', { key: 'value' });
  4. Error Events: The EventEmitter also has special support for error events, allowing you to handle errors more effectively in asynchronous code.
myEmitter.on('error', (err) => {
    console.error('Error occurred:', err);
});

myEmitter.emit('error', new Error('Something went wrong'));

The EventEmitter is the foundation for many of the asynchronous operations in Node.js, such as handling HTTP requests, file I/O, and other events. It enables developers to create robust and scalable applications by facilitating the communication and coordination of various components and modules within a Node.js application.
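A common pattern built on top of this, not covered above, is subclassing EventEmitter so that your own objects can emit events. A minimal sketch (the Logger class and the 'logged' event are made-up names for illustration):

const EventEmitter = require("events");

//a hypothetical class that emits its own events
class Logger extends EventEmitter {
  log(message) {
    console.log(message);
    this.emit("logged", { message, at: Date.now() });
  }
}

const logger = new Logger();
logger.on("logged", (info) => console.log("listener got:", info));
logger.log("Hello from the Logger");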

File System Module

Documentation of file-system modules:
https://nodejs.org/docs/latest-v20.x/api/fs.html

The major methods used with the file system module are:

  1. read
  2. write
  3. append
  4. unlink (delete)

Here are some simple examples to understand them:

Read files

This is the simplest example. Here I already had a file named starter.txt before running this:

const fs = require("fs");

fs.readFile("./starter.txt", "utf-8", (err, data) => {
  if (err) throw err;
  console.log(data); // without "utf-8" this returns a buffer like <Buffer 48 69 20 6d 79 20 6e 61 6d 65 20 69 73 20 54 61 6e 6d 61 79>, which could be converted with data.toString(); with "utf-8" it is already readable text
});


//According to the documentation, any uncaught exception should make the process exit; for this we use process.on
process.on("uncaughtException", (err) => {
  console.log(`Uncaught Exception Error: ${err}`);
  process.exit(1);
});

Note: readFile is an asynchronous function, so while it is reading the file, Node goes ahead and executes the other, non-blocking code.
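A quick sketch of that non-blocking behaviour: the last console.log fires before the file data is printed, even though it comes later in the code.

const fs = require("fs");

fs.readFile("./starter.txt", "utf-8", (err, data) => {
  if (err) throw err;
  console.log(data); // printed later, once the file has been read
});

console.log("Hello, I run first"); // printed first, readFile has not finished yet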

Notice that above we are hard-coding the path, i.e. fs.readFile("./starter.txt",...
This is not good practice; a better way is, for example:
fs.readFile(path.join(__dirname, "starter.txt"), ...) where we have imported the path module, __dirname gives the path of the current folder, and we join it with our file starter.txt.
In case we had another folder, say files, we would write it as
fs.readFile(path.join(__dirname, "files", "starter.txt"), ...), i.e. first we join the folder name and then the file name.

Write, Append, Delete and Rename Files

const fs = require("fs");
const path = require("path");

fs.readFile(path.join(__dirname, "starter.txt"), "utf-8", (err, data) => {
  if (err) throw err;
  console.log(data); // logs the readable file contents because we passed "utf-8"
});

fs.writeFile(
  path.join(__dirname, "newFile.txt"),
  "Something is written in here",
  (err) => {
    if (err) throw err;
    console.log("Writing complete");
  }
);
//According to the documentation, any uncaught exception should make the process exit; for this we use process.on
process.on("uncaughtException", (err) => {
  console.log(`Uncaught Exception Error: ${err}`);
  process.exit(1);
});

//Output
//data of the readfile...
//Writing complete

Now, as we know, the behavior of JS here is asynchronous, so the order of reading, writing and appending to the file cannot be predicted. To control this, we have to nest one operation inside the callback of the previous one.
For example, say we want to read a file, then write something to a new file, append to it only once the write has completed, and finally rename the file. The code would look something like this:

Avoid this type of nesting, since it quickly turns into callback hell:

const fs = require("fs");
const path = require("path");

fs.readFile(path.join(__dirname, "starter.txt"), "utf-8", (err, data) => {
  if (err) throw err;
  console.log(data); // logs the readable file contents because we passed "utf-8"
});

//Writing into a new file here

fs.writeFile(
  path.join(__dirname, "newFile.txt"),
  "Something is written in here",
  (err) => {
    if (err) throw err;
    console.log("Writing complete");
    fs.appendFile(
      path.join(__dirname, "newFile.txt"),
      "\n\n Something is appended in here",
      (err) => {
        if (err) throw err;
        console.log("Append complete");
        fs.rename(
          path.join(__dirname, "newFile.txt"),
          path.join(__dirname, "renamedFile.txt"),
          (err) => {
            if (err) throw err;
            console.log("Renaming complete");
          }
        );
      }
    );
  }
);
//According to the documentation, any uncaught exception should make the process exit; for this we use process.on

process.on("uncaughtException", (err) => {
  console.log(`Uncaught Exception Error: ${err}`);
  process.exit(1);
});

//Output
//data of the readfile...
//Writing complete
//Append complete
//Renaming complete

This is good, we can now control the order of these asynchronous operations. But wait, doesn't it look like something? Yes, it is callback hell, and therefore it is necessary to avoid it. To do that we take advantage of JS promises and async/await.

const fsPromises = require("fs").promises;
const path = require("path");

const fileOp = async () => {
  try {
    //1. Reading the file
    const data = await fsPromises.readFile(
      path.join(__dirname, "starter.txt"), "utf-8");
    //Logging what was read
    console.log(data);
    //2. Writing into another file
    await fsPromises.writeFile(path.join(__dirname, "lorem.txt"), "Writing from Promise");
    console.log("Writing");
    //3. Appending the data that was read from the first file
    await fsPromises.appendFile(path.join(__dirname, "lorem.txt"), ` \n\n ${data}`);
    console.log("appending");
  } catch (err) {
    console.log(err);
  }
};

fileOp();

process.on("uncaughtException", (err) => {
  console.log(`Uncaught Exception : ${err}`);
  process.exit(1);
});
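The delete operation from the list above (unlink) is not shown in these examples; here is a minimal sketch using the same promise API, assuming the lorem.txt created above already exists:

const fsPromises = require("fs").promises;
const path = require("path");

const deleteFile = async () => {
  try {
    //unlink removes (deletes) the file
    await fsPromises.unlink(path.join(__dirname, "lorem.txt"));
    console.log("Delete complete");
  } catch (err) {
    console.log(err);
  }
};

deleteFile();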

Read/Write stream example for large chunks of data

const fs = require("fs");
const path = require("path");

const rs = fs.createReadStream("./lorem.txt", "utf-8");
rs.on("data", (chunkData) => console.log(chunkData));

//suppose if we have to write to another file;

const ws = fs.createWriteStream("./newText.txt");

rs.pipe(ws); // it is an efficient method to write on the file from the read data of read stream
//alternatively we could have done it as
// rs.on('data', dataChunk=>{
//     ws.write(dataChunk);
// })
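As a side note, here is a minimal sketch of stream.pipeline (available since Node 10), which does the same piping as above but also forwards any error to a single callback; the file names are reused from the example above:

const { pipeline } = require("stream");
const fs = require("fs");

pipeline(
  fs.createReadStream("./lorem.txt", "utf-8"),
  fs.createWriteStream("./newText.txt"),
  (err) => {
    if (err) console.error("Pipeline failed:", err);
    else console.log("Pipeline succeeded");
  }
);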

Creating and Deleting a Directory

const fs = require("fs");
const path = require("path");

//1. Creating a directory
//Checking whether the dir already exists
if (!fs.existsSync("./new")) {
  //create it with mkdir if it does not exist
  fs.mkdir("./new", (err) => {
    if (err) throw err;
    console.log("Directory created");
  });
} else console.log("directory already exists");

//2. Removing the directory
//Checking if dir already exists ?
if (fs.existsSync("./new")) {
  //remove dir if exists
  fs.rmdir("./new", (err) => {
    if (err) throw err;
    console.log("Directory Removed");
  });
} else console.log("directory does not exists");
process.on("uncaughtException", (err) => {
  console.log(`Uncaught Exception : ${err}`);
  process.exit(1);
});
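Note that mkdir and rmdir are asynchronous, so running the create and remove blocks in the same script may not behave as expected: the existsSync check for removal can run before the directory has actually been created. A minimal promise-based sketch that sequences the two steps explicitly (fs.promises.rm needs Node 14.14 or later):

const fsPromises = require("fs").promises;

const dirOps = async () => {
  try {
    await fsPromises.mkdir("./new");
    console.log("Directory created");
    await fsPromises.rm("./new", { recursive: true });
    console.log("Directory Removed");
  } catch (err) {
    console.log(err);
  }
};

dirOps();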

That's it for today. This post will be continued in the next one. I hope these notes help my fellow devs refresh the basics of Node.js.
