You may know Node.js for its power to build highly scalable services, but you may not know that it can do much more than that. With Node.js we can build incredible tools: from on-demand data processing to neural networks used in machine learning.
The main concepts covered here are building CLIs with Node.js, using Node streams to process and manipulate files, using the native zlib module for file compression and decompression, and transforming callback-based functions into functions that return promises.
The final result will be a CLI called npacker with two simple commands: pack for compression and unpack for decompression.
Compression command
Terminal
$ ls -la testfile.txt
-rw-rw-r-- 1 gabrielrufino gabrielrufino 2147483648 mar 14 11:13 testfile.txt
$ npacker pack testfile.txt
$ ls -la testfile.txt testfile.txt.gz
-rw-rw-r-- 1 gabrielrufino gabrielrufino 2147483648 mar 14 11:13 testfile.txt
-rw-rw-r-- 1 gabrielrufino gabrielrufino 2087280 mar 14 11:15 testfile.txt.gz
You may notice a significant reduction in the size of the compressed file compared to the source file.
Decompression command
Terminal
$ ls -la testfile.txt.gz
-rw-rw-r-- 1 gabrielrufino gabrielrufino 2087280 mar 14 11:15 testfile.txt.gz
$ npacker unpack testfile.txt.gz
$ ls -la testfile.txt.gz testfile.txt
-rw-rw-r-- 1 gabrielrufino gabrielrufino 2147483648 mar 14 11:38 testfile.txt
-rw-rw-r-- 1 gabrielrufino gabrielrufino 2087280 mar 14 11:15 testfile.txt.gz
Now you can see the original file regenerated from the compressed file.
Repository
If you'd rather skip the explanation, you can go straight to the final code and contribute to it.
gabrielrufino / npacker
🗜️ A file compressor built with Node.js
1. Creating the CLI
The first step is to create the structure of the project and make a binary visible anywhere on the system. Fortunately, npm gives us an easy way to do this.
Let's create a folder, initialize an npm project and create the file index.js:
Terminal
$ mkdir npacker
$ cd npacker
$ npm init -y
$ touch index.js
These commands generate two important files for our project: package.json and index.js.
This is the initial state of package.json:
package.json
{
  "name": "npacker",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}
The task now is to turn index.js into a binary, give it an alias, and make it visible in any folder on the system. Look at the necessary changes:
index.js
#!/usr/bin/env node
'use strict'

async function main() {
  console.log('Let\'s compress!')
}

main()
package.json
{
  "name": "npacker",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "bin": {
    "npacker": "index.js"
  },
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}
Notice that we need to put the line #!/usr/bin/env node at the top of index.js. Moreover, we add the bin key to package.json, giving the alias npacker to the index.js file. We also include the string 'use strict' to activate strict mode in the project, and create the async function main so we can use await inside it.
Finally, we run the command below to make the executable visible in any folder.
Terminal
$ npm link
Now you can execute the npacker command in any folder. Nice!
Terminal
$ cd ~
$ npacker
Let's compress!
2. Getting the arguments
There are two important arguments that we want to receive from the command line: the operation and the file. The operation can be pack or unpack, and the file can be any file of any format.
For this, we can use process.argv: an array containing all the command-line arguments.
Let's see it in code:
index.js
#!/usr/bin/env node
'use strict'

async function main() {
  console.log(process.argv)
}

main()
Terminal
$ npacker pack music.mp3
[
  '/home/gabrielrufino/.nvm/versions/node/v14.16.0/bin/node',
  '/home/gabrielrufino/.nvm/versions/node/v14.16.0/bin/npacker',
  'pack',
  'music.mp3'
]
The first argument is the executor that we specified on the first line of index.js. The second is the generated link for the binary we defined in package.json. These first two arguments don't matter to us.
The last two are the important ones: the operation (pack or unpack) and the file.
We can extract them easily using array destructuring assignment, ignoring the first two arguments. Something like this:
index.js
#!/usr/bin/env node
'use strict'

async function main() {
  const [,, operation, file] = process.argv
  console.log(operation, file)
}

main()
Terminal
$ npacker pack documentation.docx
pack documentation.docx
3. Compressing files
To implement the compression, we will need four native modules: fs, stream, zlib and util. Let's import them:
index.js
#!/usr/bin/env node
'use strict'

const fs = require('fs')
const stream = require('stream')
const zlib = require('zlib')
const { promisify } = require('util')

async function main() {
  const [,, operation, file] = process.argv
  console.log(operation, file)
}

main()
Now we can verify if the operation is pack: the compression operation.
index.js
#!/usr/bin/env node
'use strict'

const fs = require('fs')
const stream = require('stream')
const zlib = require('zlib')
const { promisify } = require('util')

async function main() {
  const [,, operation, file] = process.argv

  if (operation === 'pack') {

  }
}

main()
So far so good. Pay close attention to the next step, because it is the most important one so far. We'll work with an important concept in Node.js: streams.
A stream is an abstract interface for working with streaming data in Node.js. The stream module provides an API for implementing the stream interface.
The definition above comes from the Node.js documentation.
Streams are a way to process large amounts of data using a smart approach: divide the data into small chunks and process them one by one. The fs module gives us two methods for reading and writing data with streams: createReadStream and createWriteStream. The zlib module gives us a method for compressing data in the gz format: createGzip. Finally, the stream module gives us a method for creating a logical sequence from reading to writing: pipeline.
index.js
#!/usr/bin/env node
'use strict'

const fs = require('fs')
const stream = require('stream')
const zlib = require('zlib')
const { promisify } = require('util')

async function main() {
  const [,, operation, file] = process.argv

  if (operation === 'pack') {
    const gzip = zlib.createGzip()
    const source = fs.createReadStream(file)
    const destination = fs.createWriteStream(`${file}.gz`)
    await promisify(stream.pipeline)(source, gzip, destination)
  }
}

main()
The purpose of util.promisify is to transform stream.pipeline from a function that receives a callback into a function that returns a Promise.
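If the transformation feels abstract, here is a minimal sketch of the two styles side by side. The function names packWithCallback and packWithPromise are only illustrative:
example.js
'use strict'

const fs = require('fs')
const stream = require('stream')
const zlib = require('zlib')
const { promisify } = require('util')

// Callback style: stream.pipeline reports success or failure
// through an error-first callback passed as the last argument.
function packWithCallback (file) {
  stream.pipeline(
    fs.createReadStream(file),
    zlib.createGzip(),
    fs.createWriteStream(`${file}.gz`),
    (error) => {
      if (error) {
        console.error('Compression failed:', error)
      }
    }
  )
}

// Promise style: promisify wraps the same function so it
// returns a Promise that we can await inside an async function.
async function packWithPromise (file) {
  const pipeline = promisify(stream.pipeline)
  await pipeline(
    fs.createReadStream(file),
    zlib.createGzip(),
    fs.createWriteStream(`${file}.gz`)
  )
}
As a side note, Node.js 15 and later also expose a ready-made promise version via require('stream/promises').pipeline, so promisify is only needed on older runtimes.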
And that's it! As simple as it looks. Now we can run the following command:
Terminal
$ npacker pack file.txt
4. Decompressing files
This part is the inverse of the previous one. The only change is the use of zlib.createUnzip instead of zlib.createGzip. Let's see the result:
index.js
#!/usr/bin/env node
'use strict'

const fs = require('fs')
const stream = require('stream')
const zlib = require('zlib')
const { promisify } = require('util')

async function main() {
  const [,, operation, file] = process.argv

  if (operation === 'pack') {
    const gzip = zlib.createGzip()
    const source = fs.createReadStream(file)
    const destination = fs.createWriteStream(`${file}.gz`)
    await promisify(stream.pipeline)(source, gzip, destination)
  } else if (operation === 'unpack') {
    const unzip = zlib.createUnzip()
    const source = fs.createReadStream(file)
    // Strip only the trailing .gz extension, not any '.gz'
    // that appears earlier in the file name.
    const destination = fs.createWriteStream(file.replace(/\.gz$/, ''))
    await promisify(stream.pipeline)(source, unzip, destination)
  }
}

main()
Finally, we can run the command for decompression:
Terminal
$ npacker unpack file.txt.gz
Here we saw one of the wonderful things that Node.js can do beyond just services. Thank you very much!