Build a simple file watcher without external dependencies or packages from NPM
Let's get started by developing a couple of simple programs that watch files for changes and read arguments from the command line. Even though they're simple toy programs, these applications offer insights into Node.js's event-based architecture.
The Power of Asynchronous Programming
Taking action whenever a file changes shows off the power of asynchronous coding in Node.js, and it's just plain useful in a number of cases, ranging from automated deployments to running unit tests.
Our File Watcher in Action
const fs = require('fs');

// The file to watch is passed as the first command-line argument
const filename = process.argv[2];

if (!filename) {
  throw Error('A file must be specified');
}

// Run the callback every time the watched file changes
fs.watch(filename, () => console.log(`File ${filename} changed!`));
console.log(`Now watching your awesome ${filename} for changes`);
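To try it, save the listing (I'll call it watcher.js here, the filename is my choice) and point it at any file you have lying around, then edit or touch that file from another terminal to trigger the callback:

$ node watcher.js target.txt
Now watching your awesome target.txt for changes

Each change you make to target.txt then prints the File target.txt changed! message to the console.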
The Great Event Loop ⚡
When you run the program, Node.js does the following:
• It loads the script, running all the way through to the last line, which produces the Now watching message in the console.
• It sees that there's more to do because of the call to fs.watch().
• It waits for something to happen, namely for the fs module to observe a change to the file.
• It executes our callback function when the change is detected.
• It determines that the program still has not finished, and resumes waiting (the sketch below makes this last point concrete).
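Here's a minimal sketch of my own (separate from the watcher) that demonstrates that last behavior: Node.js keeps the process alive as long as some work is pending, a timer here, the fs.watch() handle in our program, and exits once nothing is left to wait for.

'use strict';

// With no pending work, a script exits as soon as the last line runs.
// This interval counts as pending work, so the process stays alive...
const timer = setInterval(() => console.log('still waiting for something to happen'), 1000);

// ...until the last pending handle is cleared, at which point the event loop
// drains and the process exits on its own.
setTimeout(() => clearInterval(timer), 3500);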
The Node.js Process
Any unhandled exception thrown in Node.js will halt the process. The exception output shows the offending file, along with the line number and position of the exception. It's pretty common in Node.js development to spawn separate processes as a way of breaking up work, rather than putting everything into one big Node.js program. Let's spawn a process in Node.
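Before we do, here's a tiny sketch of mine showing what "halt the process" means in practice: the exception is thrown later, from inside a callback, yet it still brings down the whole program, and Node.js reports the offending file along with the line and column of the throw.

'use strict';

setTimeout(() => {
  // This exception is never caught, so after about a second
  // it terminates the entire Node.js process
  throw Error('boom');
}, 1000);

console.log('running... until the unhandled exception kills the process');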
Spawning a Child Process
'use strict';
const fs = require('fs');
const spawn = require('child_process').spawn;
const filename = process.argv[2];

if (!filename) {
  throw Error('A file must be specified');
}

fs.watch(filename, () => {
  // Run `ls -l -h <filename>` each time the file changes
  const ls = spawn('ls', ['-l', '-h', filename]);
  ls.stdout.pipe(process.stdout);
});

console.log(`Now watching your awesome ${filename} for changes`);
The object returned by spawn() is a ChildProcess. Its stdin, stdout, and stderr properties are Streams that can be used to read or write data. We want to send the standard output from the child process directly to our own standard output stream, and that's exactly what the pipe() method does.
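To make those three streams concrete, here's a standalone sketch of my own (not part of the watcher) that writes into a child's stdin and pipes both of its output streams; wc -c is just an arbitrary command chosen for illustration.

'use strict';
const spawn = require('child_process').spawn;

// Count the bytes we write to the child's standard input
const wc = spawn('wc', ['-c']);

wc.stdout.pipe(process.stdout);   // the byte count comes back on stdout
wc.stderr.pipe(process.stderr);   // anything wc complains about goes to our stderr

wc.stdin.write('hello from the parent process\n');
wc.stdin.end();                   // closing stdin lets wc finish and print its count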
The New Child Process In Action
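Assuming you saved the program as watch-and-spawn.js (a name I'm choosing here) and have a target.txt handy, start the watcher in one terminal:

$ node watch-and-spawn.js target.txt
Now watching your awesome target.txt for changes

Then, from a second terminal, touch the file:

$ touch target.txt

Each change makes the watcher spawn ls, and the -l -h listing for target.txt shows up in the first terminal.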
Capturing Data from a Stream or EventEmitter
EventEmitter is a very important class in Node.js. It provides a channel for events to be dispatched and listeners to be notified. Many objects you'll encounter in Node.js inherit from EventEmitter, like the Streams we saw in the last section.
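Here's a minimal, self-contained sketch of the EventEmitter API itself (my own example, not taken from the watcher): on() registers a listener for a named event, and emit() dispatches that event to every registered listener.

'use strict';
const EventEmitter = require('events');

const emitter = new EventEmitter();

// Register a listener for a custom 'changed' event
emitter.on('changed', name => console.log(`File ${name} changed!`));

// Dispatch the event; each registered listener is called with the arguments we pass
emitter.emit('changed', 'target.txt');

With that in mind, here's the watcher again, this time capturing the child's output in addition to piping it: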
'use strict';
const fs = require('fs');
const spawn = require('child_process').spawn;
const filename = process.argv[2];

if (!filename) {
  throw Error('A file must be specified');
}

fs.watch(filename, () => {
  const ls = spawn('ls', ['-l', '-h', filename]);
  let output = '';

  // Collect each chunk of data the child writes to stdout
  ls.stdout.on('data', chunk => output += chunk);

  // Once the child has exited and its streams are flushed, summarize the listing
  ls.on('close', () => {
    const parts = output.split(/\s+/);
    console.log(parts[0], parts[4], parts[8]);
  });

  ls.stdout.pipe(process.stdout);
});

console.log(`Now watching your awesome ${filename} for changes`);
The on() method adds a listener for the specified event type. We listen for data events because we're interested in data coming out of the stream.
A Buffer is Node.js's way of representing binary data. It points to a blob of memory allocated by Node.js's native core, outside of the JavaScript engine. Any time you add a non-string to a string in JavaScript (like we're doing here with chunk), the runtime will implicitly call the object's toString() method. For a Buffer, this means copying the content into Node.js's heap using the default encoding (UTF-8).
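A quick standalone sketch of that conversion (my own example, with made-up contents): concatenating a Buffer onto a string is the same as calling its toString() with the default UTF-8 encoding.

'use strict';

// A Buffer holds raw bytes; here they happen to be UTF-8 encoded text
const chunk = Buffer.from('-rw-r--r-- 1 user group', 'utf8');

let output = '';
output += chunk;                                  // implicitly calls chunk.toString()

console.log(output === chunk.toString('utf8'));   // true
console.log(chunk.length, 'bytes in the buffer'); // byte length, not character count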
Like Stream, the ChildProcess class extends EventEmitter, so we can add listeners to it, as well.
After a child process has exited and all its streams have been flushed, it emits a close event.
Reading and Writing Files In Node.js Asynchronously
So far we've written a series of Node.js programs that can watch files for changes. Now let's explore Node.js's methods for reading and writing files. Along the way we'll see two common error-handling patterns in Node.js: error events on EventEmitters and err callback arguments.
There are a few approaches to reading and writing files in Node. The simplest is to read in or write out the entire file at once. This technique works well for small files. Other approaches read and write by creating Streams or staging content in a Buffer.
'use strict';
const fs = require('fs');

// Read the entire file into memory, then invoke the callback
fs.readFile('target.txt', (err, data) => {
  if (err) {
    throw err;
  }
  console.log(data.toString());
});
Notice how the first parameter to the readFile() callback handler is err. If readFile() is successful, then err will be null. Otherwise the err parameter will contain an Error object. This is a common error-reporting pattern in Node.js, especially for built-in modules.
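One detail worth knowing that the listing above doesn't show: readFile() also accepts an encoding, and when you pass one the callback receives a string instead of a Buffer, so the toString() call goes away. A quick sketch:

'use strict';
const fs = require('fs');

// Passing 'utf8' makes Node.js decode the Buffer for us
fs.readFile('target.txt', 'utf8', (err, text) => {
  if (err) {
    throw err;        // same err-first convention: err is null on success
  }
  console.log(text);  // already a string
});

Writing a file follows the same pattern: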
'use strict';
const fs = require('fs');

// Create target.txt if it doesn't exist, or overwrite it if it does
fs.writeFile('target.txt', 'This is file content', (err) => {
  if (err) {
    throw err;
  }
  console.log('File saved!');
});
This program writes This is file content to target.txt (creating it if it doesn't exist, or overwriting it if it does). If for any reason the file can't be written, then the err parameter will contain an Error object.
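To see that err branch actually fire, here's a small sketch of mine that deliberately writes to a path whose directory doesn't exist and reports the failure instead of throwing:

'use strict';
const fs = require('fs');

fs.writeFile('no-such-dir/target.txt', 'some content', (err) => {
  if (err) {
    // err is an Error object; for a missing directory its code is 'ENOENT'
    return console.error('Could not write file:', err.message);
  }
  console.log('File saved!');
});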
Creating Read and Write Streams
You create a read stream or a write stream by using fs.createReadStream() and fs.createWriteStream(), respectively.
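The cat rebuild below only needs a read stream, but the two pair naturally; here's a sketch (the filenames are mine) that copies a file by piping a read stream into a write stream, chunk by chunk, without loading the whole file into memory:

'use strict';
const fs = require('fs');

// Stream target.txt into copy-of-target.txt as the chunks arrive
fs.createReadStream('target.txt')
  .pipe(fs.createWriteStream('copy-of-target.txt'));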
Rebuilding the cat Program in Node
#!/usr/bin/env node
'use strict';
require('fs').createReadStream(process.argv[2]).pipe(process.stdout);
Because the first line starts with #!, you can execute this program directly on Unix-like systems. Use chmod to make it executable:
$ chmod +x cat.js
Then, to run it, send the name of the chosen file as an additional argument:
$ ./cat.js target.txt
We've learned how to watch files for changes and how to read and write files. We also learned how to spawn child processes and access command-line arguments. Node.js also supports synchronous file access, but it's better not to block the event loop unless you know what you are doing.
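For completeness, here's what that synchronous flavor looks like (a sketch, reusing the hypothetical target.txt): readFileSync() returns the data directly and throws on failure, but it blocks the event loop until the whole read finishes, so nothing else, not even an fs.watch() callback, can run in the meantime.

'use strict';
const fs = require('fs');

// Blocks the entire process until the file has been read
const data = fs.readFileSync('target.txt');
console.log(data.toString());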
Thank you. Happy Reading!