The candidates were starting to flow through our system at Brilliant Hire. That ecstatic sense of accomplishment, though, brought along a major concern: upload times were insanely slow, and the files, especially the audio recordings, were large! Attempts were ongoing to reduce the size of the live WAV recordings, but even then, all the file uploads by the candidate had to be fast! Compressing a file before uploading seemed imperative.
That's when Pako came to the rescue. We were able to reduce the files to half their size just by using the lowest level of compression, and it did not even block the execution thread for the file sizes we were targeting. A win!
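For context, the compression level is just an option you pass to Pako's deflate, with level 1 being the lowest and fastest. A minimal sketch, where the data variable stands in for your file's bytes:

import pako from 'pako';

// Level 1: fastest, lightest on the CPU, and still halved our file sizes
const compressed = pako.deflate(data, { level: 1 });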
In just the same way, here's how you can upload a compressed file from Vue.js and uncompress it in Node.js before, perhaps, piping it further to S3.
Compressing a file in Vue.js
Here, we will allow a user to upload a file using the input tag. When a file is selected, the onChange method is called; it compresses the file before forwarding it to the upload method, which sends the file to our file-handling API.
<input type="file" :accept="allowedMimes" ref="inputFile" @change="onChange"/>
import pako from 'pako';

function onChange() {
  const data = new FormData();
  const file = this.$refs.inputFile.files[0];
  const reader = new FileReader();

  reader.onload = (e) => {
    // The read result is an ArrayBuffer; wrap it in a typed array for Pako
    const fileAsArray = new Uint8Array((e.target! as any).result as ArrayBuffer);
    // Deflate (compress) the raw bytes
    const compressedFileArray = pako.deflate(fileAsArray);
    // Wrap the compressed bytes in a Blob, keeping the original MIME type
    const fileToUpload = new Blob([compressedFileArray.buffer], {type: file.type});
    data.append('file', fileToUpload, file.name);
    this.upload(data);
  };

  reader.readAsArrayBuffer(file);
}
What's happening here: the FileReader reads the file into an ArrayBuffer, which is then compressed, or deflated, by Pako. The compressed file is passed to the upload method, which sends it to our API.
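For completeness, here is a minimal sketch of what that upload method could look like, assuming a fetch-based POST to a hypothetical /api/upload endpoint; the article does not show this method, so treat the endpoint and error handling as placeholders:

async function upload(data) {
  // POST the FormData; the browser sets the multipart boundary automatically
  const response = await fetch('/api/upload', {
    method: 'POST',
    body: data,
  });
  if (!response.ok) {
    throw new Error(`Upload failed with status ${response.status}`);
  }
}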
Uncompressing and piping the file in Node.js
We will be using Busboy to handle the file in our back end, which runs on Node.js.
To keep this piece on point, I will refer you to my other write-up, How to Upload a file in Node.js. There you will find step-by-step instructions on how to handle a file using Busboy; I will be referring to the hooks mentioned in that post.
If you are back here, I will assume you have read it or already know how to use Busboy. So, let's get started with the task of uncompressing our file using Pako.
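As a quick refresher, the basic wiring looks roughly like this. This is a sketch only, based on the older new Busboy(...) constructor API; the Express-style handler and the /api/upload route are assumptions for illustration, so see the linked post for the full setup:

const Busboy = require('busboy');

// Hypothetical Express route that hands the request off to Busboy
app.post('/api/upload', (req, res) => {
  const busboy = new Busboy({ headers: req.headers });

  busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
    // per-file handling goes here: see the steps below
  });

  busboy.on('finish', () => res.sendStatus(200));
  req.pipe(busboy); // feed the raw request body into Busboy
});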
Step 1: We initialize our Pako Inflate instance and the streams, including the Readable stream. Pako will push the uncompressed file into this stream so it can be piped further. This all happens inside Busboy's file event.
const pako = require('pako');
const { Readable } = require('stream');

busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
  // Readable stream that Pako will push the uncompressed chunks into
  const fileReadStream = new Readable({
    read() {}, // no-op: data is pushed in by the inflate callbacks below
  });

  const inflate = new pako.Inflate(); // you can customize Pako here
  inflate.onData = (chunk) => {
    fileReadStream.push(chunk);
  };
  inflate.onEnd = () => {
    fileReadStream.push(null); // signal end-of-stream to downstream consumers
  };

  let nextChunk = null;
  file.on('data', (data) => {
    // our Pako gets data from here (filled in below, in Step 2)
  });
  file.on('end', () => {
    // we tell Pako we reached the end here (filled in below, in Step 2)
  });
});
Step 2: We now use the file.on('data') hook to pipe our file into Pako. We buffer each chunk in a variable called nextChunk and stay one chunk behind, because Pako requires true to be passed as the second parameter of the final push, and we only know the last chunk has arrived when the file.on('end') hook fires. Below is how.
file.on('data', (data) => {
  // Push the previous chunk with `false`: more data is still on the way
  if (nextChunk) {
    inflate.push(nextChunk, false);
  }
  nextChunk = data;
});

file.on('end', () => {
  // The buffered chunk is the last one, so flag it with `true`
  inflate.push(nextChunk, true);
});
This should be enough to get you started with file compression for your own application. For more details you can, of course, check out the Pako documentation, or ask me!
Hope it helped!