Masui Masanori

[Express][TypeScript] Uploading file 1

Intro

When I upload large files in ASP.NET Core applications, I slice the data into chunks and send them one by one; after all of the chunks have been uploaded, I merge them on the server.

How about in Node.js applications?

In this post, I will try the same approach.

Environments

  • Node.js ver.16.3.0
  • TypeScript ver.4.3.2

Server-side

  • ts-node ver.10.0.0
  • Express ver.4.17.1
  • moment ver.2.29.1

Client-side

  • Webpack ver.5.39.1
  • ts-loader ver.9.2.3
  • webpack-cli ver.4.7.2

Uploading a ReadableStream (failed)

Because both client-side and server-side JavaScript have streams, I thought I could send the file data through them.

[Client] index.html

<!DOCTYPE html>
<html lang='en'>
    <head>
        <title>Hello</title>
        <meta charset="utf8">
    </head>
    <body>
        <input type="file" id="upload_file_input">
        <button onclick="Page.upload()">Upload</button>
        <script src="./js/main.page.js"></script>
    </body>
</html>
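The button's onclick handler calls Page.upload(), so the bundled script has to expose its exports as a global object named Page. The post doesn't show the build settings; a minimal webpack.config.ts that would produce this layout (the entry path and mode are my assumptions) could look like this:

// webpack.config.ts - assumed build settings, not shown in the post
// (webpack can load a .ts config because ts-node is installed)
import * as path from 'path';
import { Configuration } from 'webpack';

const config: Configuration = {
    mode: 'development',
    entry: './clients/main.page.ts',
    module: {
        rules: [{ test: /\.ts$/, use: 'ts-loader', exclude: /node_modules/ }],
    },
    resolve: {
        extensions: ['.ts', '.js'],
    },
    output: {
        filename: 'main.page.js',
        path: path.resolve(__dirname, 'clients/public/js'),
        // Expose the module's exports as a global "Page" object,
        // so onclick="Page.upload()" can find the function.
        library: 'Page',
    },
};
export default config;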

[Client] main.page.ts


export async function upload() {
    const fileElement = document.getElementById('upload_file_input') as HTMLInputElement;
    if(fileElement?.files == null ||
        fileElement.files.length <= 0) {
        alert('No files were selected');
        return;
    }
    const readData = await fileElement.files[0].arrayBuffer();
    if(readData == null || readData.byteLength <= 0) {
        return 'Failed reading';
    }
    const buffer = new Uint8Array(readData);
    const chunkSize = 100;
    // Keep the read position outside of "pull", because "pull" can be
    // called many times while the stream is being consumed.
    let index = 0;
    const stream = new ReadableStream({
        pull: (controller) => {
            if(index >= buffer.byteLength) {
                controller.close();
                return;
            }
            const end = Math.min(index + chunkSize, buffer.byteLength);
            // Enqueue raw bytes; a request body stream expects Uint8Array chunks.
            controller.enqueue(buffer.subarray(index, end));
            index = end;
        },
    });

    const response = await fetch('files', {
        method: 'POST',
        mode: 'cors',
        headers: {
            'Content-Type': 'application/octet-stream'
        },
        body: stream
    });
    if(response.ok) {
        console.log('OK');
        console.log(await response.text());
    } else {
        console.error('Failed');
    }
}

[Server] index.ts

import express from 'express';
import fs from 'fs';

const port = 3000;
const app = express();
app.use(express.static('clients/public'));
app.post('/files', (req, res) => {
    let buffer: Buffer|null = null;
    // Collect the request body chunk by chunk.
    req.on('data', (chunk) => {
        if(buffer == null) {
            buffer = Buffer.from(chunk);
        } else {
            buffer = Buffer.concat([buffer, chunk]);
        }
    });

    // Respond only after the whole body has arrived.
    req.on('end', () => {
        fs.writeFile('sample.png', buffer as Buffer, err => {
            if(err != null) {
                console.error(err);
                res.status(500).send('Failed');
                return;
            }
            res.send('OK');
        });
    });
});

app.listen(port, () => {
    console.log(`Example app listening at http://localhost:${port}`)
});

The problem was the "data" event of request of "app.post('/files')" was fired only one time.
Maybe it was because I hadn't been able to send Readable stream as request body in the Web brownsers(ex. Edge, Firefox).
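As a side note, newer Chromium-based browsers can send a ReadableStream as a request body when the request is explicitly marked as half-duplex, though Firefox still cannot, so this remains a sketch rather than a portable solution:

const response = await fetch('files', {
    method: 'POST',
    headers: {
        'Content-Type': 'application/octet-stream'
    },
    body: stream,
    // Required for request-body streaming; "duplex" is not yet part of
    // TypeScript's RequestInit typings, hence the cast.
    duplex: 'half',
} as RequestInit);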

Uploading a single file

I also tried uploading the file directly.

[Client] main.page.ts

export async function upload() {
    const fileElement = document.getElementById('upload_file_input') as HTMLInputElement;
    if(fileElement?.files == null ||
        fileElement?.files.length <= 0) {
        alert('No files were selected');
        return;
    }    
    const response = await fetch('files', {
        method: 'POST',
        mode: 'cors',
        headers: {
            'Content-Type': 'application/octet-stream'
        },
        body: fileElement.files[0]
    });
    if(response.ok) {
        console.log('OK');
        console.log(await response.text());
    } else {
        console.error('Failed');
    }
}

The "data" event was called two or more times and I could save the uploaded file.
I also could use converted data(ex. ArrayBuffer, Uint8Array).

So when I need uploading large files, I can slice them and upload as same as in ASP.NET Core applications.

Uploading sliced files

Now I will try uploading sliced files to the Express application.

Specs

  1. [Client] Select a file.
  2. [Client] Slice 1. into small (1 KB) blobs.
  3. [Client] Send the file name to the server when uploading starts.
  4. [Server] Create a directory for saving the sliced blobs.
  5. [Client] Send the sliced blobs.
  6. [Server] Receive 5. and save them into 4.
  7. [Client] Tell the server that all sliced blobs have been uploaded.
  8. [Server] Merge all sliced blobs and generate a file.
  9. [Server] Remove 4.

file.types.ts

export type ActionResult = {
    succeeded: boolean,
    errorMessage: string,
}

[Client] main.page.ts

import { ActionResult } from "./file.types";

export async function upload() {
    const fileElement = document.getElementById('upload_file_input') as HTMLInputElement;
    if(fileElement?.files == null ||
        fileElement?.files.length <= 0) {
        alert('No files were selected');
        return;
    }
    const readData = await fileElement.files[0].arrayBuffer();
    if(readData == null || readData.byteLength <= 0) {
        return 'Failed reading';
    }
    const startResult = await startUploading(fileElement.files[0].name);
    if(startResult.result.succeeded === false) {
        alert(startResult.result.errorMessage);
        return;
    }
    const uploadResult = await uploadChunks(readData, startResult.folderName);
    if(uploadResult.succeeded === false) {
        alert(uploadResult.errorMessage);
        return;
    }
    const postResult = await postUploading(startResult.folderName, fileElement.files[0].name);
    if(postResult.succeeded === false) {
        alert(postResult.errorMessage);
        return;
    }
    alert('OK');
}
async function startUploading(fileName: string): Promise<{ result: ActionResult, folderName: string }> {
    const startResponse = await fetch('files/start', {
        method: 'POST',
        mode: 'cors',
        headers: {
            'Content-Type': 'application/json'
        },
        body: JSON.stringify({ fileName }),
    });
    // "json()" already returns a parsed object; no extra conversion is needed.
    return await startResponse.json() as { result: ActionResult, folderName: string };
}
async function uploadChunks(fileData: ArrayBuffer, folderName: string): Promise<ActionResult> {
    let index = 0;
    const chunkSize = 1024;
    const buffer = new Uint8Array(fileData);
    // Send the file 1 KB at a time; the last chunk may be shorter.
    while(index < buffer.byteLength) {
        const end = Math.min(index + chunkSize, buffer.byteLength);
        const response = await fetch('files/chunk', {
            method: 'POST',
            mode: 'cors',
            headers: {
                'folderName': folderName,
                'index': index.toString(),
                'Content-Type': 'application/octet-stream'
            },
            body: new Blob([buffer.subarray(index, end)])
        });
        const result = await response.json() as ActionResult;
        if(result.succeeded === false) {
            return result;
        }
        index = end;
    }
    console.log('end');
    return { succeeded: true, errorMessage: '' };
}
async function postUploading(folderName: string, fileName: string): Promise<ActionResult> {
    const response = await fetch('files/end', {
        method: 'POST',
        mode: 'cors',
        headers: {
            'Content-Type': 'application/json'
        },
        body: JSON.stringify({ folderName, fileName }),
    });
    return await response.json() as ActionResult;
}
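Note that the "index" header increases by chunkSize, so the chunk files the server saves (shown next) are named after byte offsets rather than sequence numbers. For example, a 3 KB file would leave a temporary layout like this before "files/end" merges it (the folder name is illustrative):

tmp/
└── 20210620113052123_sample.png/   <- created by "files/start"
    ├── 0_value                     <- bytes 0-1023
    ├── 1024_value                  <- bytes 1024-2047
    └── 2048_value                  <- bytes 2048-3071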

[Server] index.ts

import express from 'express';
import fs from 'fs';
import moment from 'moment';
import * as actionResults from './actionResultFactory';

const port = 3000;
const app = express();
// To receive JSON values from the client-side
app.use(express.json());
// To receive binary (Blob) values from the client-side.
// The default size limit of "express.raw()" is 100 KB, which is plenty for the 1 KB chunks used here.
app.use(express.raw());
app.use(express.static('clients/public'));
app.post('/files/start', async (req, res) => {
    // "express.json()" has already parsed the body into an object.
    const startUploading = req.body as { fileName: string };
    const folderName = await createDirectory(startUploading.fileName);
    res.json({
        result: actionResults.getSucceeded(),
        folderName,
    });
});
app.post('/files/chunk', (req, res) => {
    // Node.js lowercases incoming header names, so "folderName" arrives as "foldername".
    // NOTE: in a real application the folder name should be validated,
    // because it is used as part of a file path.
    const itemIndex = req.headers['index'];
    const saveDirectory = req.headers['foldername'];
    if(itemIndex == null ||
        saveDirectory == null) {
        res.json(actionResults.getFailed('No data'));
        return;
    }
    fs.promises.writeFile(`tmp/${saveDirectory}/${itemIndex}_value`, Buffer.from(req.body))
        .then(_ => res.json(actionResults.getSucceeded()))
        .catch(err => res.json(actionResults.getFailed(err)));
});
app.post('/files/end', async (req, res) => {
    const savedTmpFiles = req.body as { folderName: string, fileName: string };
    const savedDirectory = `tmp/${savedTmpFiles.folderName}`;
    const dirs = await fs.promises.readdir(savedDirectory, { withFileTypes: true });
    // Pick only the chunk files whose names match "<index>_value".
    const files = dirs.filter(d => /^[0-9]+_value$/.test(d.name)).map(d => {
        return { index: parseInt(d.name.split('_')[0], 10), name: d.name };
    });
    let buffer: Buffer|null = null;
    for(const d of files.sort((a, b) => a.index - b.index)) {
        const newBuffer = await fs.promises.readFile(`${savedDirectory}/${d.name}`);
        if(buffer == null) {
            buffer = newBuffer;
        } else {
            buffer = Buffer.concat([buffer, newBuffer]);
        }
    }
    fs.promises.writeFile(`tmp/${savedTmpFiles.fileName}`, buffer as Buffer)
        .then(_ => fs.promises.rm(savedDirectory, { force: true, recursive: true }))
        .then(_ => res.json(actionResults.getSucceeded()))
        .catch(err => res.json(actionResults.getFailed(err)));
});

app.listen(port, () => {
    console.log(`Example app listening at http://localhost:${port}`)
});
async function createDirectory(fileName: string): Promise<string> {
    if((await exists('tmp')) === false) {
        await fs.promises.mkdir('tmp');
    }
    // "SSS" is moment's token for milliseconds.
    const folderName = `${moment(Date.now()).format('YYYYMMDDHHmmssSSS')}_${fileName}`;
    await fs.promises.mkdir(`tmp/${folderName}`);
    return folderName;
}
async function exists(path: string): Promise<boolean> {
    try {
        await fs.promises.stat(path);
        return true;
    } catch {
        return false;
    }
}
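The server imports ./actionResultFactory, which the post doesn't show. Judging from how it is called, a minimal implementation could be as small as this (reconstructed, and it assumes file.types.ts is shared with the server):

// actionResultFactory.ts - assumed implementation, reconstructed from the calls above
import { ActionResult } from './file.types';

export function getSucceeded(): ActionResult {
    return { succeeded: true, errorMessage: '' };
}

export function getFailed(error: unknown): ActionResult {
    return { succeeded: false, errorMessage: String(error) };
}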
