Brendon O'Neill

Handling large file uploads

Introduction

We've all been there: we learn how to upload form data and feel like we can upload anything using forms. Then we try to upload a large file to a server, receive a 413 status code (Content Too Large), and the panic sets in. So today I'm going to show you how I handle large file uploads to a server. This isn't the only way to upload files; it's just the approach I find myself reusing in most projects when working with large files.

Preparation

This is a normal form that we would use when uploading information to a server:

<form enctype="multipart/form-data" id="form">
    <label for="userName"> Name
        <input type="text" name="userName" id="userName">
    </label>
    <label for="userName"> Email
        <input type="email" name="userEmail" id="userEmail">
    </label>
    <label for="userFiles"> Files
        <input type="file" name="userFiles" accept="video/*"
        id="userFiles">
    </label>
    <input type="submit" value="Submit" id="formSubmit">
</form>

And this is the JavaScript that handles the form submission:

const submit = document.getElementById("formSubmit");
const form = document.getElementById("form");


submit.addEventListener("click", (event) => {
    event.preventDefault();
    const formData = new FormData(form);
    uploadFiles(formData);
})

async function uploadFiles(formData)
{
    try {
        let res = await fetch('/server',{method:"POST", body:formData, 
        headers: { /* Add Headers */ }})
        if(res.ok)
        {
            // Inform user of successful upload
        }
        else
        {
            throw new Error("Form upload failed")
        }
    } catch (error) {
        console.error(error)
    }
} 


This creates a FormData object that gathers the information from our named inputs and sends it to the server for processing. If the request fails, an error is logged to the console.

This is a nice and easy way to upload files, but what if you have a large file? Do we prevent users from sending large files? What if we have no choice but to handle one? Yes, we could just send it as it is, but what if the upload fails halfway through?

These were the questions I asked myself while working on a project to upload files for a wedding, and what I used in the end worked without any problems. The first thing I did in that project was handle the files users uploaded. When a user adds a file or files to a form, they are added to the file input's FileList object. If you console.log that object, it looks something like this:

FileList {0: File, 1: File, 2: File, length: 3}

0 : File {name: 'test4.webm', lastModified: 1754044929570,
lastModifiedDate: Fri Aug 01 2025 11:42:09 GMT+0100 (Irish Standard Time),
webkitRelativePath: '', size: 3667214, }

1 : File {name: 'test3.wmv', lastModified: 1754044912367,
lastModifiedDate: Fri Aug 01 2025 11:41:52 GMT+0100 (Irish Standard Time),
webkitRelativePath: '', size: 9338829, }

2 : File {name: 'test.mp4', lastModified: 1754044867215, lastModifiedDate:
Fri Aug 01 2025 11:41:07 GMT+0100 (Irish Standard Time),
webkitRelativePath: '', size: 17839845, }

length : 3

This is an object that contains each file added to the file input. Each numbered entry is a File object: these objects contain information about the file (name, size, type, last modified date) as well as the raw data of the file itself. File objects are a subclass of Blob, which means we, as developers, can use both the metadata and the raw data to decide how to upload each file.
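As a quick, hypothetical sketch (not something from the original project), this is how you could read that metadata off one of those File objects:

// Assuming `file` is one of the File objects from the FileList above
console.log(file.name);            // 'test.mp4'
console.log(file.type);            // e.g. 'video/mp4'
console.log(file.size);            // size of the raw data in bytes
console.log(file.lastModified);    // timestamp in milliseconds
console.log(file instanceof Blob); // true, because File is a subclass of Blob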

When I am dealing with files, I add an event listener to the file input so that when the selected files change, they are added to an array.

let inputFiles = document.getElementById("userFiles");
let files = [];

inputFiles.addEventListener("change", (event) => {
    const fileList = event.target.files;
    for (let i = 0; i < fileList.length; i++) {
        files.push(fileList[i]);
    }
    // files = [File, File, File]
})

Chunking

We now have a nice collection of files in our array that we can loop over to send to our server, or modify if a file is too large. Next, I check the size of each File object. This is stored in the size property in bytes, so we can decide to convert to megabytes or keep it as bytes. The way I handled my media files was: if a file was under 10 megabytes, I left it alone, as it was small enough to upload to my server directly; if it was over, I passed the file into a chunking function.
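As a small sketch (the threshold constant and helper name are my own, not from the project), the size check could look something like this:

// The size property is in bytes, so 10MB is 10 * 1024 * 1024 bytes
const TEN_MB = 10 * 1024 * 1024;

function isLargeFile(file)
{
    // Convert to megabytes purely for display; the comparison itself stays in bytes
    console.log(`${file.name} is ${(file.size / (1024 * 1024)).toFixed(2)} MB`);
    return file.size > TEN_MB;
}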

If you have never chunked a file before, it is basically just making smaller files from a larger file by dividing the raw data into separate Blob or File objects. When chunking a file, you have to plan how the data will look after dividing it, and how to maintain the metadata and order of the pieces.
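The primitive that makes this possible is Blob.slice(), which every File object inherits. A minimal sketch:

// `file` is a File object; slice(start, end) returns a Blob with the bytes [start, end)
const ONE_MB = 1024 * 1024;
const firstPiece = file.slice(0, ONE_MB);
const secondPiece = file.slice(ONE_MB, ONE_MB * 2);
// Slicing does not read the whole file into memory up front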

This is an important point: make sure the chunked files are ordered and labelled so that when the server or bucket handles them, it can reassemble them in the correct order. If they are not sorted, the result will be a corrupt file.

First, decide on a size for the chunks. I went with 8MB, but you can pick your own size, as this just decreases or increases the number of files created. Then plan out the returned object: we want to keep it similar to our smaller files, so that we have the same information when uploading. This is the function I created to chunk the files in my project:


function chunkFile(file)
{
    let mediaChunks = [];
    let start = 0;
    let chunkSize = (1024 * 1024) * 8; // 8MB chunks
    let end = file.size;
    let id = 0;
    while(start < end)
    {
        id += 1;
        let chunkEnd = start + chunkSize;
        if(chunkEnd > end)
        {
            chunkEnd = end;
        }
        let chunk = new File([file.slice(start, chunkEnd)], file.name, {type: file.type});
        mediaChunks.push(chunk);
        start = start + chunkSize;
    }

    return {mediaChunks: mediaChunks, type: file.type, name: file.name,
    numberOfChunks: id, size: file.size};
}

In this function, you pass in the file and create a few variables to keep track of the process. We have an array called mediaChunks, which is where I store the newly created files for uploading. The start value keeps track of where each new slice begins as we cut the file up. The chunkSize is the size of each file I want to create and store in the array. The end value tracks how big the original file is, so that we don't add empty memory and corrupt the file. Lastly, id tracks the number of files created.

The while loop slices the file, generates the new files, and adds them to the array. When we reach the last chunk, we have to make sure that chunkEnd doesn't go past the file size, as this would add empty memory to the file and could corrupt it.

We then return an object with the file's information along with the chunks, as this is important when recreating the file once it has been uploaded to the server.
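For example, calling the function on the roughly 17MB test.mp4 from earlier, with 8MB chunks, would return something along these lines:

let chunked = chunkFile(files[2]); // test.mp4, 17839845 bytes

// chunked.numberOfChunks -> 3 (two 8MB chunks and one ~1MB chunk)
// chunked.mediaChunks    -> [File, File, File]
// chunked.name, chunked.type and chunked.size mirror the original file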

Uploading

So, with all that, this is what our code looks like:

let inputFiles = document.getElementById("userFiles");
let files = [];


inputFiles.addEventListener("change", (event) => {
    const fileList = event.target.files;
    for (let i = 0; i < fileList.length; i++) {
        files.push(fileList[i]);
    }

    for (let i = 0; i < files.length; i++) {
        // 2MB as we are dealing with small videos        
        if(files[i].size > 2097152)
        {
            let newFile = chunkFile(files[i]);
            files[i] = newFile;
        }
    }
})

function chunkFile(file)
{
    let mediaChunks = [];
    let start = 0;
    let chunkSize = 2097152; // 2MB chunks to match the threshold above
    let end = file.size;
    let id = 0;
    while(start < end)
    {
        id += 1;
        let chunkEnd = start + chunkSize;
        if(chunkEnd > end)
        {
            chunkEnd = end;
        }
        let chunk = new File([file.slice(start, chunkEnd)], file.name, {type: file.type});
        mediaChunks.push(chunk);
        start = start + chunkSize;
    }

    return {mediaChunks: mediaChunks, type: file.type, name: file.name,
    numberOfChunks: id, size: file.size};
}


Our next problem is how to upload each file. Uploading the files that are below our size limit is as simple as the first example, with a few tweaks:

const submit = document.getElementById("formSubmit");
const userName = document.getElementById("userName");
const userEmail = document.getElementById("userEmail");

submit.addEventListener("click",  async (event) => {
    event.preventDefault();
    let uploadUserName = userName.value;
    let uploadUserEmail = userEmail.value;
    for (let i = 0; i < files.length; i++) {
        await uploadFiles(uploadUserName, uploadUserEmail, files[i]);
    }
})

async function uploadFiles(uploadUserName, uploadUserEmail, file)
{
    console.log(uploadUserName, uploadUserEmail, file);
    let formData = new FormData();
    formData.append("userName",uploadUserName);
    formData.append("userEmail",uploadUserEmail);
    formData.append("file",file);
    try {
        let res = await fetch('/server',{method:"POST", body:formData,
        headers: { /* Add Headers */ }});
        if(res.ok)
        {
            // Inform user of successful upload
        }
        else
        {
            throw new Error("Form upload failed");
        }
    } catch (error) {
        console.error(error);
    }
} 


This would upload each file along with the other form information, and we would have a nice working upload, but we are dealing with large files. We will add to this to allow for larger files, as we need to loop through each chunk connected to a file.

As we are dealing with both small and large files, we will create two upload functions: the one we already have, and another to handle larger files. When the user submits, a for loop checks each file; if it is under 2MB, it uses the first upload function, otherwise it uses the second.

for (let i = 0; i < files.length; i++) {
     if(files[i].size <= 2097152)
     {
        await uploadFiles(uploadUserName, uploadUserEmail, files[i]);
     }
     else
     {
        await largeUploadFiles(uploadUserName, uploadUserEmail, files[i]);
     }
}

When we get a large file, we have to treat it differently from the smaller files, as we don't have just one file, we have an array of chunks. In this basic example, we will simply loop through the chunks and send them to the server. Before we do that, we inform the server that we are starting an upload, with information about the file being uploaded. We then loop over the chunks, sending them one by one and identifying each with an id and its order, so that the server can place them in the right order to recreate the file. Once all chunks are sent, we make a final call to the server to confirm that it has all the pieces and can start its processing.

async function largeUploadFiles(uploadUserName, uploadUserEmail, file)
{
    try {
        let fileInfo = new FormData();
        fileInfo.append("userName",uploadUserName);
        fileInfo.append("userEmail",uploadUserEmail);
        fileInfo.append("name",file.name);
        let res = await fetch('/server-large-setUp',{method:"POST",
        body:fileInfo, headers: { /* Add Headers */ }});
        if(res.ok)
        {
            // Server is ready to receive the chunks
        }
        else
        {
            throw new Error("Form upload failed");
        }
    } catch (error) {
        console.error(error);
    }
    for (const [index, chunk] of file.mediaChunks.entries())
    {
        let formData = new FormData();
        formData.append("name", file.name);
        formData.append("file", chunk);
        formData.append("id", index + 1);
        try {
            let res = await fetch('/server-large',{method:"POST",
            body:formData, headers: { /* Add Headers */ }});
            if(res.ok)
            {
                // Chunk uploaded successfully, could update a progress indicator here
            }
            else
            {
                throw new Error("Form upload failed");
            }
        } catch (error) {
            console.error(error);
        }
    }
    try {
        let chunkInfo = new FormData();
        chunkInfo.append("chunks",file.numberOfChunks);
        chunkInfo.append("name",file.name);
        let res = await fetch('/server-large-finished',{method:"POST",
        body:chunkInfo, headers: { /* Add Headers */ }});
        if(res.ok)
        {
            // Inform user of successful upload
        }
        else
        {
            throw new Error("Form upload failed");
        }
    } catch (error) {
        console.error(error);
    }
}

Conclusion

That's it, you have created an upload form that can handle large files. But you don't have to keep it that simple. You could add a backoff to the upload: each time an upload fails, try again with an increasing pause to give the server time to deal with its blocking task, and after 3 attempts show the user an error. And if uploading the chunks one by one is not fast enough, you could create a system that uploads the chunks in groups, since we already have them ordered and the server can handle the ordering.
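As a rough illustration (not part of the original project), a retry helper with an increasing pause could look something like this, where uploadChunk is assumed to be any async function that performs one of the fetch calls above and throws on failure:

async function uploadWithBackoff(uploadChunk, maxAttempts = 3)
{
    for (let attempt = 1; attempt <= maxAttempts; attempt++) {
        try {
            return await uploadChunk();
        } catch (error) {
            if (attempt === maxAttempts) {
                // After the final attempt, surface the error to the user
                throw error;
            }
            // Wait longer after each failure: 1s, 2s, 4s, ...
            const pause = 1000 * 2 ** (attempt - 1);
            await new Promise((resolve) => setTimeout(resolve, pause));
        }
    }
}

// Uploading chunks in groups could then be as simple as Promise.all over a slice, e.g.
// await Promise.all(group.map((chunk) => uploadWithBackoff(() => sendChunk(chunk))));
// where sendChunk is a hypothetical function wrapping the /server-large fetch call.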

This was just a basic example to help you get your feet wet in the world of larger files. I might make a follow-up blog about handling large files in Node.js, covering how to receive incoming chunks and recreate the file, or about streaming files from one server to another to be encoded with FFmpeg inside a Docker container.
