TL;DR
Ran into maximum payload size errors when uploading files to Google Drive after deploying my SvelteKit app. While you can increase the payload limit in your server config, a better solution is to upload files in chunks using Google Drive's resumable upload API. This guide walks through a complete implementation of chunked file uploads, from generating resumable links on the backend to splitting and uploading file chunks from the frontend.
Prerequisites
Before following this guide, you'll need:
- googleapis - The official Google APIs Node.js client
- GOOGLE_CLIENT_ID - Your OAuth 2.0 client ID from Google Cloud Console
- GOOGLE_CLIENT_SECRET - Your OAuth 2.0 client secret from Google Cloud Console
The Problem
I was building a feature for my side project and ran into an issue with maximum payload size. On my dev environment (local), I was able to upload files with no issues. But when I deployed to production, I got an error about the maximum payload size. Basically, the data I was sending from the frontend was bigger than what the server was configured to accept.
SvelteKitError: Content-length of 592917 exceeds limit of 524288 bytes.
Yes, it only accepts 512KB by default... Now, this limit can be raised, but how you do it depends on your adapter. With the node adapter, the limit is controlled by an environment variable rather than an option in svelte.config.js:
# For the node adapter, set BODY_SIZE_LIMIT when starting the server (value in bytes)
BODY_SIZE_LIMIT=10485760 node build # 10MB
With the vercel adapter there's no equivalent setting: the request body limit for serverless functions is a platform limit (around 4.5MB at the time of writing), and adapter options like maxDuration only control how long a function may run, not how much data it accepts.
But if you just crank up the max payload size, there are a few risks you need to be aware of. Your server could be abused, you could face memory issues, and you're opening yourself up to potential DoS attacks. Not ideal.
How I Got Here
I encountered this issue when I was trying to upload a PDF file to my own Google Drive for my app. I had already figured out the authentication part. I was getting the correct token and everything was working smoothly. But then, the size issue hit me in prod.
My solution was to upload the file in chunks instead of all at once. So naturally, I went looking for guides. But honestly? The ones I found were lacking. Even the official Google Drive API docs weren't as helpful as I'd hoped. I also found a library on GitHub, but I didn't want to add an extra dependency if it wasn't really necessary. There were some relevant threads on Stack Overflow, but nothing that really solved my exact problem:
- https://stackoverflow.com/questions/75565974/how-to-upload-a-file-in-chunks-to-google-drive
- https://stackoverflow.com/questions/69943889/how-to-upload-video-in-chunks-to-google-drive-using-vanilla-javascript
I gotta be honest, while looking back for the Stack Overflow links I found while trying to solve this issue, I came across this blog, which is probably closer to what I need lol. But oh well, I'm going to write this anyway.
What really surprised me was that the Google Drive JS library doesn't provide a straightforward way to upload files in chunks, at least at the time of writing. I mean, it's surprising... but also kind of not, if you know what I mean. So I decided to roll my own implementation, and I'm sharing it here in case it helps someone else who's stuck in the same situation.
Just so we're clear, this is not groundbreaking or new. Chunking has been solved. But there are no good guides on how to do it with Google Drive (at least at the time of writing this), so that's why I'm making one.
The Implementation
Setting Up the Function
This is just an example function, but it accepts a FileList object. If you're only trying to upload a single file, you can just change the parameter type to File instead:
const processFile = async (_files: FileList) => {
   // ...
}
In my case, I'm converting the FileList to an array so I can loop through multiple files. Now, I should mention that this is a bit different from my actual implementation. In my app, I'm handling uploads asynchronously so users don't have to wait for one file to finish before the next one starts. But for the sake of keeping this tutorial simple and easy to follow, I'm doing it sequentially:
const processFile = async (_files: FileList) => {
   const filesArray = Array.from(_files);
   // for...of + await keeps the uploads sequential (forEach with an async
   // callback would fire them all at once without waiting)
   for (const _file of filesArray) {
      // Processing logic goes here
   }
}
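If you're curious what the concurrent version might look like, here's a rough sketch (not my exact code); uploadSingleFile is a hypothetical helper that wraps the per-file logic from the rest of this guide:
const processFilesConcurrently = async (_files: FileList) => {
   const filesArray = Array.from(_files);
   // Kick off every upload at once; allSettled means one failed upload
   // doesn't cancel the others
   const results = await Promise.allSettled(
      filesArray.map((file) => uploadSingleFile(file))
   );
   results.forEach((result, index) => {
      if (result.status === 'rejected') {
         console.error(`Upload failed for ${filesArray[index].name}`, result.reason);
      }
   });
}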
Getting the Resumable Link
The first step is to get what Google calls a "resumable link." This is a special URL that Google Drive gives you, which allows you to upload your file in chunks rather than all at once.
Here, I'm calling an endpoint on my own backend (you'll need to change this to match your setup) and passing along the file's name and type, since the backend will need those. This endpoint will fetch the resumable link for us:
const processFile = async (_files: FileList) => {
   const filesArray = Array.from(_files);
   for (const _file of filesArray) {
      // Send the metadata the backend needs to create the resumable link
      // (the field names just have to match what your endpoint reads)
      const linkFormData = new FormData();
      linkFormData.append('fileName', _file.name);
      linkFormData.append('fileType', _file.type);
      const response = await fetch('/api/google-drive/resumable-link', {
         method: 'POST',
         body: linkFormData
      });
      const body = await response.json();
   }
}
Backend: Creating the Resumable Link
Let's take a quick peek at what's happening on the backend at /api/google-drive/resumable-link. Here's basically what you need:
// api/google-drive/resumable-link
import { google } from 'googleapis';

const getGoogleDrive = () => {
   const oauth2Client = new google.auth.OAuth2(
      env.GOOGLE_CLIENT_ID,
      env.GOOGLE_CLIENT_SECRET,
      `${env.YOUR_GOOGLE_CALLBACK_URL}`
   );
   const accessToken = getYourUserAccessToken();
   oauth2Client.setCredentials({ access_token: accessToken });
   return {
      drive: google.drive({
         version: 'v3',
         auth: oauth2Client
      }),
      // Return the token too; we need it later for the raw resumable request
      accessToken
   };
}
// Get the authenticated drive object (and the raw access token)
const { drive, accessToken } = getGoogleDrive();
// Read the file metadata the client sent with the request
// (in a SvelteKit endpoint, `request` is available in the handler)
const formData = await request.formData();
const fileName = formData.get('fileName');
const fileType = formData.get('fileType');
const folderName = "name_of_folder_where_you_want_to_put_the_file_in";
// If you want to upload to a specific folder, use this code
const checkFolderRes = await drive.files.list({
   q: `name='${folderName}' and mimeType='application/vnd.google-apps.folder' and trashed=false`,
   fields: 'files(id, name)'
});
let folderId = null;
// Check if the folder exists. If not, create it
if (checkFolderRes.data.files?.length === 0) {
   const createFolderRes = await drive.files.create({
      requestBody: {
         name: folderName,
         mimeType: 'application/vnd.google-apps.folder'
      },
      fields: 'id'
   });
   folderId = createFolderRes.data.id;
} else {
   folderId = checkFolderRes.data.files[0].id;
}
// Optional: You can check if the file already exists
// so you don't end up with duplicates
const checkFileRes = await drive.files.list({
   q: `'${folderId}' in parents and name='${fileName}' and mimeType='${fileType}' and trashed=false`,
   fields: 'files(id, name)'
});
if ((checkFileRes.data.files?.length ?? 0) > 0) {
   return createSuccessResponse({
      error_msg: "FILE_ALREADY_EXIST"
   });
}
// Build the metadata body once so the Content-Length header matches it exactly
const metadata = JSON.stringify({
   name: fileName,
   parents: [folderId]
});
// IMPORTANT: Pay close attention to the uploadType parameter here
const res = await fetch('https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable', {
   method: 'POST',
   headers: {
      'Authorization': `Bearer ${accessToken}`,
      'Content-Type': 'application/json; charset=UTF-8',
      'Content-Length': String(new TextEncoder().encode(metadata).length)
   },
   body: metadata
});
// The upload URL will be in the Location header
const resumableLink = res.headers.get('Location');
// Store your resumable link either in your database or cache. 
// For this example, I'm storing it in Redis. 
// I would NOT recommend returning the link directly to the client-side.
await redis.set("YOUR_KEY", resumableLink);
return {
   msg: "success"
};
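One detail the snippet above glosses over is what key to actually use. Here's a rough sketch, assuming node-redis v4 and a hypothetical key scheme built from a user ID and the same id the client later sends along with every chunk:
// Sketch only: `userId` and `fileId` are placeholders for whatever
// identifies this upload in your app; `fileId` should match the `id`
// the client sends with each chunk.
const resumableLinkKey = `resumable-link:${userId}:${fileId}`;
// Give the key an expiry; resumable session URLs don't stay valid forever
await redis.set(resumableLinkKey, resumableLink, { EX: 60 * 60 * 24 });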
Frontend: Chunking and Uploading
Now let's hop back to the client-side code. This is where we actually split the file into chunks and send them to our backend:
const processFile = async (_files: FileList) => {
   const filesArray = Array.from(_files);
   for (const _file of filesArray) {
      // Ask our backend for a resumable link for this file
      const linkFormData = new FormData();
      linkFormData.append('fileName', _file.name);
      linkFormData.append('fileType', _file.type);
      const response = await fetch('/api/google-drive/resumable-link', {
         method: 'POST',
         body: linkFormData
      });
      const body = await response.json();
      // Handle if response is an error...
      // This chunk size depends on your server's max payload size.
      // Pay attention to other metadata you're also sending to your server:
      // if your chunk is as big as the max payload size but you also send
      // other data alongside it, you'll still hit the payload size error.
      // Google also requires every chunk except the last to be a multiple
      // of 256KB (256 * 1024 bytes).
      const CHUNK_SIZE = 256 * 1024; // 256KB in bytes
      const totalChunks = Math.ceil(_file.size / CHUNK_SIZE);
      for (let chunkIndex = 0; chunkIndex < totalChunks; chunkIndex++) {
         const start = chunkIndex * CHUNK_SIZE;
         const end = Math.min(start + CHUNK_SIZE, _file.size);
         // Slice the file based on your chunk size
         const chunkBlob = _file.slice(start, end);
         const formData = new FormData();
         formData.append('chunk', chunkBlob);
         formData.append('startByte', start.toString());
         formData.append('endByte', end.toString());
         // This is the TOTAL size of your file, not the size of the chunk
         formData.append('fileSize', _file.size.toString());
         // You probably need some sort of ID to know which resumable link
         // you should use to upload this chunk to
         formData.append('id', SOME_SORT_OF_ID);
         // Send your chunk to your backend
         const uploadResponse = await fetch('/api/google-drive/upload', {
            method: 'POST',
            body: formData
         });
      }
   }
}
Backend: Handling the Chunks
The backend endpoint that receives each chunk will look something like this:
// Read the chunk and its metadata from the incoming form data
// (in a SvelteKit endpoint, `request` is available in the handler)
const formData = await request.formData();
const chunk = formData.get('chunk');
const id = formData.get('id');
const startByte = Number(formData.get('startByte'));
const endByte = Number(formData.get('endByte'));
const fileSize = Number(formData.get('fileSize'));
// Content-Range uses inclusive byte positions, hence the endByte - 1
const contentRange = `bytes ${startByte}-${endByte - 1}/${fileSize}`;
const chunkArrayBuffer = await chunk.arrayBuffer();
const accessToken = getYourUserAccessToken();
// Look up the resumable link we stored when the upload was initiated
const uploadUrl = await redis.get(REDIS_KEY_YOU_STORE_FOR_THE_USER_RESUMABLE_LINK);
const res = await fetch(uploadUrl, {
   method: 'PUT',
   headers: {
      'Content-Range': contentRange,
      'Content-Length': chunkArrayBuffer.byteLength.toString(),
      'Authorization': `Bearer ${accessToken}`,
   },
   body: chunkArrayBuffer
});
// If the response is 200 or 201, that means all chunks have been received
// and the file upload is complete
if (res.status === 200 || res.status === 201) {
   const data = await res.json();
   // Rename on destructure so it doesn't clash with the `id` from the form data
   const { id: driveFileId } = data;
   // Store the Google Drive file ID in your database
   // so you can reference this file later
   return createSuccessResponse({
      success: true,
      complete: true,
   });
}
// If the status is 308, there are still more chunks to be sent
if (res.status === 308) {
   const range = res.headers.get('range');
   return createSuccessResponse({
      success: true,
      complete: false,
      uploaded: range,
   });
}
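One nice thing about resumable uploads: if a chunk fails or the client loses track of where it stopped, you can ask Google how much of the file it has already received and resume from there. Here's a minimal sketch of that status check, reusing the uploadUrl, fileSize, and accessToken values from above (an empty-body PUT with Content-Range: bytes */total is the documented way to query a resumable session):
// Sketch only: query the resumable session for its current status
const statusRes = await fetch(uploadUrl, {
   method: 'PUT',
   headers: {
      // No body: this Content-Range form asks "how much do you have so far?"
      'Content-Range': `bytes */${fileSize}`,
      'Authorization': `Bearer ${accessToken}`,
   }
});
if (statusRes.status === 308) {
   // e.g. "bytes=0-786431" means bytes 0..786431 were received,
   // so the next chunk should start at byte 786432
   const range = statusRes.headers.get('range');
   const nextByte = range ? Number(range.split('-')[1]) + 1 : 0;
   // Send nextByte back to the client so it knows where to resume
}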
That's It!
And there you have it—a complete implementation for uploading large files to Google Drive in chunks. This approach lets you handle files of any size without worrying about payload limits, and it's way more robust than just cranking up your server's max payload size.
Also, if you want to implement some sort of progress bar, you can! Just use the chunk loop as your progress: if a file has 4 chunks in total, then each completed iteration is another 25% of progress. You can figure out the exact code yourself for that one.
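If you want a starting point anyway, here's a rough sketch; updateProgressBar is a hypothetical callback that updates whatever progress UI you have:
for (let chunkIndex = 0; chunkIndex < totalChunks; chunkIndex++) {
   // ...slice and upload the chunk exactly as shown earlier...
   const percent = Math.round(((chunkIndex + 1) / totalChunks) * 100);
   updateProgressBar(_file.name, percent);
}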
I hope this guide helps you out if you're dealing with the same problem. Feel free to adapt it to your own needs!
Btw, this is for my SaaS called EagleCite. It's a modern reference manager, meant to be a Zotero alternative. I'm a researcher, and I sometimes struggle to find my annotations, and Zotero doesn't let me search for them. At least not the way I need. Check it out or tell your academic friends, it would mean a lot :)