Aamir saleem lone

Scalable Resumable File Uploads to S3 with Node.js, React.js, AWS SDK & MySQL

Uploading large files reliably is tough: a browser crash, a flaky connection, or a sudden interruption can force your users to start the whole upload over.

This tutorial shows how to implement resumable, chunked uploads to Amazon S3, using:

  • 🧠 The AWS SDK's S3 multipart upload API
  • 🪣 Node.js + Express + Multer
  • 💾 MySQL for upload session tracking
  • ✅ Clean chunk validation & completion logic

🔧 Tech Stack

  • Backend: Node.js + Express
  • Frontend: React.js
  • Database: MySQL (for session/chunk tracking)
  • Storage: AWS S3 (multipart upload)
  • Middleware: Multer

🗂️ Core Concepts

  • Init Upload: creates a multipart upload session on S3 and records it in the DB
  • Upload Chunk: sends each chunk to S3 with its part number
  • Check Chunk: avoids re-uploading chunks that already exist
  • Complete Upload: merges the parts on S3 and cleans up the session rows
  • Fallback: the resumable client automatically retries failed chunks
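
These concepts map onto three AWS SDK calls. Here is a minimal sketch of the sequence, using the v2 SDK's promise API (the same one the controllers below use); the bucket name, key, and chunkBuffer are placeholders:

const AWS = require('aws-sdk');
const s3 = new AWS.S3();

async function multipartSketch(chunkBuffer) {
  // 1) Init: S3 hands back an UploadId that identifies the session.
  const init = await s3
    .createMultipartUpload({ Bucket: 'my-bucket', Key: 'video/demo.mp4' })
    .promise();

  // 2) Upload each chunk with a 1-based part number.
  //    Every part except the last must be at least 5 MiB.
  const part = await s3
    .uploadPart({
      Bucket: 'my-bucket',
      Key: 'video/demo.mp4',
      UploadId: init.UploadId,
      PartNumber: 1,
      Body: chunkBuffer,
    })
    .promise();

  // 3) Complete: S3 stitches the parts together using their ETags.
  await s3
    .completeMultipartUpload({
      Bucket: 'my-bucket',
      Key: 'video/demo.mp4',
      UploadId: init.UploadId,
      MultipartUpload: { Parts: [{ PartNumber: 1, ETag: part.ETag }] },
    })
    .promise();
}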

📁 Folder Structure

/helpers/awsConfig.js
/config/db.js
/controllers/uploadController.js
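
The two helpers imported by the controller are never shown in the post, so here is a minimal sketch of what they might look like. The v2 AWS SDK is assumed (the controllers call .promise()), and the MySQL credentials are placeholders read from the environment:

// helpers/awsConfig.js
const AWS = require('aws-sdk');

// Credentials are typically resolved from the environment or an IAM role.
const S3Config = new AWS.S3({ region: process.env.AWS_REGION || 'us-east-1' });

module.exports = { S3Config };

// config/db.js
const mysql = require('mysql2/promise');

const pool = mysql.createPool({
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME,
});

// Thin wrapper so callers get the rows array directly.
const executeQuery = async (sql, params = []) => {
  const [rows] = await pool.execute(sql, params);
  return rows;
};

module.exports = { executeQuery };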

🚀 Upload Controller Code

const path = require('path');
const { S3Config } = require('../helpers/awsConfig');
const bucketName = 'docintel';
const mysqlLib = require('../config/db');

// Wraps an async route handler so rejected promises become JSON 500 responses.
const asyncHandler = (fn) => (req, res) => {
  Promise.resolve(fn(req, res)).catch((err) => {
    console.error(err);
    res.status(500).json({ error: err.message });
  });
};

📂 Get Folder Name Based on Type

const getFolderName = async (type, method = null, authId) => {
  // Library uploads go into a per-user folder looked up from the users table.
  if (method === 'libraryType') {
    const [data] = await mysqlLib.executeQuery(
      'SELECT folder_name FROM users WHERE id = ?',
      [authId]
    );
    if (!data) throw new Error('User folder not found');
    return `video/${data.folder_name}`;
  }

  const map = {
    image: 'dummy_images',
    video: 'dummy_videos',
    audio: 'dummy_audios',
    document: 'dummy_documents',
  };

  return map[type] || map.document;
};
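
For example (the user id 42 is just an illustration):

// Inside an async context:
await getFolderName('image');                    // → 'dummy_images'
await getFolderName('video', 'libraryType', 42); // → 'video/<folder_name for user 42>'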

🔄 API Endpoints
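
Before the handlers themselves: the post never shows the router, but the handlers read req.file, which implies Multer with in-memory storage in front of the file routes. A hypothetical wiring, with paths chosen to match the frontend section further down (the single-file route name is an assumption):

// routes/upload.js
const express = require('express');
const multer = require('multer');

const controller = require('../controllers/uploadController');

// memoryStorage keeps each uploaded chunk in RAM as a Buffer (req.file.buffer).
const upload = multer({ storage: multer.memoryStorage() });
const router = express.Router();

router.post('/api/upload-file', upload.single('file'), controller.uploadFileToAws);
router.post('/api/init-upload', controller.initUpload);
router.get('/api/check-chunk', controller.checkUpload);
router.post('/api/upload', upload.single('file'), controller.uploadChunk);
router.post('/api/complete-upload', controller.completeUpload);

module.exports = router;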

1️⃣ Upload Single File to S3 (non-chunked)

exports.uploadFileToAws = asyncHandler(async (req, res) => {
  const { type, name } = req.body;
  const file = req.file;
  if (!file) throw new Error('No file uploaded');

  const ext = path.extname(file.originalname).toLowerCase();
  if (!/\.(jpeg|jpg|png|gif|svg|mp3|mp4|pdf)$/.test(ext)) {
    throw new Error('Unsupported file type');
  }

  const folderName = await getFolderName(type);
  // Fall back to the original filename if no display name was sent.
  const fileName = `${path.parse(name || file.originalname).name}_${Date.now()}${ext}`.replace(/\s+/g, '_');
  const fullPath = `${folderName}/${fileName}`;

  const result = await S3Config.upload({
    Bucket: bucketName,
    Key: fullPath,
    Body: file.buffer,
    ACL: 'public-read',
  }).promise();

  res.status(200).json({ message: 'File uploaded successfully', data: result.Location });
});

2️⃣ Initialize Multipart Upload

exports.initUpload = asyncHandler(async (req, res) => {
  const { authId, body } = req; // authId is assumed to be set by auth middleware
  const { filename, type, method } = body;
  if (!filename || !type) throw new Error('Missing filename or type');

  const folderName = await getFolderName(type, method, authId);
  const s3Key = `${folderName}/${Date.now()}_${filename}`;

  const { UploadId: uploadId } = await S3Config.createMultipartUpload({
    Bucket: bucketName,
    Key: s3Key,
    ACL: 'public-read',
  }).promise();

  await mysqlLib.executeQuery(
    'INSERT INTO upload_sessions (upload_id, s3_key, type) VALUES (?, ?, ?)',
    [uploadId, s3Key, type]
  );

  res.status(200).json({ uploadId, s3Key });
});

3️⃣ Check if Chunk Exists

exports.checkUpload = asyncHandler(async (req, res) => {
  // Hit by the client before each chunk (resumable.js testChunks) to skip parts already on S3.
  const { resumableChunkNumber, uploadId } = req.query;

  const [session] = await mysqlLib.executeQuery(
    'SELECT * FROM upload_sessions WHERE upload_id = ?',
    [uploadId]
  );
  if (!session) return res.status(404).send('Upload session not found');

  const [part] = await mysqlLib.executeQuery(
    'SELECT * FROM upload_parts WHERE upload_id = ? AND part_number = ?',
    [uploadId, resumableChunkNumber]
  );

  return part ? res.status(200).send('Chunk exists') : res.status(404).send('Chunk not found');
});

4️⃣ Upload Chunk to S3

exports.uploadChunk = asyncHandler(async (req, res) => {
  const { resumableChunkNumber, uploadId } = req.body;

  const [session] = await mysqlLib.executeQuery(
    'SELECT * FROM upload_sessions WHERE upload_id = ?',
    [uploadId]
  );
  if (!session) throw new Error('Invalid upload session');

  const partNumber = parseInt(resumableChunkNumber, 10);
  // Note: S3 requires every part except the last to be at least 5 MiB,
  // so the client's chunk size must respect that (the frontend uses 5 MB chunks).
  const result = await S3Config.uploadPart({
    Bucket: bucketName,
    Key: session.s3_key,
    PartNumber: partNumber,
    UploadId: uploadId,
    Body: req.file.buffer,
  }).promise();

  await mysqlLib.executeQuery(
    'INSERT INTO upload_parts (upload_id, part_number, etag) VALUES (?, ?, ?)',
    [uploadId, partNumber, result.ETag]
  );

  res.status(200).json({
    message: 'Chunk uploaded',
    ETag: result.ETag,
    PartNumber: partNumber,
  });
});

5️⃣ Complete Multipart Upload

exports.completeUpload = asyncHandler(async (req, res) => {
  const { uploadId } = req.body;

  const [session] = await mysqlLib.executeQuery(
    'SELECT * FROM upload_sessions WHERE upload_id = ?',
    [uploadId]
  );
  if (!session) throw new Error('Invalid upload session');

  const parts = await mysqlLib.executeQuery(
    'SELECT part_number, etag FROM upload_parts WHERE upload_id = ? ORDER BY part_number ASC',
    [uploadId]
  );

  const sortedParts = parts.map(part => ({
    PartNumber: parseInt(part.part_number),
    ETag: part.etag
  }));

  const result = await S3Config.completeMultipartUpload({
    Bucket: bucketName,
    Key: session.s3_key,
    UploadId: uploadId,
    MultipartUpload: { Parts: sortedParts },
  }).promise();

  // Optional cleanup
  await mysqlLib.executeQuery('DELETE FROM upload_parts WHERE upload_id = ?', [uploadId]);
  await mysqlLib.executeQuery('DELETE FROM upload_sessions WHERE upload_id = ?', [uploadId]);

  res.status(200).json({
    message: 'Upload completed',
    url: decodeURIComponent(result.Location),
    key: result.Key,
  });
});
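
One gap in this flow: if a user abandons an upload midway, the multipart session lingers on S3 and you keep paying for the stored parts. A sketch of an abort endpoint (not in the original post), reusing the same session table and the v2 SDK's abortMultipartUpload call:

exports.abortUpload = asyncHandler(async (req, res) => {
  const { uploadId } = req.body;

  const [session] = await mysqlLib.executeQuery(
    'SELECT * FROM upload_sessions WHERE upload_id = ?',
    [uploadId]
  );
  if (!session) throw new Error('Invalid upload session');

  // Tells S3 to discard every part uploaded under this session.
  await S3Config.abortMultipartUpload({
    Bucket: bucketName,
    Key: session.s3_key,
    UploadId: uploadId,
  }).promise();

  await mysqlLib.executeQuery('DELETE FROM upload_parts WHERE upload_id = ?', [uploadId]);
  await mysqlLib.executeQuery('DELETE FROM upload_sessions WHERE upload_id = ?', [uploadId]);

  res.status(200).json({ message: 'Upload aborted' });
});

As a safety net, an S3 lifecycle rule with AbortIncompleteMultipartUpload cleans up sessions that are never explicitly aborted.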

✅ Database Tables (MySQL)

CREATE TABLE `upload_sessions` (
  `id` INT AUTO_INCREMENT PRIMARY KEY,
  `upload_id` VARCHAR(255),
  `s3_key` VARCHAR(512),
  `type` VARCHAR(50),
  `created_at` TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE `upload_parts` (
  `id` INT AUTO_INCREMENT PRIMARY KEY,
  `upload_id` VARCHAR(255),
  `part_number` INT,
  `etag` VARCHAR(255),
  `created_at` TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
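
One optional hardening step (not in the original schema): a unique index on (upload_id, part_number) makes the duplicate-chunk check race-safe, since two concurrent uploads of the same part can't both insert a row:

ALTER TABLE `upload_parts`
  ADD UNIQUE KEY `uniq_upload_part` (`upload_id`, `part_number`);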

🧠 Final Thoughts

Why this works:

  • File uploads won't break on slow connections
  • You can resume failed uploads
  • Saves bandwidth by checking existing chunks
  • Scales beautifully with S3's multipart APIs

⚛️ Frontend Using React.js

We’ll use:

  • Resumable.js for chunked uploads
  • React for the UI
  • Axios for making API calls
  • Dummy API endpoints to simulate a real-world system

🧩 What You'll Build

✅ File validation (size, type)
✅ Chunked upload via multipart upload
✅ Pause/Resume/Cancel support
✅ Real-time progress and speed
✅ Retry on failure
✅ Finalization after upload


📦 Dependencies

Install the following if you haven't:

npm install resumablejs axios react-bootstrap react-circular-progressbar react-toastify

🛠 Backend API Expectations

Assume the following dummy endpoints (you can mock these for dev):

Endpoint              Method  Description
/api/init-upload      POST    Initialize multipart upload
/api/upload           POST    Upload a file chunk
/api/complete-upload  POST    Complete multipart upload
/api/check-chunk      GET     Check if a chunk exists (optional)

📁 Frontend Uploader Component

// ResumableUploader.jsx
import React, { useEffect, useRef, useState } from "react";
import Resumable from "resumablejs";
import axios from "axios";
import { Modal, Button } from "react-bootstrap";
import { CircularProgressbar } from "react-circular-progressbar";
import { toast } from "react-toastify";

const CHUNK_SIZE = 5 * 1024 * 1024; // 5MB
const ALLOWED_TYPES = ["application/pdf", "image/png", "image/jpeg", "video/mp4"];
const MAX_SIZE_MB = 1024;

const ResumableUploader = ({ label, onSuccess }) => {
  const resumableRef = useRef(null);
  const uploadId = useRef(null);
  const s3Key = useRef(null);
  const browseRef = useRef(null);
  const activeUpload = useRef(null);

  const [progress, setProgress] = useState(0);
  const [uploading, setUploading] = useState(false);
  const [status, setStatus] = useState("");

  const initUploadSession = async (file) => {
    try {
      const response = await axios.post("https://dummyapi.com/api/init-upload", {
        filename: file.name,
        type: file.type,
      });
      uploadId.current = response.data.uploadId;
      s3Key.current = response.data.s3Key;
      return true;
    } catch (err) {
      toast.error("Failed to initialize upload.");
      return false;
    }
  };

  const validateFile = (file) => {
    if (!ALLOWED_TYPES.includes(file.type)) {
      toast.error("Unsupported file type");
      return false;
    }
    if (file.size > MAX_SIZE_MB * 1024 * 1024) {
      toast.error("File too large");
      return false;
    }
    return true;
  };

  const completeUpload = async () => {
    try {
      const response = await axios.post("https://dummyapi.com/api/complete-upload", {
        uploadId: uploadId.current,
      });
      toast.success("Upload complete!");
      onSuccess?.(response.data.url);
    } catch (err) {
      toast.error("Upload finalization failed.");
    } finally {
      setUploading(false);
      setProgress(0);
      uploadId.current = null;
      s3Key.current = null;
      activeUpload.current = null;
    }
  };

  const handleFileAdded = async (file) => {
    if (!validateFile(file.file)) {
      resumableRef.current.removeFile(file);
      return;
    }

    setUploading(true);
    const initialized = await initUploadSession(file.file);
    if (!initialized) {
      setUploading(false); // don't leave the modal stuck open if init fails
      return;
    }

    resumableRef.current.opts.query = { uploadId: uploadId.current };
    activeUpload.current = file;
    resumableRef.current.upload();
  };

  const handleProgress = (file) => {
    const percent = Math.floor(file.progress() * 100);
    setProgress(percent);
  };

  const handleSuccess = () => {
    completeUpload();
  };

  const handleError = (file, message) => {
    console.error("Upload error:", message);
    toast.error("Upload failed.");
    setUploading(false);
  };

  useEffect(() => {
    const r = new Resumable({
      target: "https://dummyapi.com/api/upload",
      query: {},
      fileParameterName: "file",
      chunkSize: CHUNK_SIZE,
      simultaneousUploads: 3,
      testChunks: false, // set to true to probe existing chunks (see Notes below)
      maxFiles: 1,
      throttleProgressCallbacks: 1,
      headers: {
        // Replace with your real auth scheme (e.g. an Authorization header).
        auth: "Bearer token",
      },
      fileType: ["pdf", "png", "jpg", "jpeg", "mp4"], // resumable.js expects extensions, not MIME types
    });

    r.assignBrowse(browseRef.current);
    r.on("fileAdded", handleFileAdded);
    r.on("fileProgress", handleProgress);
    r.on("fileSuccess", handleSuccess);
    r.on("fileError", handleError);

    resumableRef.current = r;

    return () => {
      r.cancel();
    };
  }, []);

  return (
    <>
      <Button ref={browseRef} disabled={uploading}>
        {uploading ? "Uploading..." : label}
      </Button>

      <Modal show={uploading} centered backdrop="static">
        <Modal.Header>
          <Modal.Title>Uploading</Modal.Title>
        </Modal.Header>
        <Modal.Body>
          <CircularProgressbar value={progress} text={`${progress}%`} />
          <p className="mt-2">{status}</p>
          <Button variant="danger" onClick={() => resumableRef.current.cancel()}>
            Cancel
          </Button>
        </Modal.Body>
      </Modal>
    </>
  );
};

export default ResumableUploader;

🔧 Notes

  • Replace https://dummyapi.com/api/... with your actual backend endpoints.
  • Use the onSuccess callback to handle the uploaded file URL (e.g., save it to your DB).
  • The resumablejs library manages chunking and retrying for you!
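
With testChunks: false above, the uploader retries within a session but won't skip already-uploaded chunks after a page reload. To get true resume behavior against the backend's /api/check-chunk endpoint, resumable.js can probe each chunk with a GET before sending it. A minimal sketch, assuming the testTarget option from the resumablejs docs (verify against your version):

const r = new Resumable({
  // ...same options as in the component above, plus:
  testChunks: true, // issue a GET per chunk before uploading it
  testTarget: "https://dummyapi.com/api/check-chunk", // should answer 200 if the chunk exists, 404 if not
});

// The uploadId still has to reach the server on every probe:
r.opts.query = { uploadId: "<uploadId from init-upload>" };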

🧪 Example Usage

<ResumableUploader
  label="Upload Video"
  onSuccess={(url) => console.log("Uploaded to:", url)}
/>

🧠 Wrap Up

This setup gives you a robust resumable upload experience: chunk retries, real-time progress feedback, and server-side finalization, all backed by S3's multipart upload API. It's a solid foundation that extends naturally to auth, abort handling, and lifecycle cleanup.



Tags: react aws s3 upload resumablejs axios frontend
