Master Secure File Uploads to AWS S3 in Node.js with Express and Multer

Uploading files securely is a key part of protecting user data in any web application. In this blog, we'll walk you through creating a secure file upload system using Node.js, Express, Multer, and AWS S3. We'll focus on best practices to ensure your uploads are safe and your application stays secure. This guide is perfect for beginners who want to learn how to handle file uploads while keeping security a top priority.

Now, let's dive in.

Prerequisites

  • Basic knowledge of Node.js/JavaScript
  • Basic knowledge of AWS and CloudFormation
  • An AWS account. You can create one for free at aws.amazon.com
  • Patience

Step 1: Environment Setup

1. Install Node.js and npm:

  • Download and install Node.js from nodejs.org.
  • Verify the installation:
$ node -v
#v20.18.1
$ npm -v
#10.8.2

2. Install AWS CLI:

  • Download and install the AWS CLI from aws.amazon.com/cli.
  • Configure the AWS CLI with your credentials:
$ aws configure
#AWS Access Key ID [****************H53C]: 
#AWS Secret Access Key [****************1CcY]: 
#Default region name [us-east-1]: 
#Default output format [json]:

Step 2: Create the Node.js Application

1. Initialize a new Node.js project:

$ mkdir node-file-upload-api
$ cd node-file-upload-api
$ npm init -y

2. Install necessary packages:

$ npm install express aws-sdk multer dotenv uuid

3. Create the application file (app.js):

  • Create an app.js file
  • Add the following JavaScript code
  • Here we define a POST endpoint that uploads the incoming file to S3 and generates a presigned URL for the uploaded object.
// app.js
// Import the required packages
const express = require('express');
const multer = require('multer');
const AWS = require('aws-sdk');
const fs = require('fs'); // File system module for streaming and cleaning up temp files
const { v4: uuidv4 } = require('uuid');

// Load environment variables from a .env file
require('dotenv').config();

// Create an Express application
const app = express();

// Create an S3 instance with the specified region
const s3 = new AWS.S3({ region: process.env.AWS_REGION });

// Configure Multer to save uploaded files to the 'uploads/' directory
const upload = multer({ dest: 'uploads/' });

app.post('/upload', upload.single('file'), (req, res) => {
    // Get the uploaded file from the request
    const file = req.file;

    // Reject requests that did not include a file
    if (!file) {
        return res.status(400).send({ message: 'No file provided' });
    }

    // Prefix the original name with a UUID so keys never collide
    const fileKey = `${uuidv4()}-${file.originalname}`;

    const s3Params = {
        Bucket: process.env.S3_BUCKET,
        Key: fileKey,
        Body: fs.createReadStream(file.path), // Stream the uploaded file from disk
        ContentType: file.mimetype
    };

    s3.upload(s3Params, (err, data) => {
        // Remove the temporary file Multer wrote to 'uploads/'
        fs.unlink(file.path, () => {});

        if (err) {
            return res.status(500).send(err);
        }

        // Generate presigned URL
        const presignedUrlParams = {
            Bucket: process.env.S3_BUCKET,
            Key: fileKey,
            Expires: 60 * 60 // URL expiration time in seconds (e.g., 1 hour)
        };

        s3.getSignedUrl('getObject', presignedUrlParams, (err, presignedUrl) => {
            if (err) {
                return res.status(500).send(err);
            }

            res.status(200).send({
                message: 'File uploaded successfully',
                url: data.Location,
                presignedUrl: presignedUrl
            });
        });
    });
});

// Start the server on port 3000
app.listen(3000, () => {
    console.log('Server running on port 3000');
});
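The handler above accepts any file. Since security is our focus, you may also want to restrict file types and sizes before anything reaches S3. Below is a minimal sketch of a validation helper; the MIME-type allowlist and the 5 MB limit are illustrative assumptions, not requirements from this guide, so adjust them for your application:

```javascript
// Illustrative allowlist of acceptable MIME types (an assumption for this sketch).
const ALLOWED_MIME_TYPES = new Set(['image/png', 'image/jpeg', 'application/pdf']);

// Returns true when the reported MIME type is on the allowlist.
function isAllowedMimeType(mimetype) {
    return ALLOWED_MIME_TYPES.has(mimetype);
}

// Sketch of wiring the check into Multer via its fileFilter and limits options:
// const upload = multer({
//     dest: 'uploads/',
//     limits: { fileSize: 5 * 1024 * 1024 }, // reject files larger than 5 MB
//     fileFilter: (req, file, cb) => cb(null, isAllowedMimeType(file.mimetype))
// });
```

Keep in mind the client-supplied MIME type can be spoofed, so treat this as a first line of defense rather than a guarantee.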

4. Create a .env file for environment variables:

  • Create a .env file
  • Add the following information
AWS_REGION=us-east-1
S3_BUCKET=my-node-app-bucket-382c-c803-a96a-49f1
  • Make sure that your bucket name is unique. You can use the Online UUID Generator to generate a unique UUID to append to the name.

Step 3: Create the CloudFormation Template

1. Create the CloudFormation template file:

  • Create a s3-file-upload.yml file
  • Add the following code
AWSTemplateFormatVersion: "2010-09-09"

Resources:
  S3Bucket:
    Type: "AWS::S3::Bucket" # Defines an S3 bucket resource
    Properties:
      BucketName: "my-node-app-bucket-382c-c803-a96a-49f1" # Sets the bucket name
      PublicAccessBlockConfiguration: # Configures public access settings for the bucket
        BlockPublicAcls: true # Blocks public ACLs
        BlockPublicPolicy: false # Allows custom bucket policies
        IgnorePublicAcls: true # Ignores public ACLs
        RestrictPublicBuckets: true # Restricts public bucket access
      VersioningConfiguration:
        Status: Enabled # Enables versioning for the bucket

  S3BucketPolicy:
    Type: "AWS::S3::BucketPolicy" # Defines a bucket policy resource
    Properties:
      Bucket: !Ref S3Bucket # References the S3 bucket created above
      PolicyDocument:
        Version: "2012-10-17" # Specifies the version of the policy language
        Statement:
          - Effect: Allow # Allows the specified actions
            Principal: "*" # Applies to all principals (users)
            Action: "s3:GetObject" # Allows the GetObject action
            Resource: !Sub "${S3Bucket.Arn}/*" # Applies to all objects in the bucket
            Condition:
              Bool:
                aws:SecureTransport: "true" # Requires requests to use HTTPS
          - Effect: Allow # Allows the specified actions
            Principal:
              AWS: !Sub "arn:aws:iam::${AWS::AccountId}:role/MyAppRole" # Allows the IAM role to access the bucket
            Action: "s3:PutObject" # Allows the PutObject action
            Resource: !Sub "${S3Bucket.Arn}/*" # Applies to all objects in the bucket

  MyAppIAMRole:
    Type: AWS::IAM::Role # Defines an IAM Role resource
    Properties:
      RoleName: MyAppRole # Assigns a name to the IAM role
      AssumeRolePolicyDocument:
        Version: "2012-10-17" # Specifies the version of the policy language
        Statement:
          - Effect: Allow # Allows the following action
            Principal:
              Service: "ec2.amazonaws.com" # Specifies that the EC2 service can assume this role
            Action: "sts:AssumeRole" # Allows EC2 instances to assume the role
      Policies:
        - PolicyName: S3AccessPolicy # Names the inline policy for the role
          PolicyDocument:
            Version: "2012-10-17" # Specifies the version of the policy language
            Statement:
              - Effect: Allow # Allows the following actions
                Action: "s3:PutObject" # Grants permission to upload objects to S3
                Resource: !Sub "${S3Bucket.Arn}/*" # Applies to all objects in the specified S3 bucket
  • In the above CloudFormation template, we create an S3 bucket whose name includes a UUID suffix to make it globally unique, block public ACLs, and enable versioning. The bucket policy permits object downloads only over HTTPS for secure data transmission and grants the MyAppRole IAM role permission to upload objects. Note that because RestrictPublicBuckets is enabled, anonymous downloads are still restricted; clients access objects through the presigned URLs the application generates.

Step 4: Deploy AWS Resources with CloudFormation

1. Deploy the stack:

  • We will now deploy our AWS CloudFormation stack using the template file we created above.
$ aws cloudformation deploy --template-file s3-file-upload.yml --stack-name S3FileUploadStack --capabilities CAPABILITY_NAMED_IAM
#Waiting for changeset to be created..
#Waiting for stack create/update to complete
#Successfully created/updated stack - S3FileUploadStack
  • Here's a breakdown of each part:

  • aws cloudformation deploy: The AWS CLI command that deploys a CloudFormation stack.
  • --template-file s3-file-upload.yml: Specifies the path to the CloudFormation template file that defines the resources and configurations.
  • --stack-name S3FileUploadStack: Sets the name of the CloudFormation stack to S3FileUploadStack.
  • --capabilities CAPABILITY_NAMED_IAM: Acknowledges that the stack creates IAM resources with custom names, allowing the deployment to create or modify IAM roles and policies.

2. Confirm S3 bucket creation:

  • After deployment, log in to your AWS console and confirm that the S3 bucket has been provisioned by the CloudFormation stack.

Step 5: Run Your Node.js Application

  • Confirm you have set the environment variables in the .env file with the correct S3 bucket name.
  • Start the application:
$ node app.js
#Server running on port 3000

Step 6: Upload a File

1. Use Postman or cURL to send a POST request to http://localhost:3000/upload with a file.

Using Postman

  • Create a new POST request.
  • Set the URL to http://localhost:3000/upload.
  • In the "Body" tab, choose "form-data", add a key named file, and select a file to upload.

Using Curl

$ curl -F "file=@/tmp/files/test-upload.png" http://localhost:3000/upload

2. Receive the file URL in the response, indicating successful upload. The response will be in the following format.

{
    "message": "File uploaded successfully",
    "url": "https://my-node-app-bucket-382c-c803-a96a-49f1.s3.amazonaws.com/87587526-e70b-451d-a405-ceab366016a8-test-upload.png",
    "presignedUrl": "https://my-node-app-bucket-382c-c803-a96a-49f1.s3.amazonaws.com/87587526-e70b-451d-a405-ceab366016a8-test-upload.png?AWSAccessKeyId=AKIATA6LX4GWXV6MH53C&Expires=1736370100&Signature=66dKUQRW%2F47La3MPPngITEhFE%2FA%3D"
}
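The presigned URL embeds its expiry as a Unix timestamp in the Expires query parameter (this is the SigV2-style URL that the v2 SDK produces here). As a small illustration, a client could check whether a URL is still valid before using it; the bucket, key, and credentials in the sample below are placeholders:

```javascript
// Returns true if the presigned URL's Expires timestamp is in the past.
// nowSeconds defaults to the current clock but can be injected for testing.
function isPresignedUrlExpired(presignedUrl, nowSeconds = Math.floor(Date.now() / 1000)) {
    const expires = Number(new URL(presignedUrl).searchParams.get('Expires'));
    return Number.isFinite(expires) && expires < nowSeconds;
}

const sample =
    'https://example-bucket.s3.amazonaws.com/file.png?AWSAccessKeyId=AKIAEXAMPLE&Expires=1736370100&Signature=abc';

// Compare against a fixed clock so the result is deterministic.
console.log(isPresignedUrlExpired(sample, 1736370101)); // → true (one second after expiry)
```

This is purely a client-side convenience check; S3 itself enforces the expiry and will reject expired URLs regardless.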

Conclusion

Mastering secure file uploads to AWS S3 in Node.js with Express and Multer involves more than just basic functionality—it requires a strong focus on security. By integrating Multer for efficient file handling, leveraging AWS S3's robust security features like bucket policies and presigned URLs, and enforcing HTTPS connections, we ensure that uploaded files are securely managed and accessed. This approach not only protects sensitive data but also helps prevent unauthorized access and potential vulnerabilities, making it a reliable and secure solution for modern web applications.
