
Upload to AWS S3 directly from the browser (js aws sdk v3)

Stack

  • Server: ExpressJS (TypeScript)
  • Client: JavaScript
  • Cloud Provider: Amazon AWS

Preface

Traditionally, when uploading files in a single-page app, you'd upload and store the files directly on your backend. With cloud provider services, files can instead be stored on cheaper and faster storage solutions like AWS S3.

But uploading files to our backend just so it can forward them to cloud storage isn't very efficient. How about allowing our frontend users to upload directly to our cloud storage instead?

You'll probably scream "BUT SECURITY!" right now, but there's a pattern called the "valet key pattern", which makes this not only secure but also extremely efficient.

Our example will use Amazon AWS S3's "presigned POST URL" feature, but other cloud providers offer equivalent features, e.g. Azure Blob Storage has "shared access signatures".

The Concept

Instead of uploading our files to our backend, our frontend only sends a message to our backend saying something like "Hey, I'd like to upload a file with this metadata, is that okay?". Our backend can then perform checks on the metadata (like max file size, file type, etc.). If everything is OK, it generates some short-lived credentials that act as a one-time-use token, which the frontend can then use to upload the file directly to the S3 bucket.

The important detail here is that in the message to our backend, we do NOT send the full file. We only send some basic metadata like the media type and size. Our API is not interested in the (heavy) file contents.

(Architecture diagram: the frontend asks the backend for a presigned URL, then uploads the file directly to S3.)
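
To make the exchange concrete, here is a rough sketch (in TypeScript, with illustrative type names that aren't part of any SDK) of the two payloads involved. The field names match what we'll see later in the article.

// Illustrative types only; the names are made up for this sketch.
// 1) What the frontend sends to the backend: metadata, never the file itself.
interface UploadRequest {
  type: string; // media type, e.g. "image/png"
}

// 2) What the backend returns: everything needed to POST the file straight to S3.
interface PresignedPostResponse {
  url: string; // the S3 endpoint to POST the form to
  fields: Record<string, string>; // policy/signature fields to append to the form
}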

AWS S3: Setup

Bucket and API User creation

Head to AWS and create a new S3 bucket, if you don't have one yet. In this article I'll name our bucket "example-bucket".

We need a user for our backend that gives us programmatic access to our S3 bucket. Create one in the IAM (user management) service. I name mine "example-serviceprincipal", give it the "programmatic access" type, and attach the "AmazonS3FullAccess" permission.

Make sure to copy the access key ID and secret access key to a safe place. We'll need them later.

CORS

Back on the S3 service, open the Permissions tab. We need to allow our frontend to access the bucket. In the CORS section, allow GET, POST and PUT requests from our frontend URL http://localhost:8080.

We'll also expose the Location header (make it readable by the browser), because we'll need it later.



[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "POST", "PUT"],
    "AllowedOrigins": ["http://localhost:8080"],
    "ExposeHeaders": ["Location"]
  }
]



Bucket Policy

To keep things organized, create a subfolder called "public" in our bucket. This is where our uploaded files will go, and we'll make this folder publicly readable.



{
  "Version": "2021-07-04",
  "Id": "Policy1625335161483",
  "Statement": [
    {
      "Sid": "AllowPublicRead",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-bucket/public/*"
    }
  ]
}



Bucket Settings

AWS has some additional security measures to make sure data doesn't go public accidentally. For our example I've allowed public access, but blocked any accidental public access granted through new ACLs.

(Screenshot: the bucket's "Block public access" settings.)

Just make sure these settings don't block public access entirely, otherwise the bucket policy we configured above won't have any effect.
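
If you prefer to configure this programmatically rather than in the console, the v3 SDK offers a PutPublicAccessBlockCommand. Here is a rough sketch that mirrors the settings described above; the bucket name and the exact flag values are assumptions based on this example.

// Sketch: configure "Block public access" via the SDK instead of the console.
// The flag values mirror the console settings described above; adjust as needed.
import { S3Client, PutPublicAccessBlockCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "eu-central-1" });

async function configureBucketAccess() {
  await s3.send(
    new PutPublicAccessBlockCommand({
      Bucket: "example-bucket",
      PublicAccessBlockConfiguration: {
        BlockPublicAcls: true, // block public access granted through new ACLs
        IgnorePublicAcls: true, // ignore any existing public ACLs
        BlockPublicPolicy: false, // keep allowing our public-read bucket policy
        RestrictPublicBuckets: false,
      },
    })
  );
}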

Backend: Generate a presigned URL

To keep this article short, let's assume we have a basic Express API that looks something like this:



// server/src/index.ts
import express from "express";
import cors from "cors";

const app = express();
const port = 3000;

app.use(express.json());
app.use(cors());

app.listen(port, () => {
  console.log(`The server is running on http://localhost:${port}`);
});



AWS provides a great JavaScript SDK (v3) that allows us to interact with AWS resources. You can install the required packages with npm.



npm install @aws-sdk/client-s3 @aws-sdk/s3-presigned-post



Look up the access key ID and secret access key we got when creating the AWS IAM user, and put them in our .env file.



# server/.env
AWS_ACCESS_KEY_ID=<VALUE>
AWS_SECRET_ACCESS_KEY=<VALUE>



To interact with the S3 bucket we need to create a client.



// server/src/index.ts
import { S3Client } from "@aws-sdk/client-s3";

const s3Client = new S3Client({ region: "eu-central-1" });
// ...



Make sure you use the correct region for your bucket. Note that we don't need to read the environment variables ourselves here: the AWS SDK picks them up from process.env automatically, as long as they are named exactly AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
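
One caveat: Node doesn't read .env files on its own, so the variables have to end up in process.env somehow. Here is a minimal sketch, assuming you use the dotenv package (installed with npm i dotenv); alternatively you can hand the credentials to the client explicitly.

// server/src/index.ts (sketch): load .env before the SDK reads process.env
import "dotenv/config"; // assumes the dotenv package is installed
import { S3Client } from "@aws-sdk/client-s3";

// Option A: rely on the SDK picking the credentials up from process.env
const s3Client = new S3Client({ region: "eu-central-1" });

// Option B: pass the credentials explicitly
const s3ClientExplicit = new S3Client({
  region: "eu-central-1",
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
  },
});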

We expect our frontend to send a POST request to our API to generate the presigned URL. In this request we'll receive the type of file the user wants to upload, perform some checks (like limiting which file types are allowed), and return the presigned POST URL data.



// server/src/index.ts
// ...
app.post("/upload", async function (req, res) {
  try {
    const type = req.body.type;
    if (!type) {
      return res.status(400).json("invalid request body");
    }
    const data = await generateUploadUrl({ type });
    return res.json(data);
  } catch (e) {
    return res.status(500).json((e as Error).message);
  }
});



Let's implement the generateUploadUrl function. To generate the presigned URL we call createPresignedPost (from the @aws-sdk/s3-presigned-post package) with our S3 client.



// server/src/index.ts
import { createPresignedPost } from "@aws-sdk/s3-presigned-post";
import { v4 as uuid } from "uuid";
// ...
async function generateUploadUrl({ type }: { type: string }) {
  /**
   * We generate a new uuid as the object name, to prevent conflicting filenames.
   * You can install this package with `npm i uuid`
   */
  const name = uuid();
  const expiresInMinutes = 1;
  return await createPresignedPost(s3Client, {
    Bucket: "example-bucket",
    Key: `public/${name}`,
    Expires: expiresInMinutes * 60, // the url will only be valid for 1 minute
    Conditions: [["eq", "$Content-Type", type]],
  });
}



The conditions, combined with all the other details, form a contract that limits what the user can actually upload with this generated URL. If the conditions don't match, the upload is rejected. So you could add file size and other checks here as well (see the sketch below).
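
For example, here is a sketch of what stricter conditions could look like. The content-length-range condition limits the upload size in bytes and starts-with constrains the object key prefix; the function name and the concrete limits are just placeholders for this sketch.

// Sketch: a stricter presigned POST; size limit and prefix are illustrative.
async function generateRestrictedUploadUrl({ type, name }: { type: string; name: string }) {
  return createPresignedPost(s3Client, {
    Bucket: "example-bucket",
    Key: `public/${name}`,
    Expires: 60, // valid for one minute
    Conditions: [
      ["eq", "$Content-Type", type], // only the declared media type
      ["content-length-range", 1, 5 * 1024 * 1024], // between 1 byte and 5 MB
      ["starts-with", "$key", "public/"], // keep uploads inside the public folder
    ],
  });
}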

To test our API and send a request, I use the awesome VS Code extension Thunder Client (thunderclient.io). As you can see, we only send the type as information, not the whole file.



/**
 * POST localhost:3000/upload
 * { "type": "image/png" }
 */
{
  "url": "https://s3.eu-central-1.amazonaws.com/example-bucket",
  "fields": {
    "bucket": "example-bucket",
    "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
    "X-Amz-Credential": "SOMETHINGSOMETHING/123456/eu-central-1/s3/aws4_request",
    "X-Amz-Date": "20210704T104027Z",
    "key": "public/SOME-GUID-KEY",
    "Policy": "AREALLYLONGPOLICYSTRING",
    "X-Amz-Signature": "SIGNATURESTRING"
  }
}



Success! This request now gives us a response with all the required credentials for our presigned post url 😎

Upload a file using the presigned post URL in the browser

To keep it simple we'll be using vanilla JavaScript and HTML here, but you can of course use this method in any framework.

For a quick static server I'll use npx serve -p 8080 in the client folder.

Our form will allow the user to select a file and submit it for upload.



<!-- client/index.html -->
<form enctype="multipart/form-data" id="uploadForm">
  <label for="file">File:</label>
  <input type="file" name="file" required /> <br />
  <input type="submit" name="submit" value="Upload to Amazon S3" />
</form>

<script src="index.js"></script>



In our index.js file on the client, we can now make the api call from above via the browser.



// client/index.js
const uploadForm = document.getElementById("uploadForm");

uploadForm.addEventListener("submit", async function (event) {
  event.preventDefault();
  const file = event.target.elements.file.files[0];
  const presignedPost = await requestPresignedPost(file);
  console.log(presignedPost);
});

async function requestPresignedPost(file) {
  const { type } = file;
  const res = await window.fetch("http://localhost:3000/upload", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      type,
    }),
  });
  return res.json();
}



This should now give us the same response we got when using thunderclient.

| If you run into CORS issues, make sure you didn't skip the AWS S3 setup section.

All that's left to do now is upload our file directly to our S3 bucket with the received credentials. We can do this right after getting the presigned URL, so the user doesn't even notice that we're sending two requests to two different services.

The S3 endpoint expects a form upload where, in addition to the file itself, all the credentials from the presigned POST URL are appended as form fields. Note that the file has to be the last field in the form, which is why we append it after the presigned fields.



// client/index.js
uploadForm.addEventListener("submit", async function (event) {
  // ...add the two lines below
  const uploadedFileUrl = await uploadFile(file, presignedPost);
  console.log(uploadedFileUrl);
});

async function uploadFile(file, presignedPost) {
  const formData = new FormData();
  formData.append("Content-Type", file.type);
  Object.entries(presignedPost.fields).forEach(([key, value]) => {
    formData.append(key, value);
  });
  formData.append("file", file);

  const res = await window.fetch(presignedPost.url, {
    method: "POST",
    body: formData,
  });

  const location = res.headers.get("Location"); // get the final url of our uploaded image
  return decodeURIComponent(location);
}



When we now upload an image through the browser and check the console, we should get a URL to our successfully uploaded image.

https://s3.eu-central-1.amazonaws.com/example-bucket/public/some-guid

Open the URL in your browser and you should see your image displayed.
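
If the Location header isn't available for some reason (for example because it isn't exposed via CORS), you can also assemble the object URL yourself from the presigned POST response. A small sketch, typed here for clarity and using the response shape we saw earlier; the helper name is just illustrative.

// Sketch: build the object URL from the presigned POST response instead of
// reading the Location header.
function buildObjectUrl(presignedPost: { url: string; fields: Record<string, string> }) {
  // e.g. "https://s3.eu-central-1.amazonaws.com/example-bucket" + "/" + "public/SOME-GUID-KEY"
  return `${presignedPost.url}/${presignedPost.fields.key}`;
}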

I hope this rundown was clear, but if you have any questions, feel free to comment and message me and I'll try to help you out.
