Ryan Bethel

Posted on • Edited on • Originally published at ryanbethel.org

Uploading user images to Google Cloud Storage

Using signed URLs with Google Cloud Storage to simplify storing user assets.

There are many ways to store user images from a website or web app, and each has its own pain points. I am building a React web app with GatsbyJS connected to a GraphQL backend API. It requires users to upload images of equipment for a personal inventory system. Storing images in a database is not ideal, and uploading images through GraphQL is problematic. I will show how to use signed URLs to let users upload images directly to Google Cloud Storage without proxying them through the backend server.

TL;DR

The solution is to request a signed URL from Google on your backend server and pass it to your frontend. The URL is requested as soon as a user opens the form to enter a new item in the database. When the user submits the form, the selected image is uploaded directly to Google Cloud Storage using the signed URL, and a GraphQL mutation is sent to the backend to update the database. This lets your frontend upload directly to Google Cloud Storage securely without authenticating. Since the image is much larger than the other data going to the database, most of the bandwidth of the transaction never hits your backend.

Google Cloud Storage Signed URLs

Every cloud provider has its own version of cloud storage. On top of the major players, there are dozens of SaaS services that add a convenient layer for image upload, transformation, and CDN delivery. I chose Google because it is relatively cheap, has good documentation, and offers many other supporting services that combine well with the storage. You can get images into GCS through the CLI, the web console, or by having users authenticate to Google with the correct permissions. Each of these adds friction for the user or requires you to proxy images through the backend to the cloud storage bucket.

Google also provides signed URLs, which allow anyone holding the URL to upload a file to a cloud storage bucket for a specified period of time. There are two ways to generate these signed URLs. You can generate them yourself in your code using a key connected to your GCS account. The second method, used here, is to request the signed URL through Google's Cloud Storage client library.

const { Storage } = require('@google-cloud/storage');

// Returns a v4 signed URL that allows a PUT upload to
// bucketName/filename until the expiration time passes.
async function gcsSignedUrl(bucketName, filename, minutesToExpiration) {
  const storage = new Storage();

  const options = {
    version: 'v4',
    action: 'write', // permit uploads, not reads
    expires: Date.now() + minutesToExpiration * 60 * 1000,
  };

  const [url] = await storage
    .bucket(bucketName)
    .file(filename)
    .getSignedUrl(options);

  return url;
}

The URL returned from this function is signed using a key from your Google account, so it requires no extra authentication to upload to the storage bucket. The filename you specify is embedded in the URL. It does not have to match the name of the file the user chose on their local system; it is the name the object will have in the bucket. Be aware that until the URL expires, it can be used for multiple uploads, each overwriting the last.
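If you want to constrain what the URL can be used for, the signing options can also pin a `contentType`; the client's PUT must then send a matching Content-Type header or Cloud Storage rejects the request. A minimal sketch of building such options (the helper name is my own, not from the post):

```javascript
// Build v4 signed-URL options for an upload that expires soon and
// (optionally) must carry a fixed Content-Type. Hypothetical helper.
function buildWriteOptions(minutesToExpiration, contentType) {
  const options = {
    version: 'v4',
    action: 'write',
    expires: Date.now() + minutesToExpiration * 60 * 1000,
  };
  if (contentType) {
    // When set, the upload's Content-Type header must match exactly.
    options.contentType = contentType;
  }
  return options;
}
```

Pinning the content type narrows the window for abuse a little, since the URL can then only be used to upload that kind of object.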

Naming the file with UUID

In this scheme we want to ensure there are no naming collisions with other files. Our GraphQL backend creates a UUID to use as the name for each file. This guarantees each file has a unique name, which is stored in the database to link to the data for that image.

Passing the Signed URL to the frontend

The frontend uses a form to collect the data to store in the GraphQL backend database. As soon as the form is opened, the page sends a query to the backend requesting a signed URL. As an example, my query looks like this:

const SIGNED_URL_QUERY = gql`
  query {
    signedurl {
      filename
      url
    }
  }
`;

Coordinating the image upload and database update

As soon as the add-item page is opened, the query above is triggered to return the URL and filename. A form collects the data, stored as values, and a separate input with type="file" selects the image to upload. The selected file is stored in React local state as selectedFile. On submission, the GraphQL mutation addItemMutation is sent to the backend with the form values and signedUrl.filename. Then uploadHandler is triggered, issuing a PUT request that sends selectedFile to Google Cloud Storage via signedUrl.url.

const [addItemMutation, { data, error, loading }] = useMutation(ADD_ITEM_MUTATION);
const [selectedFile, setSelectedFile] = useState(null);

const onHandleChange = e => {
  setSelectedFile(e.target.files[0]);
  return false;
};

const uploadHandler = async url => {
  try {
    const response = await fetch(url, {
      method: 'PUT',
      body: selectedFile,
    });
    if (!response.ok) {
      throw new Error(`Upload failed with status ${response.status}`);
    }
  } catch (error) {
    console.error('Error:', error);
  }
};

const onFormSubmit = async values => {
  // simplified for clarity
  try {
    // 1) send data to backend
    await addItemMutation({
      variables: { ...values, filename: signedurl.filename },
    });
    // 2) upload file to Google Cloud Storage
    await uploadHandler(signedurl.url);
    // 3) on success close form modal
  } catch (error) {
    // 4) handle errors
    console.error('Error:', error);
  }
};

<form onSubmit={onFormSubmit}>
  // form data = values
</form>

<div>
  <label htmlFor="inventoryPicture">Choose file to upload</label>
  <input type="file" name="inventoryPicture" accept=".jpg" onChange={onHandleChange} />
</div>

Once this executes, the file is in cloud storage and the UUID name of the file is in the database to reference it.

Using the images

Once uploaded the images can be accessed in many different ways. Here are just a couple options depending on your use case:

  1. Public URL - The cloud storage bucket can be configured so that the images are accessible through a URL with no authentication. They could be referenced by your site or web app or by anyone.
  2. Authenticated access through signed URL - If your app requires the images to be available only to authenticated users you can use a similar signed URL scheme to provide references to the images that expire when the user session expires.
  3. Cloud Function processing - In my case I want to perform some processing on images once they are uploaded, before the frontend references them. I have configured a Google Cloud Function that is triggered when an image is uploaded to the bucket. This function creates a whole set of resized images for use as a srcset for responsive images. It takes an uploaded image that can be 3 to 5 MB and creates five sizes, from a thumbnail all the way up to a full-screen image. Then when my frontend references those images it can pick the appropriate size (e.g. 960w/filename.jpg). The cloud function also uses Google's Cloud Vision API to detect and flag inappropriate images so they are never served.
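As a rough idea of the resizing step, the set of target widths for the srcset can be derived like this. The breakpoints here are my own example, not the ones from my actual Cloud Function:

```javascript
// Example breakpoints for a responsive srcset, thumbnail to full screen.
const BREAKPOINTS = [160, 320, 640, 960, 1920];

// Only generate sizes at or below the original width (never upscale).
function targetWidths(originalWidth) {
  return BREAKPOINTS.filter(w => w <= originalWidth);
}

// Build srcset-style object names like "960w/filename.jpg".
function srcsetNames(filename, originalWidth) {
  return targetWidths(originalWidth).map(w => `${w}w/${filename}`);
}
```

The Cloud Function would then run the actual image resizing once per width and write each result under the corresponding prefix.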

Don't forget CORS

If it doesn't work at first, it may be a CORS problem. You need to configure your storage bucket to accept cross-origin requests from your frontend's origin. CORS can be set for your bucket either in the Google Cloud Console or from the command line. Here is a sample configuration: save the JSON as cors-json-file.json and apply it with gsutil.

// contents of cors-json-file.json
[
    {
      "origin": ["https://your-frontend.com"],
      "responseHeader": ["Content-Type"],
      "method": ["PUT"],
      "maxAgeSeconds": 3600
    }
]

gsutil cors set cors-json-file.json gs://example-bucket
