Arsalan Ahmed Yaldram

Deploy React SPA to AWS S3 using Github Actions

Introduction

In this tutorial I will continue working on the React component library we bootstrapped with Vite in the previous post, where we also set up Storybook for testing our components. We will now deploy that Storybook build to AWS S3 using GitHub Actions. If you are new here, I highly recommend reading my previous tutorial on setting up a React component library first. You can follow along even if you are building a plain React SPA or using another SPA framework. This is not a beginner post: some familiarity with GitHub Actions, AWS S3, and the AWS CLI is expected.

My Deployment Workflow

  • I have 2 branches, master & dev. I create new feature and fix branches from the dev branch; dev is where all the feature and fix branches are merged and tested.
  • I have created 2 buckets with public access and static website hosting enabled. One bucket holds our production code from the master branch. This is our public-facing production site; we could put CloudFront in front of this bucket and attach a domain name using AWS Route 53.
  • The second bucket holds code from the dev branch. It contains all the pre-release features and fixes that we test before merging into master and shipping to our users.

  • Merging code into the dev branch will trigger a workflow that publishes the latest code to the dev bucket.

  • Once a number of feature and fix branches have been merged and tested on dev, we create a Pull Request, this time with master as the base branch. We can call this our release pull request. Once we are done with tests and other sanity checks, we merge the release pull request into master.

  • Merging code into the master branch will trigger a workflow that publishes the latest code to the master bucket, our public-facing site, meaning we finally ship those features to our users.

  • We will create another workflow for Preview Deployments. For every new Pull Request, this workflow will create an S3 bucket, deploy our SPA, and add a comment on the Pull Request with the preview link.

  • This workflow will run on each new push to our branch / Pull Request and create another Deploy Preview.

  • For the Deploy Previews we will create the S3 bucket on the fly using the AWS CLI and add the necessary policies and permissions.

Step One: Creating a public S3 bucket

  • First, we need to create 2 buckets: one will be the public-facing bucket holding our production deployment, the second will be our dev bucket.
  • For the sake of this demo I created both buckets in the same AWS account; you might create them in separate accounts or follow whatever best practice you prefer. Don't forget to share your methods in the comments :).

  • Head on to the AWS Console and search for S3. On the S3 homepage click the Create bucket button.

  • Under the General configuration tab give the bucket a name, say react-vite-lib-prod.

  • Under the Block Public Access settings, uncheck the Block all public access checkbox.

  • Then scroll down to the bottom of the page and click on the Create bucket button.

  • Now, from the bucket list, click the link for our newly created bucket. Head to the Permissions tab, scroll down to the Bucket policy section, click the Edit button, paste the following into the editor (replacing bucket-name with your bucket's name), and hit Save changes -

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::bucket-name/*"
        }
    ]
}
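If you prefer the CLI over the console, the same policy can be applied with a command like this - a sketch, assuming the policy above is saved as policy.json with bucket-name replaced by your actual bucket name:

```shell
# Apply the public-read bucket policy from a local file.
# Assumes: AWS CLI configured, policy saved as policy.json next to you.
aws s3api put-bucket-policy \
  --bucket react-vite-lib-prod \
  --policy file://policy.json
```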
  • Now click on the Properties tab and scroll all the way down to the Static website hosting section and click the Edit button. Select the Enable option, and under Index document type index.html. You can leave the Error document empty for a Storybook build; for a client-routed SPA you may instead want to point the Error document at index.html, so that deep links are served to the client-side router instead of S3's 404 page. Finally click Save changes.
  • So, we have now created our prod bucket; follow the same steps above to create the dev bucket.

Step Two: Create a Github workflow to deploy dev and master

  • First things first, create a new branch. In my case I am calling it ci/release-workflow.
  • In the root of our project create a new folder .github, and under it another folder workflows.
  • Then we need to add our AWS_ACCESS_KEY_ID & AWS_SECRET_ACCESS_KEY to GitHub secrets.
  • On the GitHub page for the repository navigate to the Settings tab, scroll down to the Secrets section, and select Actions.
  • Now click New repository secret and add the AWS_ACCESS_KEY_ID & AWS_SECRET_ACCESS_KEY.
  • Under .github/workflows create a file deploy-release.yml and paste the following -

name: Deploy to S3

# Controls when the workflow will run
on:
  # Triggers the workflow when we merge code into the master & dev branches
  push:
    branches: 
      - master
      - dev

jobs:
  # This workflow contains a single job called "build"
  build:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest

    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Checks-out our repository
      - uses: actions/checkout@v3
      # Use the node action to use yarn
      - uses: actions/setup-node@v2
        with: 
          node-version: '16.x'

      # Configure AWS CLI
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ap-south-1

      - name: Install Dependencies
        run: yarn install

      - name: Build Storybook
        run: yarn run build-storybook

      - name: Upload storybook build to the dev S3 bucket
        if: github.ref == 'refs/heads/dev'
        run: aws s3 sync ./storybook-static/ s3://react-vite-lib-dev --delete

      - name: Upload storybook build to the master S3 bucket
        if: github.ref == 'refs/heads/master'
        run: aws s3 sync ./storybook-static/ s3://react-vite-lib-prod --delete    

Step Three: Workflow Explanation

  • Our workflow will run when we push or merge code in the master & dev branches.
on:
  push:
    branches: 
      - master
      - dev
  • Under the jobs section we have defined a single job called build, along with the type of OS runner it runs on.
jobs:
  build:
    runs-on: ubuntu-latest
  • Steps represent a sequence of tasks that our job will execute.
  • In the first step we check out our code using the following action -
- uses: actions/checkout@v3
  • Because I use yarn locally as my package manager, I want to keep using it in the workflow. For that we use the following action -
- uses: actions/setup-node@v2
  with: 
    node-version: '16.x'
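As a small optimization - an assumption worth verifying against your setup-node version, not part of the original workflow - the action can also cache yarn's download cache between runs, which speeds up yarn install:

```yaml
# Sketch: setup-node with dependency caching enabled.
# The cache input expects a lockfile (yarn.lock) in the repository root.
- uses: actions/setup-node@v2
  with:
    node-version: '16.x'
    cache: 'yarn'
```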
  • To use the AWS CLI we use the following action and pass in our secrets -
- name: Configure AWS Credentials
  uses: aws-actions/configure-aws-credentials@v1
  with:
    aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
    aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    aws-region: ap-south-1
  • In the next 2 steps we install our dependencies and build the project -
- name: Install Dependencies
  run: yarn install

- name: Build Storybook
  run: yarn run build-storybook
  • If we have a testing setup, we can run the test scripts just before the build.
  • Now we upload the build artifact to the S3 bucket. I used GitHub Actions if conditions: if we push code to the master branch it uploads the build artifact to the production bucket, and if we push to the dev branch it uploads to the dev bucket.
- name: Upload storybook build to the dev S3 bucket
  if: github.ref == 'refs/heads/dev'
  run: aws s3 sync ./storybook-static/ s3://react-vite-lib-dev --delete

- name: Upload storybook build to the master S3 bucket
  if: github.ref == 'refs/heads/master'
  run: aws s3 sync ./storybook-static/ s3://react-vite-lib-prod --delete
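A refinement worth considering on top of the plain sync - an assumption about typical Vite/Storybook output, not part of the original workflow - is setting Cache-Control headers, so hashed assets are cached aggressively while index.html is always fetched fresh:

```shell
# Upload hashed assets with a long cache lifetime, excluding HTML files.
aws s3 sync ./storybook-static/ s3://react-vite-lib-prod \
  --delete \
  --exclude "*.html" \
  --cache-control "public, max-age=31536000"

# Upload index.html separately with no-cache, so new deploys are picked up immediately.
aws s3 cp ./storybook-static/index.html s3://react-vite-lib-prod/index.html \
  --cache-control "no-cache"
```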
  • I used a single workflow for the sake of this demo, and because our workflow is simple - it only deploys Storybook. You are free to create 2 separate workflows.

Step Four: Testing the Release Workflows

  • Now commit the code, push this branch, and create a new Pull Request with dev as the base branch. Then merge this Pull Request into dev. This should trigger our workflow.
  • Head over to the Actions tab and you will see our workflow. Notice that because we merged our code into dev, only the Upload storybook build to the dev S3 bucket step ran.

  • Now head over to the AWS S3 console, choose our dev S3 bucket, and under the Properties tab navigate down to the Static website hosting section. Click the link to see our Storybook in action.

  • Similarly, we will create a new pull request, this time with master as the base branch. Merging it will trigger the workflow again, and this time the build will be uploaded to the master / production bucket.

Step Five: Preview Workflow

  • Let's first create a new branch, ci/preview-workflow. Under the folder .github/workflows create a new file deploy-preview.yml and paste the following -

name: Deploy Storybook Preview to S3

# Controls when the workflow will run
on:
  # Triggers the workflow on pull request events for dev and master branches
  pull_request:
    branches: ["dev", "master"]

jobs:
  # This workflow contains a single job called "build"
  build:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest

    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Checks-out our repository
      - uses: actions/checkout@v3
      # Use the node action to use yarn
      - uses: actions/setup-node@v2
        with: 
          node-version: '16.x'

      # Configure AWS CLI
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ap-south-1

      # Load env variables where we can access pr number, commit id
      - name: Load environment variables
        uses: FranzDiebold/github-env-vars-action@v2

      - name: Install Dependencies
        run: yarn install

      - name: Build Storybook
        run: yarn run build-storybook

      - name: Create a S3 Bucket
        run: aws s3api create-bucket --bucket $CI_HEAD_REF_SLUG-$CI_PR_SHA_SHORT --region ap-south-1 --create-bucket-configuration LocationConstraint=ap-south-1

      - name: Make the S3 bucket public
        run: aws s3api put-bucket-acl --bucket $CI_HEAD_REF_SLUG-$CI_PR_SHA_SHORT --acl public-read

      - name: Add Public accessibility to Bucket Objects
        run: |
          aws s3api put-bucket-policy --bucket $CI_HEAD_REF_SLUG-$CI_PR_SHA_SHORT --policy '{
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Sid": "PublicReadGetObject",
                    "Effect": "Allow",
                    "Principal": "*",
                    "Action": "s3:GetObject",
                    "Resource": "arn:aws:s3:::${{ env.CI_HEAD_REF_SLUG }}-${{ env.CI_PR_SHA_SHORT }}/*"
                }
            ]
          }'

      - name: S3 Bucket enable website hosting
        run: aws s3 website s3://$CI_HEAD_REF_SLUG-$CI_PR_SHA_SHORT/ --index-document index.html

      - name: Upload storybook build to the S3 bucket
        run: aws s3 sync ./storybook-static/ s3://$CI_HEAD_REF_SLUG-$CI_PR_SHA_SHORT --delete

      - name: Comment the website Link on the Pull Request 
        uses: peter-evans/create-or-update-comment@v2
        with: 
          issue-number: ${{ github.event.pull_request.number }}
          body: |
            Storybook is deployed successfully please visit the following link
            http://${{ env.CI_HEAD_REF_SLUG }}-${{ env.CI_PR_SHA_SHORT }}.s3-website.ap-south-1.amazonaws.com
          reactions: 'rocket'

Step Six: Explaining the Preview Workflow

  • This workflow is triggered when we create a Pull Request or push new code to it. It creates an S3 bucket with the necessary permissions on the fly, uploads the build artifacts, and leaves a comment on the Pull Request with a link to the Deploy Preview.
  • Some steps are similar to the Release Workflow.
  • The following action is used to load environment variables into our action - branch name, pull request number, commit id, etc.
- name: Load environment variables
  uses: FranzDiebold/github-env-vars-action@v2
  • When we create an S3 bucket we have to give it a name. I build the bucket name as branchName-commitId - the branch name slug plus the first 8 characters of the commit id - so that the bucket name is unique.
- name: Create a S3 Bucket
  run: aws s3api create-bucket --bucket $CI_HEAD_REF_SLUG-$CI_PR_SHA_SHORT --region ap-south-1 --create-bucket-configuration LocationConstraint=ap-south-1
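S3 bucket names must be lowercase, limited to letters, digits and hyphens, and 3-63 characters long, which is why the slugged variables matter. Here is a rough sketch of how such a slug could be derived - an assumption for illustration, not the exact algorithm of github-env-vars-action:

```shell
# Sketch: turn a branch name + short commit sha into a valid S3 bucket name.
branch="feat/My_New-Button"   # hypothetical branch name
sha_short="a1b2c3d4"          # hypothetical first 8 chars of the commit sha

# Lowercase, replace anything that is not [a-z0-9-] with a hyphen.
slug=$(echo "$branch" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9-]/-/g')

# Append the sha and trim to the 63-character S3 limit.
bucket=$(echo "${slug}-${sha_short}" | cut -c1-63)
echo "$bucket"   # feat-my-new-button-a1b2c3d4
```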
  • Next, using the AWS CLI, we make the bucket public -
- name: Make the S3 bucket public
  run: aws s3api put-bucket-acl --bucket $CI_HEAD_REF_SLUG-$CI_PR_SHA_SHORT --acl public-read
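One caveat worth flagging - an assumption based on AWS defaults introduced after this post, so verify against your account: buckets created since April 2023 have Block Public Access enabled and ACLs disabled, so the put-bucket-acl call above may fail. In that case the bucket's defaults have to be relaxed first:

```shell
# Sketch: relax the new-bucket defaults so the public-read ACL can be applied.
# Uses the same bucket-name variables as the workflow above.
aws s3api put-public-access-block \
  --bucket "$CI_HEAD_REF_SLUG-$CI_PR_SHA_SHORT" \
  --public-access-block-configuration \
    "BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false"

# Re-enable ACLs by setting object ownership back to ObjectWriter.
aws s3api put-bucket-ownership-controls \
  --bucket "$CI_HEAD_REF_SLUG-$CI_PR_SHA_SHORT" \
  --ownership-controls "Rules=[{ObjectOwnership=ObjectWriter}]"
```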
  • Then we add our bucket policy to make all objects in the bucket publicly accessible -
- name: Add Public accessibility to Buck Objects
  run: |
          aws s3api put-bucket-policy --bucket $CI_HEAD_REF_SLUG-$CI_PR_SHA_SHORT --policy '{
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Sid": "PublicReadGetObject",
                    "Effect": "Allow",
                    "Principal": "*",
                    "Action": "s3:GetObject",
                    "Resource": "arn:aws:s3:::${{ env.CI_HEAD_REF_SLUG }}-${{ env.CI_PR_SHA_SHORT }}/*"
                }
            ]
    }'
  • Then we enable static hosting for our bucket -
- name: S3 Bucket enable website hosting
  run: aws s3 website s3://$CI_HEAD_REF_SLUG-$CI_PR_SHA_SHORT/ --index-document index.html
  • Then we upload the build artifact to the bucket -
- name: Upload storybook build to the S3 bucket
  run: aws s3 sync ./storybook-static/ s3://$CI_HEAD_REF_SLUG-$CI_PR_SHA_SHORT --delete
  • Finally, we use the following action to comment the deploy preview link on the Pull Request. We have to provide the pull request number to this action -
- name: Comment the website Link on the Pull Request
  uses: peter-evans/create-or-update-comment@v2
  with:
    issue-number: ${{ github.event.pull_request.number }}
    body: |
      Storybook is deployed successfully please visit the following link
      http://${{ env.CI_HEAD_REF_SLUG }}-${{ env.CI_PR_SHA_SHORT }}.s3-website.ap-south-1.amazonaws.com
    reactions: 'rocket'

Step Seven: Test the Preview Workflow

  • Now commit the code, push this branch, and create a new Pull Request with dev as the base branch. This will kick off the Preview workflow.

  • When the workflow completes successfully you can check the Deploy Preview comment on the Pull Request.
  • Visit the link from the Deploy Preview comment. Also try changing the code and pushing another commit to this Pull Request - you will see the workflow trigger again, and when it finishes you will get a new Deploy Preview link.

Summary

  • Would I use this workflow for my projects, and if yes, how often? Well, it depends; it all boils down to using the right tool for the job.
  • For my use case I would use this workflow for small POCs and Storybook deployments. That said, Vercel and Netlify do an awesome job of handling all the things we did manually here.
  • If you are heavily invested in AWS, then Amplify static hosting does all of this for us. If you are on Azure, Azure Static Web Apps does an awesome job.

  • What more features can we add to our current setup? We could create a dashboard with a list of our deployments, using a no-code / low-code tool like Appsmith, for instance.

  • We can clean up our Deploy Preview S3 buckets by adding a lifecycle policy, add useful tags to the preview buckets, etc.
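An alternative to a lifecycle policy - a sketch under the assumption that you want to tear down the preview for the latest commit as soon as its Pull Request closes - is a small cleanup workflow. Note that it only removes the bucket for the most recent commit; buckets from earlier pushes would still need a lifecycle rule or a list-and-delete by prefix:

```yaml
# Sketch: delete the preview bucket when its Pull Request is closed.
name: Delete Preview Bucket
on:
  pull_request:
    types: [closed]

jobs:
  cleanup:
    runs-on: ubuntu-latest
    steps:
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ap-south-1
      - name: Load environment variables
        uses: FranzDiebold/github-env-vars-action@v2
      - name: Delete the preview bucket and all its objects
        run: aws s3 rb s3://$CI_HEAD_REF_SLUG-$CI_PR_SHA_SHORT --force
```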

  • I would also create an API using Lambda to query all this information, and pull information from GitHub on the workflows, deployments, etc.

  • Another important thing to consider when evaluating the various platforms is pricing.

I hope you found this tutorial useful. All the code for this tutorial can be found here. The main goal of this tutorial was to share this S3 workflow with the community. Please share your feedback; constructive feedback will be highly appreciated. Feel free to ask any doubts / queries that you may have. Until next time, PEACE.

Top comments (2)

Diogo Goncalves

have a look at aws amplify it does that for you already 😀

Arsalan Ahmed Yaldram

Hey, yes, I have shared other hosting platforms in my summary. I have this habit of asking how something works and what all is involved. We use Vercel at our company, but I wanted to do all of this on my own.