
CI/CD: Deploy a static website to an AWS S3 bucket through GitHub Actions

Introduction

Welcome to this tutorial on setting up continuous integration and continuous delivery (CI/CD) to deploy a static website to an Amazon S3 bucket directly from your GitHub repository. We'll walk through setting up GitHub Actions in your project, configuring your AWS account, and checking that your website is running smoothly after each deployment.

This configuration might take some time to set up, but it will save you time afterwards! Your website will be updated automatically with one click, just by running the GitHub workflow on your main branch.

This tutorial breaks down into three steps:

  1. GitHub Actions workflow
  2. Set up an AWS Role to connect with GitHub Actions
  3. Set up GitHub Actions secrets and run the workflow

Requirements

To follow along with this tutorial, you will need to set up the following in advance:

Through this guide, you'll gain an in-depth understanding of building, quality checking, and deploying your static website, and finally checking the production site. Let's get started on this exciting journey of automating your website deployment!

GitHub Actions workflow

To kickstart the process, we'll set up GitHub Actions in the project.

  • Add a file at .github/workflows/build-deploy.yml in your project directory.
  • Copy the following content and adapt it to your needs.
name: Build and deploy to S3 bucket

on: workflow_dispatch

permissions:
  id-token: write
  contents: read

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4
      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 18
      - name: Install Dependencies
        run: npm install
      - name: Build and Output to Dist
        run: npm run build
      - name: Upload Build Artifacts
        uses: actions/upload-artifact@v4
        with:
          name: build-artifacts
          path: dist

  quality:
    runs-on: ubuntu-latest
    needs: build
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4
      - name: Download Build Artifacts
        uses: actions/download-artifact@v4
        with:
          name: build-artifacts
          path: dist
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.10"
      - name: Run localhost server
        run: |
          cd dist ; ls -la ;
          python -m http.server 8080 &
      - name: Wait for Local Server to Start
        run: sleep 3
      - name: Check homepage status in localhost
        run: |
          URLs=("http://localhost:8080/" "http://localhost:8080/en/" "http://localhost:8080/es/" "http://localhost:8080/fr/")
          for url in "${URLs[@]}"; do
              response_code=$(curl -w "%{http_code}" -I "$url" -o /dev/null)
              echo -e "e HTTP Status Code for $url: $response_code"
              echo "HTTP Status Code for $url: $response_code"

              if [ "$response_code" -ne 200 ]; then
                  echo -e "::error::Error: Unexpected status code for $url"
                  exit 1
              fi
          done

  aws_deploy:
    runs-on: ubuntu-latest
    needs: [build, quality]
    steps:
      - name: Download Build Artifacts
        uses: actions/download-artifact@v4
        with:
          name: build-artifacts
          path: dist
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: ${{ secrets.AWS_ROLE_ARN }}
          aws-region: ${{ secrets.AWS_REGION }}
      - name: Empty S3 Bucket
        run: aws s3 rm s3://${{ secrets.AWS_S3_BUCKET }} --recursive
      - name: Upload to S3 bucket
        run: |
          ls -la dist/
          aws s3 cp dist/. s3://${{ secrets.AWS_S3_BUCKET }} --recursive
      - name: Clean CloudFront Cache
        run: aws cloudfront create-invalidation --distribution-id ${{ secrets.AWS_CLOUDFRONT_DISTRIBUTION_ID }} --paths "/*"

  healthcheck:
    runs-on: ubuntu-latest
    needs: aws_deploy
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4
      - name: Check homepage status @ https://mywebsite
        run: |
          URLs=("https://rebeca.murillo.link/" "https://rebeca.murillo.link/en/" "https://rebeca.murillo.link/es/" "https://rebeca.murillo.link/fr/")
          for url in "${URLs[@]}"; do
              response_code=$(curl -w "%{http_code}" -I "$url" -o /dev/null)
              echo -e "e HTTP Status Code for $url: $response_code"
              echo "HTTP Status Code for $url: $response_code"

              if [ "$response_code" -ne 200 ]; then
                  echo -e "::error::Error: Unexpected status code for $url"
                  exit 1
              fi
          done

The workflow can be triggered by a manual run, or you can customize it to run on conditions like a push to the main branch. Refer to the GitHub Actions "Events that trigger workflows" documentation.
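
For example, a minimal sketch of the on section if you also wanted the workflow to run on every push to the main branch, while keeping the manual trigger:

on:
  workflow_dispatch:
  push:
    branches:
      - main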

The workflow is partitioned into four different jobs:

  1. Build
  2. Quality check on localhost
  3. S3 bucket deployment
  4. Checking the production site

Let's go through each job in detail.

Note that the permissions block is mandatory for aws-actions/configure-aws-credentials to authenticate!

permissions:
  id-token: write
  contents: read

1. Build Job

The first step is to build the static website source, following your website configuration and framework. In this tutorial, the project is built with the Astro framework, so we run a script to install the Node dependencies and then build the source code.

  • Set up the environment with Node.js and install the dependencies with npm install, using the GitHub Actions setup-node action
  • Build the static website's final sources with the command npm run build; by default, the Astro framework saves the built sources in the dist/ folder.
  • The content of the build destination folder needs to be saved and transferred to the following jobs in the workflow. We do this with the GitHub Action actions/upload-artifact

The subsequent jobs will run the GitHub Actions Download Artifact action to retrieve the dist/ folder content.
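
For reference, the npm run build command maps to the scripts section of the project's package.json. In a typical Astro project it looks roughly like the sketch below; your own scripts may differ:

{
  "scripts": {
    "dev": "astro dev",
    "build": "astro build",
    "preview": "astro preview"
  }
}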

2. Quality Check on localhost

In this step you can also add your project's test command, if you have one, for example npm run test.
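
A hypothetical step for that could look like the sketch below; the npm run test script name is an assumption, so adapt it to whatever your project defines:

      - name: Run Tests
        run: npm run test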

The second job runs a quality check. For our static website example, the quality check consists of hosting the website locally and calling the homepage, as well as any other important pages.

  • The source files from the dist/ folder built in the first job are retrieved with the GitHub Actions Download Artifact action
  • A local server hosting the static website content is started, using the GitHub Actions setup-python action
  • Some of the website pages are checked to confirm they respond as expected. If any page does not respond properly (200 status), exit 1 is triggered, causing the GitHub workflow to fail. This prevents the deployment of the source code to our S3 bucket.

3. S3 Bucket Deployment

After checking that the generated code is running correctly, the third job deploys the source code to the AWS S3 bucket.

  • The source files from the dist/ folder built in the first job are retrieved with the GitHub Actions Download Artifact action
  • The AWS configure-aws-credentials GitHub Action allows the connection to the AWS S3 bucket through an AWS Role. The configuration of this role is explained in the next chapter
  • The AWS bucket hosting the static website is updated by first deleting the old content and then uploading the updated source files (an alternative using aws s3 sync is sketched after this list)
  • Finally, the CloudFront cache is invalidated, as it may not immediately reflect the website changes otherwise.
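
As a side note, emptying the bucket and re-uploading everything could also be replaced by a single aws s3 sync call, which only transfers changed files and removes deleted ones. A minimal sketch of such a step, using the same secret as above:

      - name: Sync to S3 bucket
        run: aws s3 sync dist/ s3://${{ secrets.AWS_S3_BUCKET }} --delete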

4. Checking the Production Site

The final job repeats the quality check from the second job, but this time on the final website URL. I've set it up to check four pages (the homepage and its English, Spanish, and French versions) to ensure that the site is up and running after deployment.

In the next section, we will look into AWS configurations necessary for this CI/CD setup.

Set up an AWS Role to connect with GitHub Actions

Moving on, we are now going to set up our AWS account to connect it with GitHub Actions. To accomplish this, we need to create a role in AWS. This role will be used to configure the credentials in the GitHub Actions workflow.

Refer to the AWS documentation for more details about this process: Use IAM roles to connect GitHub Actions to actions in AWS

1. Create an OIDC provider for GitHub

To start off, we need to create an identity provider for GitHub.

  • In the AWS Console, go to Identity and Access Management (IAM), under Access Management > Identity Providers
  • Click on 'Add provider':
    • Provider type: OpenID Connect
    • Provider URL: token.actions.githubusercontent.com
    • Audience: sts.amazonaws.com

Once we have our identity provider established, we can proceed to create the role.

2. Create the AWS Role

  • In the AWS Console, go to Identity and Access Management (IAM), under Access Management > 'Roles'
  • Click to create a new role:
    • Entity type: Web identity.
    • Select the identity provider we created in the previous step. The audience will be the default one.
    • The GitHub organization is essentially your GitHub account where the project is hosted.
    • The name of your GitHub repository
    • The branch in your GitHub repository that will be allowed to run the GitHub Actions workflow
  • In the Permissions Policies step, set up the permissions according to the actions that will be executed by the GitHub Actions workflow (a sketch of such a policy is shown after this list). For this tutorial, the permissions required are:
    • AWS S3 bucket management: to delete and upload content in the bucket
    • AWS CloudFront management: to invalidate the CDN cache
  • Save the new role
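
For illustration, a permissions policy scoped to what this workflow actually does might look like the following sketch. The bucket name, account ID, and distribution ID are placeholders to replace with your own values, and you may want to adjust the actions to your setup:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    },
    {
      "Effect": "Allow",
      "Action": ["cloudfront:CreateInvalidation"],
      "Resource": "arn:aws:cloudfront::YOUR_ACCOUNT_ID:distribution/YOUR_DISTRIBUTION_ID"
    }
  ]
}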

Upon creating the role, open it to copy the ARN (Amazon Resource Name), which is needed in the next step for setting up the secrets in GitHub.

In the role's trust policy, you can verify important information regarding your GitHub repository permissions and the allowed branch, and adapt it to your use case.
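
For reference, the trust policy generated for this setup typically looks like the sketch below; the account ID, GitHub user or organization, repository, and branch are placeholders to replace with your own:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Federated": "arn:aws:iam::YOUR_ACCOUNT_ID:oidc-provider/token.actions.githubusercontent.com"
      },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringEquals": {
          "token.actions.githubusercontent.com:aud": "sts.amazonaws.com",
          "token.actions.githubusercontent.com:sub": "repo:YOUR_GITHUB_USERNAME/YOUR_REPOSITORY:ref:refs/heads/main"
        }
      }
    }
  ]
}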

Set up GitHub Actions secrets and run the workflow

Back in your GitHub project, set up the required secrets for the workflow.

  1. Go to your GitHub project settings,
  2. Then to 'Actions' under 'Secrets and variables'
  3. Finally, create a secret for each variable with 'New repository secret'. In this tutorial, the secrets used in the workflow include:
    • AWS_ROLE_ARN: the ARN from the AWS role created in the previous step
    • AWS_REGION: the AWS region for the bucket
    • AWS_S3_BUCKET: AWS S3 bucket name
    • AWS_CLOUDFRONT_DISTRIBUTION_ID: the AWS CloudFront distribution ID
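
If you prefer the command line, the same secrets can also be created with the GitHub CLI, assuming gh is installed and authenticated in the repository. The values below are placeholders to replace with your own:

gh secret set AWS_ROLE_ARN --body "arn:aws:iam::YOUR_ACCOUNT_ID:role/YOUR_ROLE_NAME"
gh secret set AWS_REGION --body "eu-west-1"
gh secret set AWS_S3_BUCKET --body "your-bucket-name"
gh secret set AWS_CLOUDFRONT_DISTRIBUTION_ID --body "YOUR_DISTRIBUTION_ID"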

With all these configurations in place, your AWS account is now set up and ready to be connected with your GitHub Actions.

To run the workflow, navigate to the 'Actions' tab in the GitHub repository, select the workflow and run it from the corresponding branch.

Once the workflow has started, you can monitor its progress. It will sequentially execute the four jobs we outlined earlier: build, quality check on localhost, S3 bucket deployment, and checking the production site.

Note that choosing the wrong branch will cause the workflow to fail, as the branch must be allowed in the AWS role's trust policy.

I hope this tutorial gives you a clear understanding of how the AWS role and GitHub Actions work together to ensure secure and controlled deployments of your website. The process is now fully automated, and the permissions are set up to prevent unauthorized deployments, ensuring that your website remains secure and updated with only one click!
