
Kaye Alvarado for AWS Community Builders


Deploying a Static Website Infrastructure with GitHub Actions

Introduction to "Infrastructure as Code"

This is a quick tutorial on setting up a secure static website in AWS using Terraform (video). I previously talked about a few AWS services (a basic S3 setup and the AWS CI/CD services) in that video to explore how to do this. The full article where I discussed the step-by-step process can also be found here.

In that previous session, you'll learn how to set up a full static website (a React.js boilerplate) on an S3 architecture and add a pipeline that continuously deploys your website code to the S3 bucket.

To extend on that, this article focuses on using "Infrastructure as Code" to continuously deploy the architecture across different environments, or for any use case that needs infrastructure deployed in a repeatable manner.

HashiCorp's Definition

Infrastructure as code (IaC) tools allow you to manage infrastructure with configuration files rather than through a graphical user interface. IaC allows you to build, change, and manage your infrastructure in a safe, consistent, and repeatable way by defining resource configurations that you can version, reuse, and share.

The Architecture


The AWS architecture for setting this up is straightforward.

  1. S3 is the main component of the architecture: a storage service that hosts the static files for the website, provides security for accessing those files, supports error-handling configuration, and exposes an endpoint for the landing page.
  2. CloudFront is a low-latency content delivery network that caches the files being accessed by clients across different locations.
  3. Route 53 is the Domain Name System (DNS) service of AWS, which can also provide routing policies as needed.
  4. To ensure that the website is secure, an IAM user can be set up to restrict write access to the bucket.
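To make the first two components concrete, here is a minimal Terraform sketch of the S3 + CloudFront portion of this architecture. This is an illustration under assumptions, not the code from my repository: the bucket name and resource labels are placeholders, and it targets the hashicorp/aws v4 provider used later in this article.

```hcl
# Sketch only: bucket name and resource labels are illustrative placeholders.
resource "aws_s3_bucket" "site" {
  bucket = "my-static-website-bucket"
}

# Origin access identity so the bucket is reachable only through CloudFront
resource "aws_cloudfront_origin_access_identity" "site" {
  comment = "Restrict bucket access to CloudFront"
}

resource "aws_cloudfront_distribution" "site" {
  enabled             = true
  default_root_object = "index.html"

  origin {
    domain_name = aws_s3_bucket.site.bucket_regional_domain_name
    origin_id   = "s3-site"

    s3_origin_config {
      origin_access_identity = aws_cloudfront_origin_access_identity.site.cloudfront_access_identity_path
    }
  }

  default_cache_behavior {
    allowed_methods        = ["GET", "HEAD"]
    cached_methods         = ["GET", "HEAD"]
    target_origin_id       = "s3-site"
    viewer_protocol_policy = "redirect-to-https"

    forwarded_values {
      query_string = false
      cookies {
        forward = "none"
      }
    }
  }

  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }

  viewer_certificate {
    cloudfront_default_certificate = true
  }
}
```

A custom domain would swap the default certificate for an ACM certificate, which is covered by the Route 53/ACM note later in this article.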

Prerequisites

  • Terraform is installed
  • AWS CLI is installed
  • An AWS account and credentials that have access to create AWS resources

Setup

Terraform To install Terraform on Windows, I downloaded the binary package provided here, then unzipped it and copied the executable into my C:/Program Files (x86)/Terraform folder. After that, I needed to add this folder to my $PATH variable, which can be found under Control Panel > System > Advanced System Settings > Advanced Tab > Environment Variables. I updated the $PATH variable, then restarted the terminal for the change to take effect. In the command line, I ran the following command to ensure that it is now recognized.

echo $PATH

Then, run the following command to check whether Terraform is successfully installed. It should display the version of the executable.

terraform --version
Terraform v1.2.5
on windows_386

AWS CLI can be installed by downloading and running the MSI package that can be found here. After the installation, verify it by checking the version.

aws --version
aws-cli/2.4.18 Python/3.8.8 Windows/10 exe/AMD64 prompt/off


AWS Account and Credentials You need to create an IAM user that has the required access to create the resources defined in the architecture. To do this, log in to the AWS Management Console and go to the IAM service. For simplicity, you can create an IAM user with administrator access, but ideally you'd want a user with least-privilege permissions. I also talked about the steps for this in this video.
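As a rough illustration of tightening that access, a policy could at least be scoped down to the services this architecture actually uses. This is a hedged sketch, not a verified minimal permission set — a true least-privilege policy would also narrow the actions and resource ARNs:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:*",
        "cloudfront:*",
        "route53:*",
        "acm:*"
      ],
      "Resource": "*"
    }
  ]
}
```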
After setting up the user and saving the AWS Access Key ID and AWS Secret Access Key, you can configure your environment by running the following command:

aws configure
AWS Access Key ID [**********ABCD]: <Input here>
AWS Secret Access Key [**********abcd]: <Input here>
Default region name [ap-southeast-1]: <Input here>
Default output format [None]: 

Then test whether you're able to connect by running the following command:

aws sts get-caller-identity
{
    "UserId": "<UserID>",
    "Account": "<Account>",
    "Arn": "arn:aws:iam::<Account>:user/aws-cli"
}

Terraform Code

You can check out the code from this GitHub Repository. To follow the steps in this section, I would recommend going through this tutorial from HashiCorp on the basics of Terraform for AWS.

  1. From the code, navigate to the static website folder where the main.tf file is:
cd terraformstuff\staticwebsite
  2. Initialize Terraform by running terraform init. From HashiCorp:

The terraform init command is used to initialize a working directory containing Terraform configuration files. This is the first command that should be run after writing a new Terraform configuration or cloning an existing one from version control.

PS C:\Users\Kaye\workspace\terraformstuff\staticwebsite> terraform init
Initializing modules...

Initializing the backend...

Initializing provider plugins...
- Reusing previous version of hashicorp/aws from the dependency lock file
- Using previously-installed hashicorp/aws v4.23.0

Terraform has been successfully initialized!

You may now begin working with Terraform. Try running "terraform plan" to see
any changes that are required for your infrastructure. All Terraform commands
should now work.

If you ever set or change modules or backend configuration for Terraform,
rerun this command to reinitialize your working directory. If you forget, other
commands will detect it and remind you to do so if necessary.
  3. Now that Terraform is initialized, you can run a plan to view the list of resources that will be created.

From HashiCorp:

The terraform plan command evaluates a Terraform configuration to determine the desired state of all the resources it declares, then compares that desired state to the real infrastructure objects being managed with the current working directory and workspace.

  4. The plan can also be viewed when running terraform apply, which additionally gives the option to build the infrastructure in AWS with a yes confirmation.
Plan: 4 to add, 0 to change, 0 to destroy.

Do you want to perform these actions?
  Terraform will perform the actions described above.
  Only 'yes' will be accepted to approve.

  Enter a value: yes
  5. After running the code, you should be able to view the created resources in your AWS Management Console.

Note: As of this writing, the Terraform code I created only builds the CloudFront and S3 resources; I built the Route 53/ACM resources manually. The steps to do this can also be seen in the previous video I shared above.
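If you wanted to bring those pieces into Terraform as well, a sketch could look like the following. This is illustrative only and not code from my repository: the domain name, hosted zone ID, and resource labels are placeholders, it assumes the CloudFront distribution is defined elsewhere in the configuration as aws_cloudfront_distribution.site, and it assumes a provider alias for us-east-1 (where CloudFront requires its ACM certificates to live):

```hcl
# Illustrative sketch: certificate for the site domain, validated via DNS.
# Assumes "aws.us_east_1" is an aliased provider pointed at us-east-1.
resource "aws_acm_certificate" "site" {
  provider          = aws.us_east_1
  domain_name       = "www.example.com"
  validation_method = "DNS"
}

# Alias record routing the domain to the CloudFront distribution
resource "aws_route53_record" "site" {
  zone_id = "<hosted zone ID>"
  name    = "www.example.com"
  type    = "A"

  alias {
    name                   = aws_cloudfront_distribution.site.domain_name
    zone_id                = aws_cloudfront_distribution.site.hosted_zone_id
    evaluate_target_health = false
  }
}
```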

Upload a Static Website in S3!

I manually uploaded an index.html file to S3 to test whether everything works.

Develop the Pipeline of your website

There is a workflow sample in the GitHub Actions Marketplace that allows you to easily set up the CI/CD pipeline for your static website. You can refer to the steps on how to set it up here: https://github.com/marketplace/actions/configure-aws-credentials-action-for-github-actions.
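A trimmed-down workflow using that action might look like the following sketch. The role ARN and bucket name secrets, the region, and the ./build directory are placeholders you would replace with your own values:

```yaml
name: Deploy static site to S3

on:
  push:
    branches: [main]

# OIDC federation with AWS requires permission to request an ID token
permissions:
  id-token: write
  contents: read

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          role-to-assume: ${{ secrets.AWS_ROLE_ARN }}
          aws-region: ap-southeast-1

      - name: Sync site files to S3
        run: aws s3 sync ./build s3://${{ secrets.S3_BUCKET }} --delete
```

For this to work, the IAM role referenced by AWS_ROLE_ARN needs the trust relationship shown below.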

Here is a screenshot of setting up your IAM with an identity provider.
In your IAM role, set up the following under the Trust Relationship:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Federated": "<ARN of the OIDC>"
      },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringEquals": {
          "token.actions.githubusercontent.com:aud": "sts.amazonaws.com",
          "token.actions.githubusercontent.com:sub": "repo:GitHubOrg/GitHubRepo:ref:refs/heads/GitHubBranch"
        }
      }
    }
  ]
}

That should sort out the connectivity and access. Try it out by deploying sample code and checking in S3 whether the file has been updated. You can also update the workflow to add a CloudFront invalidation to ensure that everything in the cache is invalidated during deployment.

aws cloudfront create-invalidation --distribution-id ${{ secrets.CLOUD_DIST }} --paths "/*"

Link to my GitHub workflow project is here if you want to clone and create your own project!
