With steps 2-6 complete, I decided to go out of order and tackle step 15 now rather than later. Step 15 is creating a GitHub Action to automatically push code changes in the repo to the S3 bucket, creating a CI/CD process that keeps the front end up to date.
I decided to set this up now because I hate manual processes that can (and should) be automated. My next task is adding the JavaScript counter and DynamoDB table. That could involve a lot of changes and testing against the HTML in the S3 bucket, and I'd rather not copy the files to S3 manually after every change.
The setup for this automation wasn't too hard:
- Create an IAM policy to allow updating the files in the S3 bucket:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "SyncToS3",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:DeleteObject",
        "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::BUCKET-NAME",
        "arn:aws:s3:::BUCKET-NAME/*"
      ]
    }
  ]
}
```
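If you'd rather script this step than click through the console, the same policy can be created with the AWS CLI. This is just a sketch; the policy name and file name below are placeholders I chose for the example:

```bash
# Save the policy document above as s3-sync-policy.json, then create
# the policy. "s3-sync-policy" is an arbitrary name for this example.
aws iam create-policy \
  --policy-name s3-sync-policy \
  --policy-document file://s3-sync-policy.json
```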
- Create an IAM user with only programmatic access and assign the policy.
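A rough CLI equivalent of this step might look like the following; the user name is a placeholder, and ACCOUNT-ID stands in for your AWS account ID:

```bash
# Create a user for GitHub Actions. No console password is created,
# so the user only has programmatic access.
aws iam create-user --user-name github-s3-deploy

# Attach the policy created above.
aws iam attach-user-policy \
  --user-name github-s3-deploy \
  --policy-arn arn:aws:iam::ACCOUNT-ID:policy/s3-sync-policy

# Generate the access key pair. Note the AccessKeyId and
# SecretAccessKey in the output; the secret is shown only once.
aws iam create-access-key --user-name github-s3-deploy
```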
- Add the IAM user's access key and secret access key to the GitHub repo's secrets.
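The secrets can be added under the repo's Settings > Secrets, or with the GitHub CLI if you have it installed. The names must match the ones referenced in the workflow below:

```bash
# gh prompts for each value, or reads it from --body / stdin.
gh secret set AWS_ACCESS_KEY_ID
gh secret set AWS_SECRET_ACCESS_KEY
```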
- Create the GitHub Action:
```yaml
name: Upload to S3

on:
  push:
    branches:
      - master

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v1
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-west-1
      - name: Deploy static site to S3 bucket
        run: aws s3 sync FOLDER-NAME s3://BUCKET-NAME --delete
```
Now when a change is pushed to the GitHub repo, the files are automatically synced to the S3 bucket.
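To preview what a sync will do before pushing a change, the same command from the workflow can be dry-run locally (FOLDER-NAME and BUCKET-NAME are the same placeholders as above):

```bash
# --dryrun lists the uploads and deletes the sync would perform
# without actually touching the bucket.
aws s3 sync FOLDER-NAME s3://BUCKET-NAME --delete --dryrun
```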
UPDATE
To have the action run only when files change in a certain folder, use the paths option. This enables different actions to be run for things like syncing the front_end folder to S3 and the back_end folder to API Gateway.
```yaml
on:
  push:
    branches:
      - master
    paths:
      - 'FOLDER-NAME/**'
```