Hi!!
A few days ago, I worked with a developer on a simple landing page with no special requirements. He needed an easy way to preview his changes, ideally within our existing AWS infrastructure.
The best way to achieve this was GitHub Actions + S3 static website hosting, and I'll show you how I did it.
Steps
- Provision an S3 Bucket and an IAM User with CloudFormation.
- Add the AWS secrets from our IAM User to the GitHub repository.
- Deploy the app to the S3 Bucket through GitHub Actions.
1. Resource provisioning 🏗️
First, we provision our resources with AWS CloudFormation. The following template (main.yml) defines:
- S3 Bucket
- Bucket Policy
- IAM User
- Inline IAM Policy
# main.yml
AWSTemplateFormatVersion: 2010-09-09
Parameters: # params passed via "--parameters" in the CLI command below
  BucketName: # http://your_bucket_name.s3-website-your_region.amazonaws.com/
    Description: Unique name for your bucket. This will be in the S3 URL to your static website.
    Type: String
Resources:
  # Create an S3 Bucket that serves a static website (i.e. a React app)
  MyBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Ref BucketName
      AccessControl: PublicRead # important!!
      WebsiteConfiguration: # this makes the S3 Bucket a static website/app
        IndexDocument: index.html # default object served when visiting the S3 domain
        ErrorDocument: index.html # just send to the app, let React handle errors and routing
  # Add a Bucket Policy that lets public visitors access the web app
  MyBucketPolicy:
    Type: AWS::S3::BucketPolicy
    Properties:
      Bucket: !Ref MyBucket # attach to the bucket being created
      PolicyDocument:
        Id: MyPolicy
        Version: 2012-10-17
        Statement: # lets the public access/view the contents of your Bucket, i.e. the web app
          - Sid: PublicReadForGetBucketObjects
            Effect: Allow
            Principal: "*" # wildcard, allow all requests
            Action: "s3:GetObject"
            Resource:
              - !Join ["", [!GetAtt MyBucket.Arn, /*]]
  # IAM User that GitHub Actions will use to upload files
  iamUser:
    Type: AWS::IAM::User
    Properties:
      Path: /
      Policies:
        - PolicyName: s3-upload-policy
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: Allow
                Action:
                  - "s3:PutObject"
                  - "s3:GetObject"
                  - "s3:ListBucket"
                  - "s3:DeleteObject"
                Resource:
                  - !GetAtt MyBucket.Arn # bucket ARN, required for s3:ListBucket
                  - !Join ["", [!GetAtt MyBucket.Arn, /*]] # object ARNs for Get/Put/Delete
Outputs:
  WebsiteURL:
    Value: !GetAtt MyBucket.WebsiteURL
    Description: URL for website hosted on S3
If you already have the AWS CLI installed:
aws cloudformation create-stack --stack-name your-stack-name --capabilities CAPABILITY_IAM --template-body file://main.yml --parameters ParameterKey=BucketName,ParameterValue=your-unique-bucket-name
Otherwise:
Log in to the AWS Console -> CloudFormation -> Stacks -> Create stack -> Upload a template file.
It will ask you to confirm that this CloudFormation template will create IAM resources, in this case, a user.
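Either way, once the stack reaches CREATE_COMPLETE you can read the WebsiteURL output from the terminal. A minimal sketch, assuming the stack name used above:

aws cloudformation describe-stacks \
  --stack-name your-stack-name \
  --query "Stacks[0].Outputs" \
  --output table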
2. Secrets 🤫
Now we need credentials for the IAM User that CloudFormation created: https://console.aws.amazon.com/iam/
IAM -> Users -> your-new-iam-user -> Security credentials
Access keys -> Create access key
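If you'd rather stay in the terminal, you can create the key pair with the AWS CLI instead. A sketch, assuming the stack name from step 1 (the physical user name is generated by CloudFormation, so we look it up first):

# get the generated name of the IAM User created by the stack
aws cloudformation describe-stack-resources \
  --stack-name your-stack-name \
  --logical-resource-id iamUser \
  --query "StackResources[0].PhysicalResourceId" --output text

# create an access key pair for that user
aws iam create-access-key --user-name <user-name-printed-above>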
Then we add the following secrets to the GitHub repository; the workflow needs them to upload files to our S3 bucket:
- AWS_ACCESS_KEY_ID: The access key ID you just created
- AWS_SECRET_ACCESS_KEY: The corresponding secret access key
- S3_BUCKET: The name of the destination bucket
- S3_BUCKET_REGION: The destination bucket region
Settings -> Secrets -> New repository secret
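If you prefer the terminal over the web UI, the GitHub CLI can create the same secrets. A sketch with placeholder values; run it inside the repository clone after gh auth login:

gh secret set AWS_ACCESS_KEY_ID --body "AKIAXXXXXXXXXXXXXXXX"
gh secret set AWS_SECRET_ACCESS_KEY --body "your-secret-access-key"
gh secret set S3_BUCKET --body "your-unique-bucket-name"
gh secret set S3_BUCKET_REGION --body "us-east-1"

The region here (us-east-1) is just an example; use the region where you created the stack.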
3. GitHub Actions 🚀
I highly recommend using this continuous deployment setup only for staging environments. For that reason, we'll create a QA branch.
git checkout -b QA
(You can also create the branch from the GitHub UI.)
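The workflow will only trigger once the branch exists on GitHub, so push it as well (assuming origin is your GitHub remote):

git push -u origin QA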
Next, we'll create our deploy.yml file in the path: .github/workflows
mkdir -p .github/workflows && touch .github/workflows/deploy.yml
If you want to trigger it from a different branch, change the branches filter.
# .github/workflows/deploy.yml
name: CD Stage
on:
  push:
    branches:
      - QA
      - '!master'
jobs:
  deploy:
    runs-on: ubuntu-20.04
    env:
      AWS_ACCESS_KEY_ID: '${{ secrets.AWS_ACCESS_KEY_ID }}'
      AWS_SECRET_ACCESS_KEY: '${{ secrets.AWS_SECRET_ACCESS_KEY }}'
    steps:
      - uses: actions/checkout@v2
      - name: Deploy QA
        uses: reggionick/s3-deploy@v3
        with:
          folder: .
          bucket: '${{ secrets.S3_BUCKET }}'
          bucket-region: '${{ secrets.S3_BUCKET_REGION }}'
          delete-removed: true
          no-cache: true
          private: true
GitHub Action: https://github.com/Reggionick/s3-deploy
Our app is now deployed by GitHub Actions!
Done! 🙌
Check your S3 WebsiteURL from the CloudFormation stack output:
http://your_bucket_name.s3-website-your_region.amazonaws.com/
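A quick way to verify the deployment from the terminal, using the placeholder URL above (substitute your bucket name and region):

curl -I http://your_bucket_name.s3-website-your_region.amazonaws.com/
# an HTTP 200 response with Content-Type: text/html means index.html is being served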
For HTTPS: Configuring a static website using a custom domain registered with Route 53
Bonus 🎁
If you work with React, the following pipeline can help you:
# .github/workflows/react_deploy.yml
name: CD Stage
on:
  push:
    branches:
      - QA
      - '!master'
jobs:
  deploy:
    runs-on: ubuntu-20.04
    env:
      AWS_ACCESS_KEY_ID: '${{ secrets.AWS_ACCESS_KEY_ID }}'
      AWS_SECRET_ACCESS_KEY: '${{ secrets.AWS_SECRET_ACCESS_KEY }}'
    steps:
      - uses: actions/checkout@v2
      - name: Set up nodejs 14 LTS
        uses: actions/setup-node@v2
        with:
          node-version: '14'
      - name: Cache node_modules
        uses: actions/cache@v2.0.0
        with:
          path: node_modules
          key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
          restore-keys: |
            ${{ runner.os }}-node-
            ${{ runner.os }}-
      - name: ⛏ Building
        run: npm install && npm run build
      - name: Deploy QA
        uses: reggionick/s3-deploy@v3
        with:
          folder: build # upload the production build, not the repo root
          bucket: '${{ secrets.S3_BUCKET }}'
          bucket-region: '${{ secrets.S3_BUCKET_REGION }}'
          delete-removed: true
          no-cache: true
          private: true
react-router 404 errors: https://via.studio/journal/hosting-a-reactjs-app-with-routing-on-aws-s3
cors-cloudformation example: https://git.io/JtQZK
If you have any comments or improvements for this post, I would be glad to receive them.
Thank you for your time 👋👨💻.