Have you ever wondered how to automatically build and deploy your static site projects without relying on platforms like Vercel or Netlify? If so, you're in the right place.
Introduction
In this article, we'll walk through how to set up GitHub Actions to build and deploy your static website directly to AWS S3, giving you full control over your CI/CD pipeline — no third-party hosting required.
You might be wondering — what exactly is CI/CD?
CI/CD stands for Continuous Integration and Continuous Deployment. It’s a development practice that automates the process of building, testing, and deploying your code every time you push changes to your repository.
Instead of manually uploading files to a server every time you make updates, CI/CD pipelines handle that for you. This means fewer interruptions, fewer mistakes, and more time to focus on building, not deploying.
Prerequisites
Before we get started, make sure you have the following:
- AWS Account: You will need an AWS account to create an S3 bucket and host your site.
- GitHub Account: Required to access GitHub Actions.
- Project: A git-initialized project to deploy. This guide is framework-agnostic, so you can use any Node.js-based project that generates a static site (e.g., React with CRA or Vite, or Next.js with `next export`).
Example Project Setup
In this guide, I'll use a Vite + React project as an example:

```bash
npm create vite@latest react-auto-deploy -- --template react-ts
```
Understanding GitHub Actions
What is a Workflow?
A workflow in GitHub Actions is an automated process that runs one or more jobs. It's defined in a YAML (`.yml`) file inside the `.github/workflows/` directory of your repository.
Each workflow can be triggered by specific GitHub events such as:
- Pushes to a branch (e.g., pushing code to `main`)
- Pull requests (e.g., opening a PR to review changes)
- Manual triggers (e.g., running it with a button click using `workflow_dispatch`)
- Scheduled triggers (e.g., every day at midnight using `cron`)
- Releases, issue comments, or other custom events
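For example, several of these triggers can be combined in the `on:` section of a workflow file. A minimal sketch (the branch name and cron schedule here are just placeholders):

```yaml
on:
  push:
    branches:
      - main             # run on every push to main
  workflow_dispatch:     # allow manual runs from the Actions tab
  schedule:
    - cron: "0 0 * * *"  # run every day at midnight UTC
```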
Example workflow file structure:
```
react-auto-deploy/
├── .github/
│   └── workflows/
│       └── deploy.yml   ← your workflow definition
```
A workflow file defines what should happen and when, such as:
- Install dependencies
- Build your site
- Run tests
- Deploy to AWS S3 or other platforms
Step-by-Step Guide
1. Create an S3 bucket
   - Go to the AWS S3 Console
   - Click "Create bucket"
   - Enter a unique bucket name (e.g., my-static-site)
   - Choose a region
   - Uncheck "Block all public access" under permissions
   - Confirm the warning checkbox
   - Click "Create bucket"
2. Enable static website hosting
   - Go to the "Properties" tab
   - Scroll to "Static website hosting"
   - Click "Edit"
   - Enable "Static website hosting"
   - Set the index document to `index.html`
   - Click "Save changes"
3. Make files public
   - Go to the "Permissions" tab
   - Scroll to "Bucket policy"
   - Click "Edit" and paste the following policy, replacing `your-bucket-name` with your unique bucket name:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```

   This step makes the objects in your bucket publicly readable.
4. Go to GitHub and create your repository
5. Create a `.github/workflows/deploy.yml` file at the root of your project (locally or on GitHub)
6. Paste the following config (explained below):
```yaml
name: Build and Deploy to S3

on:
  push:
    branches:
      - main

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      # step 1
      - name: Checkout repository
        uses: actions/checkout@v3

      # step 2
      - name: Use Node.js 20.x
        uses: actions/setup-node@v3
        with:
          node-version: 20

      # step 3
      - name: Install dependencies
        run: npm install

      # step 4
      - name: Create .env file
        run: |
          cat <<EOF > ./.env
          ${{ secrets.ENV }}
          EOF

      # step 5
      - name: Build project
        run: npm run build

      # step 6
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_REGION }}

      # step 7
      - name: Upload dist to S3
        run: |
          aws s3 sync dist/ s3://${{ secrets.AWS_S3_BUCKET }}/ --delete
```
The first section sets the name and trigger for the workflow. Whenever code is pushed to the main branch, the workflow will automatically run the defined jobs.
In the jobs section, the first three steps clone the repository to the GitHub Actions runner (running `ubuntu-latest`), install Node.js, and download all the node_modules required to build the project.
You can skip the "Create .env file" step if you don't have any environment variables. This step takes the repository secret `ENV` and creates a `.env` file at the root of the project.
In the next step, the project is built. It's important to set up the correct scripts in `package.json` to make your project run properly.
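For a Vite + React project, the relevant `package.json` scripts typically look something like the sketch below (your exact scripts may differ):

```json
{
  "scripts": {
    "dev": "vite",
    "build": "tsc && vite build",
    "preview": "vite preview"
  }
}
```

Note that `npm run build` for Vite outputs to `dist/` by default, which is the folder the workflow's final step syncs to S3.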
The next step configures AWS credentials so that the AWS CLI on the runner can access your account.
Finally, step 7 uploads the output folder to your S3 bucket.
Before you commit and push the workflow file, set up the repository secrets on GitHub:
- Navigate to your repository on GitHub and click the Settings tab.
- In the Security section, navigate to Secrets and variables > Actions.
- Here, set up all the secrets used in the above workflow file.
- Be careful to use exactly the same names as in the YAML file: `AWS_ACCESS_KEY_ID`, `AWS_REGION`, `AWS_S3_BUCKET`, `AWS_SECRET_ACCESS_KEY`, `ENV`
- `AWS_S3_BUCKET` is your S3 bucket name.
- `AWS_REGION` is your S3 bucket's region.
- `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` can be generated for an IAM user in the AWS console.
- `ENV` holds all the .env variables needed at build time. Copy all the contents of your .env file and paste them here as the value.
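For example, if your app reads a couple of build-time variables, the value of the `ENV` secret would simply be the raw contents of your .env file (the variable names below are hypothetical):

```
VITE_API_URL=https://api.example.com
VITE_ANALYTICS_ID=XXXXXXX
```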
Now that you've set up your GitHub repository with the necessary secrets and workflow file, it's time to commit your changes and see the automation in action. The next time you push to the main branch, GitHub Actions will automatically build your project and deploy it to AWS S3.
Testing Your Deployment
After pushing your code, you can monitor the workflow execution by going to the "Actions" tab in your GitHub repository. You should see your workflow running, and once completed successfully, your website will be live on your S3 bucket's URL.
Access Your Site
- Go back to the "Properties" tab
- Under "Static website hosting", copy the "Bucket website endpoint" (e.g., `http://your-bucket-name.s3-website-us-east-1.amazonaws.com`)
- Open it in your browser — your site should be live!
Common Issues and Troubleshooting
When implementing this deployment pipeline, you might encounter a few common issues. Here are some troubleshooting tips to help you resolve them quickly:
- Permission Errors: Ensure your IAM user has the necessary permissions for S3 operations (s3:PutObject, s3:GetObject, s3:DeleteObject).
- Build Failures: Check your workflow logs carefully. Most build failures are due to missing dependencies or environment variables.
- Cache Issues: If your updates aren't showing up, you might need to configure cache invalidation for your S3 bucket, especially if you're using CloudFront.
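As a reference, an IAM policy scoped to what this workflow needs might look like the sketch below (replace `your-bucket-name`; this is one minimal setup, not the only valid policy). `s3:ListBucket` is included because `aws s3 sync` compares bucket contents against the local folder, and `s3:DeleteObject` is needed for the `--delete` flag:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::your-bucket-name"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```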
Conclusion
And that’s it! You’ve just set up automatic deployment for your static site using GitHub Actions and AWS S3. No more manual uploads — push your code and let GitHub do the rest. It saves time, reduces errors, and keeps your workflow smooth.