In this blog, I implemented a cloud-based Terraform workflow using HCP Terraform integrated with GitHub to provision an AWS S3 bucket in a production-style setup.
>> Project Objective:
The goal was to:
- Define AWS infrastructure using Terraform
- Store and version-control the code in GitHub
- Execute Terraform runs remotely using HCP Terraform
- Implement a VCS-driven automated workflow
- Manage state securely in the cloud
- Isolate environments using Projects and Workspaces
>> Architecture Overview:
The deployment workflow follows this structure:
Developer -> GitHub -> HCP Terraform -> AWS -> S3 Bucket
Execution Flow
- Write Terraform configuration for S3 bucket.
- Push the code to GitHub.
- HCP Terraform detects the change.
- Automatically runs terraform init and terraform plan.
- Review the plan in the UI.
- Confirm and apply the changes.
- AWS provisions the S3 bucket.
Step-by-Step Guide to Deploy an AWS S3 Bucket Using HCP Terraform
Prerequisites
Before starting, make sure you have:
- AWS Account
- GitHub Account
- HCP Terraform Account
- Basic knowledge of Terraform syntax
Step 1: Create a GitHub Repository
- Log in to GitHub.
- Create a new repository (e.g., terraform-s3-demo).
- Clone it locally:

```shell
git clone https://github.com/your-username/terraform-s3-demo.git
cd terraform-s3-demo
```
Step 2: Write Terraform Configuration
Create the following files:
main.tf

```hcl
provider "aws" {
  region = var.region
}

resource "aws_s3_bucket" "mybucket" {
  bucket = var.bucket_name

  tags = {
    Name        = var.bucket_name
    Environment = var.environment
  }
}
```
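Since the post aims for a production-style setup, the bucket can optionally be hardened further. A minimal sketch using the AWS provider's aws_s3_bucket_versioning resource (the resource label mirrors the bucket's and is illustrative):

```hcl
# Optional: enable object versioning on the bucket (production-style hardening).
resource "aws_s3_bucket_versioning" "mybucket" {
  bucket = aws_s3_bucket.mybucket.id

  versioning_configuration {
    status = "Enabled"
  }
}
```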
variables.tf

```hcl
variable "region" {}
variable "bucket_name" {}
variable "environment" {}
```
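Optionally, an outputs.tf can expose the bucket's identifiers after apply so they appear in the HCP Terraform run output (a minimal sketch; the output names are illustrative):

```hcl
# outputs.tf - surface bucket identifiers after apply
output "bucket_name" {
  value = aws_s3_bucket.mybucket.bucket
}

output "bucket_arn" {
  value = aws_s3_bucket.mybucket.arn
}
```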
Step 3: Push Code to GitHub

```shell
git add .
git commit -m "Initial S3 bucket Terraform configuration"
git push origin main
```
Your Terraform code is now version-controlled.
Step 4: Set Up HCP Terraform
- Log in to HCP Terraform
- Create a new Organization
- Inside the organization, create a Project
Projects help logically group infrastructure.
Step 5: Create a VCS-Driven Workspace
- Click Create Workspace
- Select Version Control Workflow
- Connect your GitHub account
- Choose the repository (terraform-s3-demo)
- Set the working directory (if needed)
- Create workspace
Now your repo is linked to HCP Terraform.
Step 6: Configure Variables in the Workspace
Inside the Workspace -> Variables section:
Add Environment Variables
- AWS_ACCESS_KEY_ID
- AWS_SECRET_ACCESS_KEY
Mark both as sensitive.
Add Terraform Variables
Example:
- region = ap-south-1
- bucket_name = amit-terraform-demo-bucket
- environment = dev
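Terraform variables set in the workspace UI behave much like values supplied in a terraform.tfvars file. A rough local equivalent of the example values above (note that in a .tfvars file, string values are quoted, while the HCP Terraform UI takes raw values):

```hcl
# terraform.tfvars - local equivalent of the workspace Terraform variables
region      = "ap-south-1"
bucket_name = "amit-terraform-demo-bucket"
environment = "dev"
```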
Do NOT hardcode credentials in code.
Step 7: Trigger the First Run
Now go back to GitHub and make a small change (or re-push code).
HCP Terraform will automatically:
- Clone the repository
- Run terraform init
- Run terraform plan
- Show the execution plan in the UI
Step 8: Review and Apply
- Review the plan output.
- Click Confirm & Apply.
- Wait for execution to complete.
If successful, your S3 bucket will be created in AWS.
Step 9: Verify in AWS Console
- Log in to AWS.
- Navigate to S3.
- Confirm the bucket is created.
Congratulations - you have deployed infrastructure using a cloud-based Terraform workflow.
>> Secure Credential Management:
AWS credentials were added as sensitive environment variables inside the HCP Terraform workspace.
This ensures:
- No secrets in source code
- Secure execution
- Production-aligned security practice
Resources:
- GitHub Repo
- HashiCorp
Conclusion
This project showcases how to provision AWS infrastructure using a cloud-native Terraform workflow powered by HCP Terraform and GitHub.
By combining Infrastructure as Code with automated VCS-driven execution, the deployment process becomes:
- Repeatable
- Secure
- Collaborative
- Production-ready
>> Connect With Me
If you enjoyed this post or want to follow my #30DaysOfAWSTerraformChallenge journey, feel free to connect with me here:
💼 LinkedIn: Amit Kushwaha
🐙 GitHub: Amit Kushwaha
📝 Hashnode / Amit Kushwaha
🐦 Twitter/X: Amit Kushwaha
Found this helpful? Drop a ❤️ and follow for more AWS and Terraform tutorials!
