Today was one of those days where everything finally clicked. Most of the confusion I had around Terraform files, structure, and hidden configs disappeared once I understood how Terraform actually loads .tf files and why project organization matters.
Below is my full breakdown of what I learned and the exact commands and files I worked with.
What I Learned Today
How Terraform Loads .tf Files
Terraform doesn’t care about the order of the .tf files.
It scans the entire working directory and automatically loads every .tf file before planning or applying.
That means:
main.tf, backend.tf, provider.tf, variables.tf, locals.tf, output.tf
…all get merged by Terraform internally during processing.
This is why a clean structure matters: not for Terraform, but for your sanity as the human reading the code.
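To convince myself, here's a tiny sketch (hypothetical file names and values, not part of today's project): a local defined in one file can be referenced from another, because Terraform merges everything in the directory before evaluating it.
# networking.tf (hypothetical)
locals {
  demo_cidr = "10.0.0.0/16"
}

# main.tf (hypothetical) - happily references the local defined in networking.tf
resource "aws_vpc" "demo" {
  cidr_block = local.demo_cidr
}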
Why Project Structure Matters
I finally understand why Terraform setups use separate files.
A simple structure like:
main.tf
backend.tf
provider.tf
variables.tf
locals.tf
output.tf
keeps everything clear:
- Resources in main.tf
- Backend in backend.tf
- Variables in variables.tf
- Local values in locals.tf
- Output values in output.tf
- Provider config in provider.tf
It’s not required by Terraform, but it makes the project clean, predictable, and scalable.
Using .gitignore to Protect Sensitive Files
This one is crucial.
I learned how to use .gitignore to hide files that should NEVER be pushed to GitHub, like state files, logs, or crash reports.
My .gitignore:
.terraform.lock.hcl
terraform.tfvars
.terraform/
*.tfvars.json
*.log
crash.log
*.terraform.*backup
*.tfstate
*.tfstate.backup
This prevents accidental exposure of secrets or cloud infrastructure data.
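For example, a terraform.tfvars file is exactly the kind of thing that holds environment-specific or private values (hypothetical contents, not my real file):
# terraform.tfvars (hypothetical) - never committed, thanks to .gitignore
environment  = "prod"
channel_name = "my-real-company-name"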
Commands I Used Today
These are the exact commands I worked with:
- terraform validate: checks for syntax errors.
- terraform fmt -recursive: formats every .tf file in the project.
- terraform plan: shows the expected infrastructure changes.
Very basic commands, but they form the foundation of a clean Terraform workflow.
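As a quick illustration of what terraform validate catches, it would reject a contrived snippet like this (not from my project), because var.enviroment is never declared:
# Deliberate typo: "enviroment" is not a declared variable,
# so terraform validate fails before anything touches AWS.
resource "aws_s3_bucket" "typo_demo" {
  bucket = "demo-${var.enviroment}"
}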
The Full Terraform Project Structure I Built Today
main.tf
# Create s3 bucket
resource "aws_s3_bucket" "first_bucket" {
  bucket = local.bucket_name

  tags = {
    Name        = local.bucket_name
    Environment = var.environment
  }
}

# Create a vpc
resource "aws_vpc" "sample" {
  cidr_block = "10.0.1.0/24"
  region     = var.region

  tags = {
    Environment = var.environment
    Name        = local.vpc_name
  }
}

resource "aws_instance" "example" {
  ami           = "resolve:ssm:/aws/service/ami-amazon-linux-latest/al2023-ami-kernel-default-x86_64"
  instance_type = "t3.micro"
  region        = var.region

  tags = {
    Environment = var.environment
    Name        = "${var.environment}-EC2-Instance"
  }
}
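Side note on the AMI line in aws_instance: an alternative I've seen (just a sketch under my assumptions, not what I applied today) is to read the Amazon Linux 2023 AMI ID from the same public SSM parameter through a data source:
# Alternative sketch: resolve the AMI ID in Terraform via an SSM parameter lookup
data "aws_ssm_parameter" "al2023_ami" {
  name = "/aws/service/ami-amazon-linux-latest/al2023-ami-kernel-default-x86_64"
}

# Hypothetical resource name, shown instead of the one above
resource "aws_instance" "example_alt" {
  ami           = data.aws_ssm_parameter.al2023_ami.value
  instance_type = "t3.micro"
}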
backend.tf
terraform {
  backend "s3" {
    bucket       = "devopswithzacks-terraform-state"
    key          = "dev/terraform.tfstate"
    region       = "us-east-1"
    encrypt      = true
    use_lockfile = true
  }
}
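(As far as I understand it, use_lockfile turns on the newer native S3 state locking, so a separate DynamoDB lock table isn't needed.)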
locals.tf
locals {
  bucket_name = "${var.channel_name}-bucket-${var.environment}-${var.region}"
  vpc_name    = "${var.environment}-VPC"
}
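With the default values in variables.tf, local.bucket_name works out to dwz-bucket-dev-us-east-1 and local.vpc_name to dev-VPC.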
output.tf
output "vpc_id" {
value = aws_vpc.sample.id
}
output "ec2_id" {
value = aws_instance.example.id
}
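If I also wanted the bucket name exposed, an extra output could look like this (a sketch, not in today's files):
output "bucket_name" {
  description = "Name of the S3 bucket created in main.tf"
  value       = aws_s3_bucket.first_bucket.bucket
}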
provider.tf
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 6.0"
    }
  }
}

provider "aws" {
  region = "us-east-1"
}
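Since every resource repeats the Environment tag, the provider block could also set it once with default_tags (a sketch, not something I committed today):
provider "aws" {
  region = "us-east-1"

  # Applied automatically to every resource this provider creates
  default_tags {
    tags = {
      Environment = var.environment
    }
  }
}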
variables.tf
variable "environment" {
default = "dev"
}
variable "channel_name" {
default = "dwz"
}
variable "region" {
default = "us-east-1"
}
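These bare defaults work, but a more explicit version (something I'd tidy up later, sketched here) pins a type and description on each variable:
variable "environment" {
  type        = string
  description = "Deployment environment name, e.g. dev or prod"
  default     = "dev"
}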
.gitignore
.terraform.lock.hcl
terraform.tfvars
.terraform/
*.tfvars.json
*.log
crash.log
*.terraform.*backup
*.tfstate
*.tfstate.backup
Summary
Day 6 was all about fundamentals:
- Understanding how Terraform loads files.
- Seeing the importance of clean project structure.
- Using .gitignore to protect sensitive data.
- Running essential validation and formatting commands.
- Splitting Terraform files into clean, readable units.
Nothing fancy, just solid engineering practice.