It's Day 11 and 12 of the AWS 30 Days Challenge. Honestly, there was just too much good stuff here to squeeze into one day, so this topic gets two. And I'm not gonna lie: I almost skipped it. "Functions? That sounds familiar," I thought. "I'll just Google them when I need them."
Turns out, Terraform's built-in functions are like discovering your car has turbo boost after driving it in first gear for months. These aren't just "nice-to-haves"—they're the difference between writing 500 lines of hacky code and writing 50 lines of elegant, production-ready infrastructure.
Hey, I'm learning Terraform in 30 days. Or at least I'm trying to. You can join the challenge too.
Part one:
Part two:
I spent two full days working through 12 hands-on assignments covering the most-used function categories, and my mind was blown multiple times. Here's what I learned (and why you absolutely need to know this stuff).
Why Functions Matter: A Reality Check
Before diving in, let me paint a picture. You're trying to:
- Name an S3 bucket, but AWS won't accept uppercase letters or underscores
- Merge tags from three different sources without overwriting important values
- Split a comma-separated string of ports into actual security group rules
- Validate that instance types match your company's naming convention
- Format timestamps so your backup tags don't look like garbage
Without functions? You're copying Stack Overflow code, praying it works, and crying into your keyboard at 2 AM.
With functions? You write one line that does exactly what you need. Clean. Elegant. Chef's kiss.
The Terraform Console: My New Best Friend
First things first: before you touch main.tf, you need to know about terraform console. It's an interactive shell where you can test functions without deploying anything.
terraform console
Then you can just... play:
> lower("HELLO WORLD")
"hello world"
> max(5, 12, 9)
12
> split(",", "80,443,8080")
[
  "80",
  "443",
  "8080",
]
> reverse(["a", "b", "c"])
[
  "c",
  "b",
  "a",
]
This is game-changing. No more "deploy and pray." You test your logic in the console, then drop it into your config.
The 12 Assignments: My Greatest Hits
I worked through 12 practical assignments, each built around a realistic cloud-ops scenario. Here are the ones that completely changed how I think about Terraform:
Assignment 1: Project Naming (String Functions)
The Problem: You have "Project ALPHA Resource" but AWS resources need lowercase, hyphenated names like "project-alpha-resource".
The Solution:
locals {
  clean_name = lower(replace(var.project_name, " ", "-"))
}

resource "aws_resourcegroups_group" "project" {
  name = local.clean_name

  # resource_query is required by the AWS provider; this minimal tag-based query is my addition
  resource_query {
    query = jsonencode({ ResourceTypeFilters = ["AWS::AllSupported"], TagFilters = [{ Key = "Project", Values = [local.clean_name] }] })
  }
}
What I learned: lower() and replace() are your bread and butter. You'll use these constantly for naming conventions.
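A quick sanity check in the console, with a made-up project name:
> lower(replace("Project ALPHA Resource", " ", "-"))
"project-alpha-resource"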
Assignment 3: S3 Bucket Naming (String Acrobatics)
This one deserved its own chef's kiss, because S3 bucket names are picky. They want:
- Lowercase only
- No underscores
- No special characters
- Between 3 and 63 characters
The Problem: Your input is "My_Super-Awesome-Bucket!!!!" (yeah, users are wild).
The Solution:
locals {
  sanitized_bucket_name = substr(
    replace(
      replace(lower(var.bucket_name_input), "_", "-"),
      "/[^a-z0-9-]/", ""
    ),
    0, 63
  )
}
What I learned: Functions nest beautifully. Read from the inside out: lowercase everything, replace underscores with hyphens, strip invalid characters, then truncate. One expression solves it all. (Order matters here: lowercase first, or the regex strip will eat your uppercase letters before lower() ever sees them.)
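To prove it works, paste the wild input straight into the console (wrapping the pattern in forward slashes is what tells replace() to treat it as a regex):
> substr(replace(replace(lower("My_Super-Awesome-Bucket!!!!"), "_", "-"), "/[^a-z0-9-]/", ""), 0, 63)
"my-super-awesome-bucket"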
Assignment 4: Security Group Ports (Collections Are Magic)
The Problem: Your variable is a string: "80,443,8080". You need actual security group ingress rules.
Without functions:
# Manually create three ingress blocks
# Copy-paste hell
# Cry when you need to add port 3000
With functions:
variable "allowed_ports" {
default = "80,443,8080"
}
locals {
ports_list = split(",", var.allowed_ports)
}
resource "aws_security_group" "web" {
name = "web-sg"
dynamic "ingress" {
for_each = local.ports_list
content {
from_port = tonumber(ingress.value)
to_port = tonumber(ingress.value)
protocol = "tcp"
cidr_blocks = ["0.0.0.0/0"]
}
}
}
What I learned: split() turns strings into lists. Combine it with dynamic blocks (from yesterday!), and you've got data-driven security groups. Add a port? Change the variable. Done.
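And if you want the ports as actual numbers up front, a for expression plus tonumber() does it in one shot (console demo, same example ports):
> [for p in split(",", "80,443,8080") : tonumber(p)]
[
  80,
  443,
  8080,
]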
Assignment 5: Environment Lookup (The Decision Maker)
The Problem: Different environments need different instance sizes. Dev gets t3.micro, prod gets t3.large.
The Old Way: Multiple terraform.tfvars files, or worse, separate directories per environment.
The Function Way:
variable "environment" {
default = "dev"
}
locals {
instance_types = {
dev = "t3.micro"
staging = "t3.small"
prod = "t3.large"
}
selected_type = lookup(local.instance_types, var.environment, "t3.micro")
}
resource "aws_instance" "app" {
ami = var.ami_id
instance_type = local.selected_type
}
What I learned: lookup() is conditional expressions on steroids. Give it a map and a key, it returns the value. Give it a fallback, and you've got safety built in. One config, infinite environments.
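You can watch the fallback kick in from the console. Here's a trimmed-down version of the map, with a key that doesn't exist:
> lookup({ dev = "t3.micro", prod = "t3.large" }, "qa", "t3.micro")
"t3.micro"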
Assignment 6: Instance Validation (The Validator)
This one blew my mind. You can validate inputs before deploying.
The Problem: Your company requires instance types to match pattern: t3.* or t2.* only.
The Solution:
variable "instance_type" {
type = string
validation {
condition = can(regex("^(t2|t3)\\.", var.instance_type))
error_message = "Instance type must start with 't2.' or 't3.'"
}
}
What I learned:
- can() tries an expression and returns true/false instead of erroring
- regex() validates patterns
- length() checks counts
- Combine them in validation blocks to catch mistakes before you deploy
No more "oops, I deployed a c5.24xlarge in dev and my AWS bill is now a mortgage payment."
Assignment 7: Backup Configuration (Sensitive Data)
The Problem: You're handling database passwords and backup configs. You need validation AND security.
The Solution:
variable "backup_name" {
type = string
validation {
condition = endswith(var.backup_name, "-backup")
error_message = "Backup name must end with '-backup'"
}
}
variable "db_password" {
type = string
sensitive = true
}
output "backup_info" {
value = {
name = var.backup_name
password = sensitive(var.db_password)
}
sensitive = true
}
What I learned:
- endswith() (and startswith()) validate naming conventions
- sensitive() marks values so they don't appear in logs or console output
- Terraform respects your secrets if you tell it to
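Both checks are trivially testable in the console (example names are mine). Heads up: startswith() and endswith() need Terraform 1.3 or newer.
> endswith("mysql-prod-backup", "-backup")
true
> startswith("mysql-prod-backup", "mysql")
true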
Assignment 10: Cost Calculation (Math That Actually Works)
The Problem: You have monthly costs and credits. You need to calculate actual bills.
variable "monthly_costs" {
default = [120.50, 85.00, 200.75, 50.00]
}
variable "credit" {
default = -50.00 # Negative credit
}
locals {
total_cost = sum(var.monthly_costs)
actual_credit = abs(var.credit)
final_cost = max(local.total_cost - local.actual_credit, 0)
highest_cost = max(var.monthly_costs...)
}
output "cost_summary" {
value = {
total_before_credit = local.total_cost
credit_applied = local.actual_credit
final_bill = local.final_cost
highest_single_cost = local.highest_cost
}
}
What I learned:
- sum() adds all list elements
- abs() converts negatives to positives
- max() finds the highest value (or ensures non-negative results)
- The ... operator unpacks lists
Math in Terraform actually makes sense!
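The console makes the math easy to verify with the same numbers:
> sum([120.50, 85.00, 200.75, 50.00])
456.25
> max([120.50, 85.00, 200.75, 50.00]...)
200.75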
Assignment 11: Timestamp Management (Time Flies)
The Problem: You need timestamps for backup tags and resource naming.
locals {
  current_time = timestamp()

  formatted_tags = {
    CreatedAt  = formatdate("YYYY-MM-DD hh:mm:ss ZZZ", local.current_time)
    BackupDate = formatdate("YYYY-MM-DD", local.current_time)
    Year       = formatdate("YYYY", local.current_time)
  }
}

resource "aws_s3_bucket" "backups" {
  bucket = "backups-${formatdate("YYYY-MM-DD", timestamp())}"
  tags   = local.formatted_tags
}
What I learned:
- timestamp() gets the current UTC time
- formatdate() formats it however you want
- Perfect for backup naming and audit tags
One gotcha worth flagging: timestamp() re-evaluates on every plan, so putting it straight into a resource name (like the bucket above) means Terraform will want to replace that resource on each run. For stable names, pin the value, for example by passing it in as a variable or using lifecycle ignore_changes.
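You also don't need to wait for an apply to test formats. Feed formatdate() a fixed timestamp in the console (this one's made up):
> formatdate("YYYY-MM-DD", "2024-06-01T14:30:00Z")
"2024-06-01"
> formatdate("DD MMM YYYY hh:mm", "2024-06-01T14:30:00Z")
"01 Jun 2024 14:30"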
Assignment 12: File Content Handling (The JSON Master)
This was the final boss—reading JSON configs and storing them in AWS Secrets Manager.
The Problem: You have a config file with database credentials. You need to read it and store it securely.
The Solution:
locals {
  # Check if the file exists
  config_exists = fileexists("${path.module}/config/database.json")

  # Read and parse JSON
  db_config = jsondecode(file("${path.module}/config/database.json"))
}

resource "aws_secretsmanager_secret" "db_credentials" {
  name = "database-credentials"
}

resource "aws_secretsmanager_secret_version" "db_credentials" {
  secret_id     = aws_secretsmanager_secret.db_credentials.id
  secret_string = jsonencode(local.db_config)
}
What I learned:
- fileexists() checks before reading (no crashes!)
- file() reads file contents
- jsondecode() parses JSON into Terraform objects
- jsonencode() converts back to JSON strings
- You can manage external configs elegantly
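One refinement I'd suggest: as written, the config above still crashes if the file is missing, because config_exists is computed but never used. A small sketch (same hypothetical path) that actually uses the guard:
locals {
  config_path = "${path.module}/config/database.json"

  # Fall back to an empty map instead of erroring when the file is missing
  db_config = fileexists(local.config_path) ? jsondecode(file(local.config_path)) : {}
}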
The Function Categories That Matter Most
After working through everything, here's my personal ranking of function categories by usefulness:
Tier 1 (Use Daily):
- String Functions: lower(), upper(), replace(), trim(), split(), join()
- Collection Functions: length(), concat(), merge(), toset()
- Lookup Functions: lookup(), element()
Tier 2 (Use Less Frequently):
- Validation Functions: can(), regex(), contains()
- Type Conversion: tonumber(), tostring(), tolist()
- Numeric Functions: max(), min(), sum()
Tier 3 (Use When Needed):
- File Functions: file(), fileexists(), dirname()
- Date/Time Functions: timestamp(), formatdate()
- Encoding Functions: jsondecode(), jsonencode(), base64encode()
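Honorable mention for merge(), which solves the "combine tags from three sources" problem from the top of this post. Later maps win on key conflicts (hypothetical tags):
> merge({ Team = "platform", Env = "dev" }, { Env = "prod" })
{
  "Env" = "prod"
  "Team" = "platform"
}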
The "Aha!" Moment
The real breakthrough came when I realized: Terraform functions turn static configs into intelligent, adaptive infrastructure.
Before functions, all my configs were rigid from day one. If I needed a change, I edited code. If I needed environment differences, I copied files. It was Infrastructure as Code, but it was dumb code.
After understanding Terraform functions, my configs are becoming smart. They validate inputs, handle edge cases, adapt to environments, and gracefully handle changes. This is Infrastructure as Intelligent Code.
What's Next?
Tomorrow (Day 13), I'm diving into Terraform Data Sources and honestly, after learning functions, this is perfect timing. Data sources let you query existing infrastructure (like "what VPCs do I already have?" or "what's the latest AMI?"), and now that I know how to manipulate and validate data with functions, I can actually do something intelligent with those queries.
Also, I'm starting to see how all these concepts connect:
- Day 9 (Lifecycle): Control how resources change
- Day 10 (Expressions): Make configs adaptive
- Day 11-12 (Functions): Make configs intelligent
- Day 13 (Data Sources): Query and integrate existing infrastructure
This is all building toward something bigger, and I'm here for it.
See ya! (And seriously, open that Terraform console and just... play. You'll thank me.)