
Terraform Meets Ansible: Automating Multi-Environment Infrastructure on AWS

🚀 Introduction

Welcome Devs, to the world of Cloud and Code ☁️💻

Today, I’ve got something really exciting for you. We’re going to integrate Terraform with Ansible to showcase the true power of Infrastructure as Code (IaC) and Configuration Management — all fully automated with a multi-environment setup.

This setup will give you a real-world glimpse of how modern DevOps projects operate with environments like Dev, Stage, and Prod, and how tools like Terraform and Ansible work together — one handling infrastructure provisioning, and the other managing configuration.

So without further ado, let’s dive in and start building 🚀


⚙️ Pre-Requisites

Before we jump into the setup, make sure you have the following requirements ready on your system 👇

  • AWS CLI installed and configured with an IAM user that has full access to EC2 and VPC.

    🧭 If you’re new to this part — don’t worry! I’ve already covered it in one of my earlier blogs:

    👉 Getting Started with Terraform – A Beginner’s Guide

    (Follow Step 1 to 3 — it’ll help you install AWS CLI, configure your IAM user, and set up Terraform CLI as well.)

  • Python and Ansible installed on your system.

    📘 You can check out Ansible’s official installation guide here:

    👉 Install Ansible with pip

Once that’s all set up, we’re good to go and start building our project 🚀


🚀 Getting Started

Alright Devs, now that we’ve set up the basics, let’s dive into the real deal.

This project is all about provisioning infrastructure for a real-world multi-environment setup and then configuring it using Ansible — just like it’s done in production systems.

You can find the complete project code here:

👉 Terra-Projects Repository

The directory for this particular setup is terra-ansible-starter.

Now, let’s break down how the project actually works 👇

🧱 1. terra-config Directory

This directory contains the following files:

  • provider.tf

  • variable.tf

  • output.tf

  • main.tf

Here’s what happens inside:

We’re creating a key pair named tester-key, and a security group with three rules allowing:

  • Inbound traffic on ports 22 (SSH) and 80 (HTTP) from anywhere

  • All outbound traffic

Then, using a loop (Terraform’s count or for_each), we spin up six EC2 instances, two for each environment:

  • dev

  • stage

  • prod

The output.tf file gives us the public IPs of all these instances, neatly categorized per environment.
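To make that concrete, here’s a rough sketch of what such a main.tf (and its output) can look like. The resource names, the AMI variable, and the grouping logic are illustrative assumptions; the real configuration lives in the repository linked above.

```hcl
# Illustrative sketch only. Names, the AMI variable, and the grouping
# are assumptions; the actual code is in the terra-ansible-starter repo.

locals {
  environments = ["dev", "stage", "prod"]

  # Two instances per environment -> six map keys like "dev-1", "prod-2"
  instances = {
    for pair in setproduct(local.environments, ["1", "2"]) :
    "${pair[0]}-${pair[1]}" => pair[0]
  }
}

resource "aws_key_pair" "tester" {
  key_name   = "tester-key"
  public_key = file("~/.ssh/appKey.pub")
}

resource "aws_security_group" "web" {
  name = "terra-ansible-web"

  ingress {
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  ingress {
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}

resource "aws_instance" "web" {
  for_each = local.instances

  ami                    = var.ami_id # assumed to be declared in variable.tf
  instance_type          = "t2.micro"
  key_name               = aws_key_pair.tester.key_name
  vpc_security_group_ids = [aws_security_group.web.id]

  tags = {
    Name = each.key   # e.g. "dev-1"
    Env  = each.value # e.g. "dev"
  }
}

# output.tf groups the public IPs per environment, roughly like this:
output "public_ips" {
  value = {
    for env in local.environments :
    env => [for key, inst in aws_instance.web : inst.public_ip if local.instances[key] == env]
  }
}
```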

🧮 2. scripts Directory

Inside this folder, we have a Python script named generate_inv.py.

Here’s what it does:

  • It reads the output.json file (generated by running the terraform output command).

  • Then, it dynamically creates a hosts.ini file inside the Ansible inventory directory.

This makes the integration between Terraform and Ansible completely automated — no manual IP editing required!
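Here’s a minimal sketch of that idea, assuming output.json maps each environment to a list of public IPs (the real generate_inv.py in the repo may differ in the details):

```python
#!/usr/bin/env python3
"""Sketch of a Terraform-output -> Ansible-inventory converter.

Assumes output.json looks like {"dev": ["1.2.3.4", ...], "stage": [...],
"prod": [...]}; the paths and SSH user below are illustrative assumptions.
"""
import json
from pathlib import Path

OUTPUT_JSON = Path("terra-config/output.json")
INVENTORY = Path("ansible/inventory/hosts.ini")
SSH_KEY = Path.home() / ".ssh" / "appKey"


def main() -> None:
    ips_per_env = json.loads(OUTPUT_JSON.read_text())

    lines = []
    for env, ips in ips_per_env.items():
        lines.append(f"[{env}]")   # one inventory group per environment
        lines.extend(ips)
        lines.append("")           # blank line between groups

    # Connection settings shared by every host
    lines += [
        "[all:vars]",
        "ansible_user=ubuntu",  # assumption: Ubuntu AMI
        f"ansible_ssh_private_key_file={SSH_KEY}",
        "ansible_ssh_common_args='-o StrictHostKeyChecking=no'",
    ]

    INVENTORY.parent.mkdir(parents=True, exist_ok=True)
    INVENTORY.write_text("\n".join(lines) + "\n")
    print(f"Wrote {INVENTORY}")


if __name__ == "__main__":
    main()
```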

⚙️ 3. ansible Directory

Here lies our configuration magic:

  • Inside this folder, there’s a playbook.yml file which defines a role called webserver.

  • The roles/webserver/tasks/main.yml file includes all configuration steps for the servers:

    • Install NGINX
    • Copy the index-{env}.html file (specific to each environment)
    • Restart the NGINX server

There’s also a files directory that contains a separate index-{env}.html file for each environment (dev, stage, prod).

The inventory folder holds the hosts.ini file, which specifies the public IPs of instances — and yes, it’s dynamically created using our Python script.
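As a rough illustration of those three configuration steps, the tasks file can look something like this. The module choices assume a Debian/Ubuntu AMI, and the env variable is assumed to be set per inventory group; check the repo for the real version.

```yaml
# Sketch of roles/webserver/tasks/main.yml. Assumes a Debian/Ubuntu AMI
# and an `env` variable (dev/stage/prod) defined for each host group.
---
- name: Install NGINX
  apt:
    name: nginx
    state: present
    update_cache: yes
  become: yes

- name: Copy the environment-specific index page
  copy:
    src: "index-{{ env }}.html"   # picked up from roles/webserver/files/
    dest: /var/www/html/index.html
  become: yes

- name: Restart NGINX
  service:
    name: nginx
    state: restarted
  become: yes
```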


🔗 Connecting It All Together

That was an eagle-eye view of the project. Now let’s zoom in and see how everything connects together.

At the heart of this automation lies the deploy.sh script — the glue that ties Terraform and Ansible into one seamless workflow.

Here’s what happens step by step 👇

🧩 Step 1: Provision Infrastructure with Terraform

First, the script navigates inside the terra-config directory and runs:

terraform init
terraform apply -auto-approve

This initializes Terraform and provisions the infrastructure across all three environments — Dev, Stage, and Prod.

Once the resources are created, it generates an output.json file that stores the public IPs of every instance in JSON format — categorized neatly by environment.
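The exact shape of that file depends on how output.tf is written, but with the per-environment grouping described earlier it might look roughly like this (the IPs are placeholders):

```json
{
  "dev":   ["3.91.0.10", "54.210.0.11"],
  "stage": ["18.205.0.12", "34.201.0.13"],
  "prod":  ["52.90.0.14", "3.84.0.15"]
}
```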

🐍 Step 2: Generate Dynamic Inventory with Python

Next, the script moves back to the main project directory and executes the generate_inv.py script.

This Python script:

  • Reads the output.json file created by Terraform

  • Formats it properly to generate an Ansible hosts.ini file

  • Appends other essential details, such as the SSH key path used for authentication

A quick sleep 15 gives the EC2 instances time to finish their status checks before Ansible jumps in for configuration.
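Once the script has done its job, the generated hosts.ini might come out roughly like this (placeholder IPs; the SSH user depends on the AMI you use):

```ini
[dev]
3.91.0.10
54.210.0.11

[stage]
18.205.0.12
34.201.0.13

[prod]
52.90.0.14
3.84.0.15

[all:vars]
ansible_user=ubuntu
ansible_ssh_private_key_file=/home/you/.ssh/appKey
ansible_ssh_common_args='-o StrictHostKeyChecking=no'
```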

⚙️ Step 3: Configure Instances with Ansible

Finally, once our hosts.ini file is ready, the script triggers the Ansible playbook command:

ansible-playbook playbook.yml

This command configures all six EC2 instances automatically — each according to its environment.

Every environment (Dev, Stage, Prod) gets its own custom index.html file served through NGINX.
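Putting the three steps together, deploy.sh is essentially a short wrapper along these lines (a sketch under the assumptions above; the script in the repo is the source of truth):

```bash
#!/usr/bin/env bash
# Sketch of deploy.sh end to end. Paths, flags, and the output file name
# are assumptions; the real script lives in the repository.
set -euo pipefail

# 1. Provision the infrastructure
cd terra-config
terraform init
terraform apply -auto-approve
terraform output -json public_ips > output.json   # assumed output name
cd ..

# 2. Build the Ansible inventory from Terraform's output
python3 scripts/generate_inv.py
sleep 15   # give the new instances time to pass their status checks

# 3. Configure all six servers
cd ansible
ansible-playbook playbook.yml   # inventory assumed to be wired up via ansible.cfg or -i
```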


🧠 Enough Talk — Let’s Get Our Hands Dirty!

Alright, enough with the theory — it’s time to get practical and actually see this automation in action 🔥

Before we launch the setup, we need an SSH key that Ansible will use to authenticate into our EC2 instances.

Run the following command to generate one:

ssh-keygen -t rsa -b 4096 -f ~/.ssh/appKey

This command creates a secure SSH key pair named appKey inside your ~/.ssh directory.

We’ll use this private key to SSH into all six EC2 instances, two for each environment (Dev, Stage, and Prod).

🚀 Let’s Deploy Everything

Now, since we’re DevOps engineers (and we hate doing things manually 😎), we’ll use the deploy.sh script to automate everything — from provisioning infrastructure to configuring servers.

Make sure the script has execution permissions, then run it:

chmod u+x deploy.sh
./deploy.sh

Sit back, grab a coffee ☕, and watch the logs as the magic unfolds.

Terraform will create your instances, generate the output file, the Python script will prepare your inventory, and Ansible will jump in to configure your NGINX servers — all in one go.

🌐 Test the Setup

Once the process completes, visit the public IPs of your EC2 instances in your browser.

You’ll see different index.html pages for each environment — Dev, Stage, and Prod — each showcasing the environment name and design differences.
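If you prefer the terminal, a quick curl against any of the public IPs from output.json does the same job:

```bash
curl http://3.91.0.10/   # placeholder IP; use one of your instance addresses
```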

🧹 Clean Up

Once you’re done testing, don’t forget to destroy your infrastructure — AWS isn’t running on free coffee beans 😅

Run the following command to safely tear everything down:

cd terra-ansible-starter/terra-config
terraform destroy --auto-approve

This will clean up all EC2 instances, security groups, and key pairs — saving you from unwanted AWS charges 💸


🎯 Conclusion

And that’s a wrap, folks! 🥳

We just walked through a complete automation pipeline where Terraform handled infrastructure provisioning, and Ansible took care of server configuration — all in a multi-environment setup.

This hands-on project gives you a solid understanding of how real-world DevOps workflows look when code, automation, and infrastructure come together.

By combining these two tools — Terraform and Ansible — you’ve essentially built a foundation for scalable, environment-aware deployments.

Whether it’s deploying static sites, configuring app servers, or scaling microservices, the same workflow logic applies — automate, version-control, and manage everything as code.

If you followed along till here, you’ve not just learned two tools —

you’ve built the mindset of a DevOps engineer who thinks in systems and automates for efficiency ⚙️

Keep exploring, keep experimenting — and as always, build, break, and learn 💪


🌐 Connect with Me

If you enjoyed this guide, share it with your DevOps buddies and stay tuned for more such projects!

You can also find me sharing tech content, tutorials, and behind-the-scenes DevOps experiments here 👇
