Ansible is one of the most powerful automation tools used by DevOps engineers to manage infrastructure, configure servers, and deploy applications at scale. In this blog, I'll walk through how I automated Linux and Windows server setup using Ansible — specifically around SSH configuration, roles, and playbooks — and reduced repetitive manual work significantly.
This project demonstrates how automation can simplify server management while improving consistency and deployment speed across both Linux and Windows environments.
Why I Chose Ansible for This
Before I started using Ansible, server setup meant logging into each machine manually, running the same commands over and over, and hoping nothing was missed. One wrong step and the configuration was inconsistent across environments. Sound familiar?
What drew me to Ansible was its simplicity — no agents to install, no complex setup. Just SSH into Linux, WinRM into Windows, and you're managing your entire fleet from a single control node. That agentless architecture was a game changer for me.
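As a quick sanity check (not part of the original setup steps, just how I'd verify connectivity first), a single ad-hoc command confirms the control node can reach every host:

```
# Ping all Linux hosts over SSH (no agent required)
ansible linux_servers -i inventory/hosts.yml -m ansible.builtin.ping

# Windows hosts use win_ping over WinRM instead
ansible windows_servers -i inventory/hosts.yml -m ansible.windows.win_ping
```

If both return green, the agentless plumbing is working and everything else is just playbooks.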
Project Overview
The goal was straightforward: automate the initial setup of both Linux and Windows servers using a clean, reusable Ansible structure. Here's what I set out to configure:
- SSH hardening and configuration on Linux servers
- WinRM configuration to enable Ansible to talk to Windows servers
- Reusable roles for both OS types
- A master playbook to tie everything together
Setting Up the Project Structure
The first thing I did was organize everything into roles. Roles keep your code clean, reusable, and easy to share across projects. Here's the structure I used:
```
ansible-server-setup/
├── inventory/
│   └── hosts.yml
├── roles/
│   ├── linux_ssh/
│   │   ├── tasks/main.yml
│   │   ├── templates/sshd_config.j2
│   │   ├── handlers/main.yml
│   │   └── defaults/main.yml
│   └── windows_setup/
│       ├── tasks/main.yml
│       └── defaults/main.yml
├── playbooks/
│   ├── linux_setup.yml
│   └── windows_setup.yml
└── site.yml
```
Keeping Linux and Windows roles separate made everything much easier to maintain and debug independently.
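The root site.yml just ties the two playbooks together — a minimal sketch, assuming the playbook paths shown in the structure above:

```yaml
# site.yml — run the full setup for both OS families in one shot
- import_playbook: playbooks/linux_setup.yml
- import_playbook: playbooks/windows_setup.yml
```

This way you can run everything at once, or target just one playbook when you only touched one side.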
The Inventory
I defined both Linux and Windows hosts in a single YAML inventory, with the right connection settings for each:
```yaml
all:
  children:
    linux_servers:
      hosts:
        linux-01:
          ansible_host: 10.0.1.10
          ansible_user: ec2-user
          ansible_ssh_private_key_file: ~/.ssh/id_rsa
    windows_servers:
      hosts:
        win-01:
          ansible_host: 10.0.2.10
          ansible_user: Administrator
          ansible_password: "{{ vault_win_password }}"
          ansible_connection: winrm
          ansible_winrm_transport: ntlm
          ansible_port: 5985
```
Notice I vaulted the Windows password using ansible-vault — never hardcode credentials in plain text. Learned that lesson early!
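For reference, this is roughly how I'd produce that vaulted variable (the password below is a placeholder, and exact prompts vary by Ansible version):

```
# Encrypt the Windows password inline, then paste the output into your vars file
ansible-vault encrypt_string 'SuperSecret123' --name 'vault_win_password'

# Supply the vault password when you run the playbook
ansible-playbook -i inventory/hosts.yml site.yml --ask-vault-pass
```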
SSH Hardening Role for Linux
This was the core of my Linux setup. The linux_ssh role handles SSH daemon configuration to make servers secure and consistent from day one.
roles/linux_ssh/defaults/main.yml

```yaml
ssh_port: 22
permit_root_login: "no"
password_authentication: "no"
max_auth_tries: 3
```
roles/linux_ssh/templates/sshd_config.j2

```
Port {{ ssh_port }}
PermitRootLogin {{ permit_root_login }}
PasswordAuthentication {{ password_authentication }}
MaxAuthTries {{ max_auth_tries }}
PubkeyAuthentication yes
AuthorizedKeysFile .ssh/authorized_keys
```
roles/linux_ssh/tasks/main.yml

```yaml
- name: Deploy SSH configuration
  ansible.builtin.template:
    src: sshd_config.j2
    dest: /etc/ssh/sshd_config
    owner: root
    group: root
    mode: "0600"
  notify: Restart SSH

- name: Ensure SSH service is running and enabled
  ansible.builtin.service:
    name: sshd
    state: started
    enabled: true
```
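One extra safeguard I'd recommend (my own addition, not strictly part of the role as shown): the template module's validate parameter runs a syntax check against the candidate file before it replaces the live config, so a broken template can't lock you out of the server:

```yaml
- name: Deploy SSH configuration (with syntax check)
  ansible.builtin.template:
    src: sshd_config.j2
    dest: /etc/ssh/sshd_config
    owner: root
    group: root
    mode: "0600"
    validate: /usr/sbin/sshd -t -f %s
  notify: Restart SSH
```

Ansible substitutes the temporary file path for %s, and the copy only proceeds if sshd -t exits cleanly.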
roles/linux_ssh/handlers/main.yml

```yaml
- name: Restart SSH
  ansible.builtin.service:
    name: sshd
    state: restarted
```
The handler ensures SSH only restarts when the config actually changes — not on every run. That small detail matters a lot in production.
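The Windows side deserves a quick look too. The module names below are real, but the specific tasks are my illustrative sketch of what windows_setup/tasks/main.yml might contain, not the original role:

```yaml
# roles/windows_setup/tasks/main.yml (illustrative sketch)
- name: Verify WinRM connectivity
  ansible.windows.win_ping:

- name: Ensure the WinRM service starts automatically
  ansible.windows.win_service:
    name: WinRM
    start_mode: auto
    state: started
```

Note that WinRM itself has to be enabled on the Windows host before Ansible can connect at all — that initial bootstrap happens outside Ansible.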