What do you use to replicate your dev environment?

Ben Hilburn ・ 1 min read

There are a variety of tools and frameworks available for replicating your development environment & personal configurations across machines and instances. There's also the manual method of hand-installing everything and perhaps keeping your dotfiles in a git repository that you pull and symlink from.

So, what is your personal method for replicating your development environment? Is it cross-platform? Is it limited to specific languages? I'm interested to hear what solutions folks have liked best.

(And no, only having your DE on one machine and sshfs-mounting remote filesystems for local editing doesn't count ;) )



For years, I used a messy MAMP setup. All my projects were several sub-directories down from the root htdocs folder. A few months ago, I took the time to clean it all up. Now I have Docker containers for all my web-based projects. Though the learning curve was steep, I now have all these projects in their own little isolated sandboxes. Previously, messing something up on my development machine could have some painful consequences (regular backups are a lifesaver!). Now, if I muck something up in a container or image, I can just blast it away and fire up a new one.

I have a few custom images that I use for WordPress development among other things. I keep them in a private Git repository, so that I always have access regardless of the dev machine that I'm on.

I actually wrote about my experience setting up a Docker container just for running WordPress tests with PHPUnit, here on dev.to.

I'm interested to see how many other folks in the community have a Dockerized workspace.


Could you tell me more about your setup and how you learned Docker? I currently have a messy MAMP setup myself.


Sure! Anything specific you'd like to know?

My setup used to look like this:

        other/ # Here I'd have anything that just needed to be on a web server.
            forest-theme/ # Here's a sub-installation of WordPress for my portfolio theme.
            # etc.
            plugins/ # Build/deployment scripts sat in the root of the plugins folder.
            # etc.
            error.log # Yep...one PHP error log for all my sites. o.0
~/Documents/Programming/ # Here I'd have everything else that wasn't in need of a web server.

Now, it looks like this:

    # Projects are organized by language/platform. I omitted most of it. There's a lot.
        _dockerc-repos/ # Contains reusable custom Docker Compose files.
                docker-compose.yml # Just a simple Apache server here.
            # etc.
                docker-compose.yml # Custom WordPress setup with PHP, WP-CLI, and MySQL.
                logs/         # All these volumes are mounted to the Docker container,
                mysql/        # to maintain state.
                plugins/
                themes/
                wp-admin/
                wp-includes/
                wp-config.php
                # etc.
            # etc.
    Wordpress/ # Only contains SVN repositories for my plugins on the WordPress plugin directory.

I learned via trial and error. Installing Docker itself is straightforward. I think the most confusing part was understanding the capabilities of Docker containers. That and the commands. Quite a few tutorials that I found used the docker run command, which requires manually entering all the arguments every time you want to start a container. Once I found Docker Compose, it was much easier.

Docker Compose lets you set all your configuration ahead of time in a docker-compose.yml file (the documentation for Docker Compose isn't bad either). Then, all you have to do is run docker-compose up to fire up a container and docker-compose down to bring it down.
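To make that concrete, here is a minimal sketch of what such a file could look like for the "simple Apache server" case mentioned above; the service name, image, ports, and paths are illustrative assumptions, not the commenter's actual file:

```yaml
# Hypothetical docker-compose.yml for a plain Apache server.
# Start with `docker-compose up -d`, stop with `docker-compose down`.
version: "3"
services:
  web:
    image: httpd:2.4        # official Apache image from Docker Hub
    ports:
      - "8080:80"           # browse the site at http://localhost:8080
    volumes:
      - ./htdocs:/usr/local/apache2/htdocs  # serve files from the project directory
```

With a file like this checked into each project, the per-project configuration lives next to the code instead of in your shell history.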

Sweet setup! I'm still new to WordPress, actually. Do you have a Docker container for each WordPress project, or are they all in the same one, with maybe one container per type of project?

If you have one per WordPress project, that should take up a lot of disk space, right?


Yes, I have a Docker container for every project. My Docker workspace (including everything) is 3.2G, and my WordPress workspace takes up 2.2G of that. So, it doesn't take up much space.

That's actually not that bad, I'll give it a shot. Thanks!

Quick question if you don't mind: do you use a particular tool when you mount your directories into the container, or do you just use the -v flag?

I use docker-compose, which allows you to set up Docker container configurations in a docker-compose.yml file. Volume mounting is described in the docker-compose file documentation.

It's an alternative to using the base docker command with -v every time. I don't like having to remember what options I used for a custom Docker container, so for me, slapping it all into a config file and running docker-compose up or docker-compose down is a huge benefit.
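For comparison, a hedged sketch of the same bind mount expressed both ways; the image and paths here are made up for illustration:

```yaml
# Roughly equivalent to:
#   docker run -v "$PWD/themes:/var/www/html/wp-content/themes" \
#              -v "$PWD/plugins:/var/www/html/wp-content/plugins" -p 8000:80 wordpress
version: "3"
services:
  wordpress:
    image: wordpress:latest
    ports:
      - "8000:80"
    volumes:
      - ./themes:/var/www/html/wp-content/themes    # host path : container path
      - ./plugins:/var/www/html/wp-content/plugins
```

The compose file records every option, so there is nothing to remember or retype.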


To make my personal configurations easy to install across machines, I create a simple bash script that I can execute directly from GitHub.
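For illustration only (the repo URL, file list, and variable names below are assumptions, not the commenter's actual script), such a bootstrap script might look like this:

```shell
#!/usr/bin/env bash
# Hypothetical setup.sh kept at the top of a dotfiles repo; runnable with
#   curl -fsSL https://raw.githubusercontent.com/<user>/dotfiles/master/setup.sh | bash
set -u

DOTFILES_DIR="${DOTFILES_DIR:-$HOME/.dotfiles}"   # where the repo is cloned
TARGET_DIR="${TARGET_DIR:-$HOME}"                 # where the symlinks go

link_dotfiles() {
  local f
  for f in .bashrc .vimrc .gitconfig; do
    [ -e "$DOTFILES_DIR/$f" ] || continue         # skip files the repo doesn't have
    if [ -e "$TARGET_DIR/$f" ] && [ ! -L "$TARGET_DIR/$f" ]; then
      mv "$TARGET_DIR/$f" "$TARGET_DIR/$f.bak"    # back up a real file before replacing it
    fi
    ln -sf "$DOTFILES_DIR/$f" "$TARGET_DIR/$f"    # symlink repo copy into place
  done
}

link_dotfiles
```

Because the script only symlinks files that exist in the repo, it is safe to re-run after pulling updates.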

To replicate my development environment, I use Docker and/or Vagrant.


Isn't Vagrant software that specializes in exactly this kind of task?


Sometimes you need to run a configuration script related to the app, not the environment (for example, running composer install for a PHP project or npm install for a Node one), so you put these scripts in a bash file to run after provisioning.
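A minimal sketch of that split in a Vagrantfile, assuming hypothetical bootstrap.sh and app-setup.sh scripts in the project root:

```ruby
# Hypothetical Vagrantfile: the environment comes from the box plus a
# provisioning script, while app-specific steps (composer install,
# npm install, ...) live in a second script that runs afterwards.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/xenial64"
  config.vm.provision "shell", path: "bootstrap.sh"   # environment setup (PHP, Node, ...)
  config.vm.provision "shell", path: "app-setup.sh",  # e.g. composer install / npm install
    privileged: false                                 # run as the vagrant user, not root
end
```

Provisioners run in order, so the app script can rely on the environment already being in place.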


When you go Docker, you never look back.
For example: a local MongoDB plus mongo-express (a kind of phpMyAdmin for Mongo).
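Assuming the commenter means the mongo-express admin UI, a compose file for that pairing might look roughly like this (service names, versions, and paths are illustrative):

```yaml
# Hypothetical docker-compose.yml: MongoDB plus the mongo-express web UI.
version: "3"
services:
  mongo:
    image: mongo:4
    volumes:
      - ./data:/data/db            # keep the database files on the host
  mongo-express:
    image: mongo-express
    ports:
      - "8081:8081"                # UI at http://localhost:8081
    environment:
      ME_CONFIG_MONGODB_SERVER: mongo   # reach Mongo by its service name
    depends_on:
      - mongo
```

One `docker-compose up` brings up both the database and its admin interface together.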

As for config files, it depends: SCP or Gists.

  1. Embrace defaults, minimize customization
  2. Keep everything updated
  3. Use web-based services when possible
  4. Use subscriptions when possible

By following those guidelines, when it comes time to get up and running on a new machine, I install the latest version of a few pieces of software. Anything else the project needs, it installs itself through packages when it builds for the first time.

I can pick up and continue on any of my 3 workstations (work, home, mobile), as everything I work on, not just code, is either cloud-based or checked in to a repository available online.


I use VirtualBox running on Windows and have just started experimenting with Linux as the host operating system (so I can run non-Intel architectures like ARM). I divide the virtual machines into three virtual disks: one 32GB for the main OS and tools, one 16GB swap disk, and one (currently) 190GB /home filesystem that I share between various flavors/versions of Linux (Fedora and Ubuntu, mostly). I keep my home directory environment files (.bash_login, .bashrc, etc.) in git repositories on Bitbucket (free private repositories) and GitLab (the best public repo site for flexibility in structuring your project git repositories). Every once in a while I clone my /home filesystem, mostly as a backup of my local clones of Bitbucket/GitLab project repositories.

Haven't tried Docker yet, but it looks interesting, especially for setting up test environments.


Try Docker; you'll see it's powerful, and you can change environments depending on the project you're working on.

I'm working on a PHP 5.4 project and a Symfony Flex one that requires 7.2, and there's no problem switching between them with Docker. But changing PHP versions in WAMP or XAMPP isn't easy, and stopping one piece of software to launch another is tedious.


I need to reinstall my work laptop with Ubuntu 16.04, so I'm playing with Ansible and Vagrant at the minute.
I'm looking at my current setup and trying to put it into a playbook, which I use to provision a VM with Vagrant for testing.
I plan to run the playbook on my freshly installed laptop.
Dotfiles are stored in github.
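A rough sketch of what such a playbook could contain (package names and the repo URL are placeholders, not the commenter's actual playbook):

```yaml
# Hypothetical playbook.yml: install everyday tools and fetch dotfiles.
- hosts: all
  become: yes
  tasks:
    - name: Install development packages
      apt:
        name: [git, vim, tmux, build-essential]
        state: present
    - name: Clone dotfiles from GitHub
      become: no                      # clone as the regular user, not root
      git:
        repo: https://github.com/<user>/dotfiles.git
        dest: ~/dotfiles
```

The same playbook can target the Vagrant VM for testing and the freshly installed laptop later, which is exactly the appeal of this approach.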

  1. I have a Mac, so all my tools and software are installed using Homebrew and cask (for GUIs). I back up the software using brew list >> backup_file.txt and brew cask list >> backup_cask_file.txt - to set up again I just need a fat Internet pipe. Last time I did it, it took 2 hours on a 1 Gbps connection in San Francisco.
  2. I keep all my files in Dropbox (personal), Box (work), OneDrive (backup) and Google Drive (legacy)
  3. I do not have a specific IDE setup - I tend to use the defaults...
  4. All code is committed to GitHub (open source), Bitbucket (work), GitLab (personal) as may be needed

I work in Java/J2EE, Python, Android, PHP (for fun)
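The backup/restore flow from point 1 above can be sketched as a pair of shell functions; the file names follow the comment, and `brew cask` matches the Homebrew of the time (newer Homebrew uses `brew list --cask` / `brew install --cask`):

```shell
#!/usr/bin/env bash
# Hypothetical helpers for the Homebrew backup/restore described above.

backup_brew() {
  brew list > backup_file.txt            # record installed formulae
  brew cask list > backup_cask_file.txt  # record installed GUI apps (casks)
}

restore_brew() {
  # On a fresh machine, reinstall everything from the saved lists.
  xargs brew install < backup_file.txt
  xargs brew cask install < backup_cask_file.txt
}
```

Keeping the two text files in a synced folder (Dropbox, a gist, etc.) makes the restore step a one-liner on the new machine.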


I used Vagrant before and Docker now; both are good, but I prefer Vagrant.


If you prefer Vagrant, why are you now using Docker?


Because at my work we moved to Docker, that's why.


Honest question: why would you want to be able to do this, except for backup/restore purposes?


Sure! A few reasons:

  1. It's a pain to re-set up your dev environment every time you need to work on a new system. For many, this isn't a problem, as you are doing everything locally or on one centralized system. Not everyone has the luxury of this workflow, though. It gets even more complicated if you work on many different OSes.
  2. As you said, backup & restore purposes - but this doesn't just mean your HDD crashing, for example. Occasionally I'll update all of my Vim plugins and suddenly everything is broken. Instead of stepping through everything, it would be very handy to be able to treat an update of the entire dev environment as an atomic action that I can roll back.
  3. Lastly, it's much easier to understand the software dependencies of your dev environment if you have it all managed / contained with something.

Working in C++ on Mac and Windows, I use Visual Studio or Xcode and SourceTree as git client. All with default settings, so getting a new system set up takes very little time.