For years, I used a messy MAMP setup. All my projects were several sub-directories down from the root htdocs folder. A few months ago, I took the time to clean it all up. Now I have Docker containers for all my web-based projects. Though the learning curve was steep, I now have all these projects in their own little isolated sandboxes. Previously, messing something up on my development machine could have some painful consequences (regular backups are a life saver!). Now, if I muck something up in a container or image, I can just blast it away and fire-up a new one.
I have a few custom images that I use for WordPress development among other things. I keep them in a private Git repository, so that I always have access regardless of the dev machine that I'm on.
I actually wrote about my experience setting up a Docker container just for running WordPress tests with PHPUnit, here on dev.to.
I'm interested to see how many other folks in the community have a Dockerized workspace.
Could you tell me more about your setup and how you learned Docker? I currently have a messy MAMP setup myself.
Sure! Anything specific you'd like to know?
My setup used to look like this:
```
other/                    # Here I'd have anything that just needed to be on a web server.
forest-theme/             # Here's a sub-installation of WordPress for my portfolio theme.
plugins/                  # Build/deployment scripts sat in the root of the plugins folder.
error.log                 # Yep...one PHP error log for all my sites. o.0
~/Documents/Programming/  # Here I'd have everything else that wasn't in need of a web server.
```
Now, it looks like this:
```
# Projects are organized by language/platform. I omitted most of it. There's a lot.
_dockerc-repos/        # Contains reusable custom Docker Compose files.
  docker-compose.yml   # Just a simple Apache server here.
  docker-compose.yml   # Custom WordPress setup with PHP, WP-CLI, and MySQL.
logs/                  # All these volumes are mounted to the Docker container
mysql/                 # to maintain state.
WordPress/             # Only contains SVN repositories for my plugins on the WordPress plugin directory.
```
I learned via trial and error. Installing Docker itself is straightforward. I think the most confusing part was understanding the capabilities of Docker containers. That, and the commands. Quite a few tutorials that I found used the docker run command, which requires manually entering all the arguments every time you want to start a container. Once I found Docker Compose, it was much easier.
Docker compose lets you set all your configurations ahead of time in a docker-compose.yml file (the documentation for Docker Compose isn't bad either). Then, all you have to do is run the docker-compose up command to fire up a container and docker-compose down to bring it down.
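To give a concrete idea, a WordPress compose file can look something like this — service names, image tags, and credentials here are illustrative, not the author's actual setup:

```yaml
# Minimal WordPress + MySQL stack (sketch; credentials are placeholders).
version: "3"
services:
  wordpress:
    image: wordpress:latest
    ports:
      - "8080:80"              # site served at http://localhost:8080
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_USER: wp
      WORDPRESS_DB_PASSWORD: secret
    depends_on:
      - db
  db:
    image: mysql:5.7
    environment:
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wp
      MYSQL_PASSWORD: secret
      MYSQL_ROOT_PASSWORD: root
```

With that file in the project folder, `docker-compose up -d` starts the whole stack and `docker-compose down` tears it down again.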
Sweet setup! I'm still new to WordPress, actually. Do you have a Docker container for each WordPress project, or are they all in the same one, with maybe one container per type of project?
If you have one per WordPress project, doesn't that take up a lot of disk space?
Yes, I have a Docker container for every project. My Docker workspace (including everything) is 3.2G, and my WordPress workspace takes up 2.2G of that. So, it doesn't take up much space.
That's actually not that bad, I'll give it a shot. Thanks!
Quick question if you don't mind: do you use a particular tool when you mount your directories into the container, or do you just use the -v flag?
I use docker-compose which allows you to set up docker container configurations in a docker-compose.yml file. Volume mounting is described in the docker-compose file documentation.
It's an alternative to using the base docker command and using -v every time. I don't like having to remember what options I used for a custom docker container, so for me, slapping it into a config file and running docker-compose up or docker-compose down is a huge benefit.
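For reference, a `volumes:` entry in the compose file plays the same role as `-v` on the command line. A sketch (host paths are illustrative):

```yaml
# docker-compose.yml fragment -- each line is equivalent to a `docker run -v` flag.
services:
  wordpress:
    image: wordpress:latest
    volumes:
      - ./wp-content:/var/www/html/wp-content  # same as: docker run -v "$PWD/wp-content:/var/www/html/wp-content"
  db:
    image: mysql:5.7
    volumes:
      - ./mysql:/var/lib/mysql                 # keeps database state across `docker-compose down`
```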
To make my personal configurations easy to install across machines, I create a simple bash script that I can execute directly from GitHub.
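A minimal sketch of what such a bootstrap script can do — the repo layout and file names below are hypothetical, and the demo runs against a throwaway directory rather than a real checkout:

```shell
#!/bin/sh
# Sketch of a dotfiles bootstrap: symlink every dotfile from a repo checkout
# into a home directory. Paths below are demo values, not the author's layout.
set -e

link_dotfiles() {
  repo=$1; home=$2
  for f in "$repo"/.[!.]*; do            # every dotfile in the repo
    ln -sf "$f" "$home/$(basename "$f")" # -f so re-running is safe
  done
}

# Demo in a temp directory (a real script would first clone the repo from GitHub).
demo=$(mktemp -d)
mkdir -p "$demo/home" "$demo/dotfiles"
printf 'export EDITOR=vim\n' > "$demo/dotfiles/.bashrc"
link_dotfiles "$demo/dotfiles" "$demo/home"
ls -l "$demo/home/.bashrc"
```

In practice you'd pipe the script from your repo's raw URL into `sh`, or clone the repo and run it locally.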
To replicate my development environment, I use Docker and/or Vagrant.
Isn't Vagrant a piece of software specializing in exactly this kind of task?
Sometimes you need to run a configuration script related to the app rather than the environment (for example, running composer install for a PHP project or npm install for a Node project), so you put those commands in a bash file to run after provisioning.
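In Vagrant terms, that can be sketched with a second provisioner — the box name and script paths here are illustrative:

```ruby
# Vagrantfile sketch: app-level setup runs after environment provisioning.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/xenial64"

  # Environment: install PHP, Node, and so on.
  config.vm.provision "shell", path: "provision.sh"

  # App: dependencies that belong to the project, not the box.
  config.vm.provision "shell", privileged: false,
    inline: "cd /vagrant && composer install && npm install"
end
```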
yes, it is :)
I use VirtualBox running on Windows and have just started experimenting with Linux as the host operating system (so I can run non-Intel architectures like ARM). I divide the virtual machines into three virtual disks: one 32GB for the main os and tools, one 16GB swap disk, and one (currently) 190GB /home file system that I share between various flavors/versions of Linux (Fedora and Ubuntu, mostly). I keep my home directory environment files (.bash_login, .bashrc, etc.) in git repositories on Bitbucket (free private repositories) and Gitlab (best public repo site for flexibility in structuring your project git repositories). Every once in a while I clone my /home filesystem mostly as a backup of my local clones of Bitbucket/Gitlab project repositories.
Haven't tried Docker yet, but it looks interesting, especially for setting up test environments.
Give Docker a try — you'll see it's powerful, and you can change environments depending on the project you're working on.
I'm working on a PHP 5.4 project and a Symfony Flex project that requires 7.2, and there's no problem switching between them with Docker. Changing PHP versions in WAMP or XAMPP isn't easy, and stopping one piece of software just to launch another is tedious.
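Pinning a PHP version per project is just a matter of the image tag in each project's compose file — a sketch, with illustrative ports and paths (very old tags like 5.4 may no longer be available on Docker Hub):

```yaml
# legacy-project/docker-compose.yml
services:
  app:
    image: php:5.6-apache    # pick the tag the project needs
    ports: ["8056:80"]
    volumes: ["./src:/var/www/html"]
---
# flex-project/docker-compose.yml
services:
  app:
    image: php:7.2-apache
    ports: ["8072:80"]
    volumes: ["./src:/var/www/html"]
```

Both stacks can even run side by side, since each binds a different host port.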
I need to reinstall my work laptop to Ubuntu 16.04 so I'm playing with Ansible and vagrant at the minute.
I'm looking at my current setup and trying to put it into a playbook which I use to provision a vm using vagrant for testing.
I plan to run the playbook on my freshly installed laptop.
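A playbook for that kind of workstation setup can start as small as this — the package list and repo URL are illustrative placeholders, not the author's actual playbook:

```yaml
# playbook.yml sketch -- provision a fresh Ubuntu workstation.
- hosts: all
  become: true
  tasks:
    - name: Install everyday tools
      apt:
        name: [git, curl, vim, docker.io]
        state: present

    - name: Clone dotfiles
      git:
        repo: "https://github.com/<user>/dotfiles.git"   # hypothetical repo
        dest: "~/dotfiles"
      become: false
```

The same playbook then runs unchanged against the Vagrant test VM and the real laptop.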
Dotfiles are stored in github.
By following those guidelines, when it comes time to get up and running on a new machine, I only install the latest versions of a few pieces of software. Anything else a project needs, it installs itself through packages the first time it builds.
I can pick up and continue on any of my 3 workstations (work, home, mobile), as everything I work on — not just code — is either cloud-based or checked in to a repository available online.
Once you go Docker, you never look back.
Example: local MongoDB + mongo-express (a kind of phpMyAdmin for Mongo).
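A sketch of such a stack, assuming mongo-express as the admin UI — the images and ports below are the public defaults:

```yaml
# Local MongoDB with a web admin UI (sketch).
services:
  mongo:
    image: mongo
    ports:
      - "27017:27017"
  mongo-express:
    image: mongo-express
    ports:
      - "8081:8081"          # browse collections at http://localhost:8081
    environment:
      ME_CONFIG_MONGODB_SERVER: mongo
    depends_on:
      - mongo
```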
As for config files, it depends: SCP or Gists.
I work in Java/J2EE, Python, Android, PHP (for fun)
Pretty much what I do too. I use brew bundle to handle all my apps.
I have shell scripts that place a few settings files and other bits that I keep on GitHub.
That and time machine have done okay so far.
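For anyone unfamiliar with brew bundle: it reads a Brewfile checked into a repo and installs everything listed. The entries below are illustrative, not this commenter's actual file:

```ruby
# Brewfile sketch -- install everything with `brew bundle`.
brew "git"
brew "node"
cask "visual-studio-code"
cask "docker"
```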
Before, I used Vagrant, and now Docker. Both are good, but I prefer Vagrant.
If you prefer vagrant, why are you now using docker?
Because at my work we moved to Docker, that's why.
Honest question on why you could want to be able to do this, except for backup/restore purposes.
Sure! A few reasons:
Working in C++ on Mac and Windows, I use Visual Studio or Xcode, with SourceTree as my Git client. All with default settings, so getting a new system set up takes very little time.