Nowadays, local development environments tend to work better with Docker than with PHP and other dependencies installed directly on the host. When starting a new PHP project, the Dockerfile is usually copied from an existing codebase, because every project needs a similar set of basic dependencies to run. As a result, any change to the containers has to be repeated in every project, which makes them hard to maintain.
To avoid this, at TeleSoftas we came up with the idea of a simple base image that would simplify the development and maintenance of our projects. The image is published on DockerHub, which makes starting a new project a breeze and reduces the amount of code required.
Let’s take a look at the basic PHP image Dockerfile required to run a Symfony application:
```dockerfile
FROM php:8.1-fpm-alpine

RUN addgroup -g 1000 -S app && \
    adduser -u 1000 -S app -G app

RUN apk add \
    autoconf \
    build-base \
    git

RUN docker-php-ext-install pdo_mysql
RUN pecl install apcu-5.1.21
RUN docker-php-ext-enable apcu opcache

COPY --from=composer:2.1 /usr/bin/composer /usr/local/bin/composer
```
This simple configuration also works for plain PHP projects as well as Laravel. But copying the same Dockerfile into every project and maintaining each copy separately gets hard and messy, so why not make it simpler?
Publishing the image to DockerHub
To avoid creating a new Dockerfile for each project, you can publish the image to DockerHub and pull it directly from there via docker-compose.yml. Building and publishing the image takes only a few commands: log in to your DockerHub account, build the image and push it.
```shell
docker login -u <YOUR_DOCKERHUB_USERNAME> -p <YOUR_DOCKERHUB_TOKEN> docker.io
docker build -t index.docker.io/<YOUR_DOCKERHUB_USERNAME>/php:8.1 .
docker push index.docker.io/<YOUR_DOCKERHUB_USERNAME>/php:8.1
```
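As a side note, passing the token with `-p` leaves it in your shell history. A slightly safer variant (a sketch, assuming the token is available in a `DOCKERHUB_TOKEN` environment variable, which is a name I'm introducing here) uses Docker's `--password-stdin` flag:

```shell
# Read the access token from stdin instead of the command line,
# so it does not end up in shell history.
echo "$DOCKERHUB_TOKEN" | docker login -u <YOUR_DOCKERHUB_USERNAME> --password-stdin docker.io
```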
Using the newly built image
First, create a docker-compose.yml file with the following service configuration:
```yaml
version: '3.7'

services:
  php:
    image: <YOUR_DOCKERHUB_USERNAME>/php:8.1
    user: app
    volumes:
      - ./app:/srv/app
```
Now you can skip building the image locally; it will be pulled from DockerHub instead. This way you avoid Dockerfile duplication across projects and keep maintenance easy.
This is useful when creating a new Symfony/Laravel project from scratch or, for example, when executing some PHP code locally, since running it in a container is often more convenient than installing all the libraries on your host machine.
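For the one-off-code case you don't even need a compose file; a sketch of running a script directly against the shared image (with `<YOUR_DOCKERHUB_USERNAME>` being the same placeholder as elsewhere in this article):

```shell
# Run a one-off PHP command with the shared base image,
# mounting the current directory and using the non-root "app" user
# that the base Dockerfile creates.
docker run --rm \
  -v "$PWD":/srv/app \
  -w /srv/app \
  --user app \
  <YOUR_DOCKERHUB_USERNAME>/php:8.1 \
  php -r 'echo PHP_VERSION, PHP_EOL;'
```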
Extending Dockerfile per project
Sometimes you need certain dependencies in one project and not in another. That is also easy to handle with the newly published image. For example, on some projects you might need Xdebug or other additional libraries when developing locally. To solve this, create a small Dockerfile that pulls the base image and adds the necessary libraries on top of it.
```dockerfile
FROM <YOUR_DOCKERHUB_USERNAME>/php:8.1

RUN pecl install xdebug-3.1.1
RUN docker-php-ext-enable xdebug
```
Next, expand your docker-compose.yml configuration:
```yaml
version: '3.7'

services:
  php:
    build: .
    user: app
    volumes:
      - ./app:/srv/app
```
And it is as simple as that. There is no code duplication, and the basic dependencies remain easy to maintain across all projects. You can add as many or as few customizations to this local Dockerfile as you need.
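One thing worth noting: Xdebug 3 still needs to know where your IDE is listening. A minimal sketch using Xdebug's standard environment variables (assuming Docker Desktop, where `host.docker.internal` resolves to the host; on Linux you may need `extra_hosts` or your host IP instead):

```yaml
version: '3.7'

services:
  php:
    build: .
    user: app
    environment:
      # Xdebug 3 configuration via environment variables
      XDEBUG_MODE: debug
      XDEBUG_CONFIG: client_host=host.docker.internal
    volumes:
      - ./app:/srv/app
```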
Automating publishing to DockerHub
Let's say you need to bump Composer to a newer version in your base image. To do this, you have to update the Dockerfile, log in to DockerHub, build the image and publish it. You can simplify this by automating the build and publish stages in a pipeline.
For this tutorial, I've chosen GitLab as the source code repository and created a .gitlab-ci.yml file with the following configuration to automate the whole process:
```yaml
stages:
  - build

variables:
  IMAGE_VERSION: '8.1'

build:
  stage: build
  image: docker:latest
  services:
    # Docker-in-Docker is required when the job runs on a Docker-executor
    # runner (the default on GitLab shared runners)
    - docker:dind
  before_script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
  script:
    - docker build --pull -t $CI_REGISTRY_IMAGE:$IMAGE_VERSION .
    - docker push $CI_REGISTRY_IMAGE:$IMAGE_VERSION
```
Only one step is left: define the variables in the project's CI/CD settings.
```
CI_REGISTRY=docker.io
CI_REGISTRY_IMAGE=index.docker.io/<YOUR_DOCKERHUB_USERNAME>/php
CI_REGISTRY_USER=<YOUR_DOCKERHUB_USERNAME>
CI_REGISTRY_PASSWORD=<YOUR_DOCKERHUB_TOKEN>
```
When you push the updated Dockerfile to GitLab, the image will be built and pushed to DockerHub once the pipeline finishes. This configuration can easily be adapted to Bitbucket or GitHub by following their pipeline setup guides.
Only one big step remains: decide what is required in your base image, build it, and publish it to DockerHub. In short:
- Create two repositories for PHP 8.0 and 8.1 Docker images.
- Create the Dockerfile and add CI configuration to automate the whole process.
- Tag the image and push it to the DockerHub image registry once the pipeline finishes.

And that's it. That's how you can simplify your development quite a bit.
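Instead of two separate repositories, the 8.0 and 8.1 builds can also live in one repository using GitLab's `parallel:matrix` keyword (available since GitLab 13.3). This is a sketch; it assumes the Dockerfile accepts the PHP version as a build argument (`ARG PHP_VERSION` before the `FROM` line), which the Dockerfile shown earlier does not yet do:

```yaml
build:
  stage: build
  image: docker:latest
  # Run one job per PHP version; each job sees its own IMAGE_VERSION value
  parallel:
    matrix:
      - IMAGE_VERSION: ['8.0', '8.1']
  before_script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
  script:
    - docker build --pull --build-arg PHP_VERSION=$IMAGE_VERSION -t $CI_REGISTRY_IMAGE:$IMAGE_VERSION .
    - docker push $CI_REGISTRY_IMAGE:$IMAGE_VERSION
```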
This benefits not only existing projects but new ones as well. It makes all of the applications easy to maintain: if a new library or package is required for the latest version of Symfony to work, update the base image once and it becomes available in all projects.
On top of the improvements mentioned above, you can simplify the whole development stack configuration even further. This example is for PHP, but the same process can be applied to other containers, such as Nginx. For example, you might want to reduce logging in most projects; to accomplish this, create a new Nginx base image with the log exclusion and reuse it.
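A minimal sketch of such an Nginx base image, published the same way as the PHP one (the version tag and the sed-based log exclusion are assumptions for illustration, not a prescribed setup):

```dockerfile
FROM nginx:1.21-alpine

# Turn off the access log globally; projects that need it
# can re-enable it in their own server blocks.
RUN sed -i 's/access_log .*/access_log off;/' /etc/nginx/nginx.conf
```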
Top comments (2)
Thanks for sharing a well-written article. I often thought about proposing the same at our company, but I always decided against it. The complexity (registry, versions, dependencies, …) is holding me back. I like the simplicity of having »everything« inside the Dockerfiles within the code repository.