This article was originally posted on my blog.
If you're like me, you use Github for public repositories and for private repositories owned by organizations, but for your own private repositories, you use Gitlab. Indeed, Gitlab offers free unlimited private repositories and does the job.
After having pushed a lot of commits, you decide that it could be really nice to automate your tests every time a Merge Request (yes, Merge Request. We're talking Gitlab language here) is created.
Then you think that you could use CircleCI as you do for your Github repositories, especially since you know how to use docker-compose on it. You log into your CircleCI account, go to the integration page, and realize Gitlab integration is missing. Github is there, as well as Bitbucket, but no Gitlab in the area. Shit. You can still upvote the idea of a Gitlab integration on CircleCI's website, but until they decide to change things, you're stuck.
You decide to list all the free CI services that can work with Gitlab and match your expectations. And then you realize that Gitlab offers a CI service of its own. After reading through the vaguely advertised features on the website (to Gitlab's marketing team: you have a nice product, we just can't tell how good your offer is because everything is so fuzzy on your website. It's a shame), you start to think that maybe they offer free CI even for people using free private repositories on the Community Edition. It seems it is limited to 2000 build minutes per month, which is frankly a very generous deal. Our question today is more precise: is it possible to use docker-compose to build the whole stack and run your tests in the CI? Yup.
First, the (very bad…) documentation says you need to enable some shared runners, but in my case it was already done by default, so I guess you don’t need to do it either. Next.
Now we just need to create a .gitlab-ci.yml file in the root directory of our project. The general use is quite well documented, but there is no trace of docker-compose in the documentation. It seems we just need to install it as a Python package (thank you Stack Overflow!) and then we can use it. We also need to use the somewhat obscure docker image and docker:dind service. Well, if it works, why not.
```yaml
image: docker

services:
  - docker:dind

before_script:
  - apk add --no-cache py-pip
  - pip install docker-compose
  - docker-compose build
  - docker-compose up -d
  # here we can install our dependencies (composer, yarn…)

tests:
  script:
    # launch our tests
    - docker-compose exec php phpspec run --format pretty
    - docker-compose exec php phpunit
    - docker-compose exec php behat
```
Quite easy actually. I just had an issue (that I don't have locally or on CircleCI) with the interactive TTY while executing docker-compose exec. Maybe the version installed by pip is one that introduced a regression? I dunno. Anyway, you can fix the issue by adding the -T flag to your commands, which disables pseudo-TTY allocation.
```yaml
image: docker

services:
  - docker:dind

before_script:
  - apk add --no-cache py-pip
  - pip install docker-compose
  - docker-compose build
  - docker-compose up -d
  - docker-compose exec -T php composer install

tests:
  script:
    - docker-compose exec -T php phpspec run --format pretty
    - docker-compose exec -T php phpunit
    - docker-compose exec -T php behat
```
This should do the trick. Hope it helped. Happy testing.
Update: Actually, maybe I spoke too fast…
I have an issue with the network. Containers cannot communicate with each other… Maybe I can play with the network options (network_mode?).
If a docker expert is here, I would love some help :)
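For what it's worth, a common cause of networking trouble in the docker:dind setup is the Docker client not knowing where the daemon lives. The usual fix (a sketch based on the standard dind setup, not tested against this exact stack) is to point DOCKER_HOST at the dind service via a variables block:

```yaml
image: docker

services:
  - docker:dind

variables:
  # tell the docker client (and docker-compose) to talk to the
  # docker:dind service instead of a local socket
  DOCKER_HOST: tcp://docker:2375
```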
Image by Paradasos (CC by/nc)
Top comments (9)
You can probably actually use a PHP image instead of the docker image - save yourself the effort of building a PHP image every time.
You should also look at stages and jobs to separate your spec tests from your unit tests and from your end to end tests. docs.gitlab.com/ee/ci/pipelines.html
Make sure you specify your image in each job rather than globally, so you don't cause yourself problems later down the line.
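As a sketch of what this comment describes (the job names and the php image tag are illustrative, not from the article, and it assumes the test tools are installed in the image or via composer in a before_script), the pipeline could be split into stages and jobs like this:

```yaml
stages:
  - test
  - e2e

phpspec:
  stage: test
  image: php:7.1-cli   # hypothetical tag, pick one matching your stack
  script:
    - phpspec run --format pretty

phpunit:
  stage: test
  image: php:7.1-cli
  script:
    - phpunit

behat:
  stage: e2e
  image: php:7.1-cli
  script:
    - behat
```

With this layout, a failing phpspec job is visible at a glance in the pipeline view instead of being buried in one long console log.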
Hi Stephen, thanks for your comment!
I first separated my build into 3 jobs (unit, specs, e2e): I thought it would run the before_script once, but it launches 3 different builds, each with the same before_script. In the end the overall speed gain seemed negligible (especially if you put your specs and unit tests before the e2e tests), and since it would cost me extra build minutes (3 x 1 minute instead of 1 x 1 minute), I ended up doing only one build.
Am I wrong in my logic here? Isn't it counted as 3 x build time?
So yes, with a before script, you will build on each job. However, your before is simply that you are building a php container. For that you can probably reuse an existing php docker image from docker hub, rather than build each time. When you apply that to your jobs rather than as a before script gitlab will cache the image and reuse rather than pull each time.
You can also do things with passing artefacts between jobs.
Separating your tasks into jobs and stages means that when you look at your pipeline you can see which job is currently running and which has actually failed without having to dig into the console output to work out what died.
If you want to go nuts, you can actually build containers in a repo in a pipeline, do some assurance and setting on your container, and stash the built image in the image registry in the repo for reuse in other pipelines.
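A minimal sketch of that last idea, using GitLab's predefined CI variables (CI_REGISTRY, CI_REGISTRY_IMAGE and CI_JOB_TOKEN are provided by GitLab; the job layout and tag are my assumption):

```yaml
build-image:
  stage: build
  image: docker
  services:
    - docker:dind
  script:
    # authenticate against the project's built-in image registry
    - docker login -u gitlab-ci-token -p "$CI_JOB_TOKEN" "$CI_REGISTRY"
    # build the image and stash it in the registry for later pipelines
    - docker build -t "$CI_REGISTRY_IMAGE:latest" .
    - docker push "$CI_REGISTRY_IMAGE:latest"
```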
When it comes to the network issues: I added the database to the services section. Funnily enough, I had one build that passed without problems before the test runner started complaining about being unable to connect to the database. Here's the entire project for reference: gitlab.com/gonczor/aws-simple-app/...
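If I read that right, the fix is to declare the database as a CI service next to dind, something like this (postgres is just an illustrative choice here, the commenter's actual database image isn't stated):

```yaml
services:
  - docker:dind
  # hypothetical database service; the test runner can reach it
  # under the hostname "postgres"
  - postgres:9.6
```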
Good setup.
For me, the only change was installing docker-compose directly with Alpine's apk:
- apk add docker-compose
Another easier, more Docker-oriented, and also free option would be using Codefresh: codefresh.io
Gitlab integration is even easier than gitlab ci.
Highly recommended.
Not quite sure what benefit you'd get from codefresh.io over using Gitlab CI. Gitlab CI becomes part of your code repo, so it is under source control. It's already integrated with Gitlab, so no extra integration is necessary, and the configuration is YAML files just like gitlab-ci.