
Luis Serra

What do we use GitLab scheduled pipelines for?

How scheduled pipelines help us implement processes

Introduction

In software development, and especially in the DevOps area, there is a common need not only to automate processes but also to trigger their execution repeatedly.

However, controlling the execution of these repetitive processes raises several questions, such as:

  • what is the right tool to perform this control?

  • where can this tool be hosted? On-premises?

  • if we use a company device, how do we ensure that it is always operational?

  • if we set up this tool in the cloud, what are the costs of operating and maintaining it?

  • what types of flows/work can it control?

  • how can we easily obtain outputs about the success or failure of the various executions?

  • among others…

One of the most common tools used in software development is a version control system. Here we have several options, such as GitHub, Bitbucket, Gitea, etc., but it's in GitLab that we find a very interesting feature: scheduled pipelines.

These, as the name indicates and according to the documentation, run pipelines in the future, repeatedly, for specific branches or tags, dispatched at regular time intervals.

With these characteristics, several of the previous questions are already answered: there is a graphical interface that is easy to consult, even for non-technical people; pipelines can run remotely or on-premises; the feature is included in the free tier; and since we are talking about fully configurable pipelines, whatever flow/work they need to control is easy to implement.

Regarding outputs, besides the dashboard with the status of the last run, there are several integrations with communication platforms such as Slack. And if you need something more customized, that is no problem either: you can create jobs that send custom notifications.

In this article, I will present some scenarios where scheduled pipelines were useful to us and show how you can set up your own.

Which workflows did we automate?

1. Database Sync

In the software development process, it's common to have different environments (development, staging, production, etc.) that all share a similar element: data. This data populates the applications so that programmers, QA engineers, designers, etc. can test one or many applications in conditions closer to production. However, it is not always easy to have similar data in all these environments.

This was the first issue where GitLab's scheduled pipelines helped us: syncing databases between the various environments. Using two scripts written in bash, we defined a process to perform a dump from production, sanitize it, and restore the data to each development environment on a weekly basis.
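As a minimal sketch, such a job could look like the snippet below. The script names, stage, image, and target environment are illustrative, and the database credentials are assumed to live in the project's CI/CD variables:

db-sync:
  stage: sync
  image: postgres:15  # assuming a PostgreSQL database; use an image with your engine's client tools
  script:
    - ./scripts/dump_sanitize.sh    # hypothetical bash script: dump from production and sanitize the data
    - ./scripts/restore.sh staging  # hypothetical bash script: restore the sanitized dump into one environment
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'

Note that the weekly cadence itself lives in the schedule configuration, not in the CI file.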

2. Cleaning the Container Registry

One of the cost problems we encountered was the growing size of the company's container registry. These services often offer a configuration to automatically delete the oldest images; however, in Azure this option is only available for the most expensive SKU tier, which would increase the registry's cost. After some research, we found that Azure itself provides a bash script to delete tags older than a certain date, so after a few adjustments it was easy to create a flow to purge the oldest images from the container registry, this time on a monthly basis.
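A sketch of how such a job might look, assuming Azure's purge command is run as an on-demand ACR task through the official Azure CLI image; the service principal variables, registry name, filter, and retention period are all illustrative:

acr-clean:
  stage: acr-clean
  image: mcr.microsoft.com/azure-cli
  script:
    - az login --service-principal -u $AZ_CLIENT_ID -p $AZ_CLIENT_SECRET --tenant $AZ_TENANT_ID
    # purge tags older than 30 days in repositories matching the filter (values are illustrative)
    - az acr run --registry $ACR_NAME --cmd "acr purge --filter 'base/.*:.*' --ago 30d" /dev/null
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'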

3. Container Security Check

All the Docker base images of our projects are hosted in our internal container registry; however, they are built on top of well-known public images (node, java, python, etc.), plus some base tools and software. This approach has advantages, such as having custom base images with cross-platform software already installed, but new vulnerabilities appear all the time, so it's good to run some kind of vulnerability check once in a while to verify that there are no new security issues in the pre-installed software. To achieve this, we created a job using trivy that checks our base Docker images every week, looking for new vulnerabilities.
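A minimal sketch of such a check, assuming trivy's official image and an illustrative base-image path; registry authentication is omitted for brevity:

container_scanning:
  stage: scanning
  image:
    name: aquasec/trivy:latest
    entrypoint: [""]  # clear the default entrypoint so GitLab can run the script commands
  script:
    # fail the job if HIGH or CRITICAL vulnerabilities are found in a base image
    - trivy image --exit-code 1 --severity HIGH,CRITICAL $CI_REGISTRY_IMAGE/base/node:16
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'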

4. Rebuild Base Docker Images

Following on from the previous problem, it's good to recreate our Docker base images once in a while to pick up the latest versions of the packages we install, so we set up a scheduled pipeline that runs a docker build command to rebuild those images.
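A sketch of this rebuild job, assuming the standard Docker-in-Docker setup and an illustrative image path:

rebuild-base-images:
  stage: build
  image: docker:24
  services:
    - docker:24-dind  # provides the Docker daemon used by the build
  script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    # --pull forces a fresh pull of the upstream public image so the latest patches are picked up
    - docker build --pull -t $CI_REGISTRY_IMAGE/base/node:16 docker/node
    - docker push $CI_REGISTRY_IMAGE/base/node:16
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'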

5. Lokalise translations synchronization

To handle our applications' internationalization, we use Lokalise as a collaborative translation platform. However, we need to sync the translators' work with our local files, so we implemented a scheduled pipeline that, twice a week, downloads the translations from the Lokalise platform and checks whether anything changed between the new and the old translations. If there is any change, a new merge request is created with those diffs.
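A sketch of this flow, assuming Lokalise's lokalise2 CLI and GitLab push options to open the merge request; the token variables, branch name, and file layout are illustrative:

sync-translations:
  stage: sync
  image: alpine:3.19
  script:
    - apk add --no-cache git curl
    # install Lokalise's official CLI; token and project id live in CI/CD variables
    - curl -sfL https://raw.githubusercontent.com/lokalise/lokalise-cli-2-go/master/install.sh | sh
    - ./bin/lokalise2 file download --token $LOKALISE_TOKEN --project-id $LOKALISE_PROJECT_ID --format json --unzip-to ./locales
    # only open a merge request when the downloaded files actually differ from the committed ones
    - |
      if ! git diff --quiet -- locales; then
        git config user.email "ci@example.com" && git config user.name "CI Bot"
        git checkout -b sync-translations-$CI_PIPELINE_ID
        git commit -am "chore: sync Lokalise translations"
        git push -o merge_request.create -o merge_request.target=$CI_DEFAULT_BRANCH \
          "https://gitlab-ci-token:${PROJECT_ACCESS_TOKEN}@${CI_SERVER_HOST}/${CI_PROJECT_PATH}.git" HEAD
      fi
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'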

How can I use scheduled pipelines then?

Using the UI, it's pretty simple to set up a new pipeline. In your project, just go to CI/CD > Schedules > New schedule, where you can configure everything.

What are the cool features here? Let me enumerate them:

  • cron syntax that you can find on other systems, pretty standard
  • target branch or tag, so you can have several behaviours using different branches
  • variables to configure and enrich your pipelines
  • easy on/off feature using the active toggle

How can I configure a job to be executed only on scheduled pipelines?

Using the rules keyword, you can use the predefined CI/CD variable CI_PIPELINE_SOURCE, which has the value schedule when a scheduled pipeline is triggered.

rules:
  - if: '$CI_PIPELINE_SOURCE == "schedule"'
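As a side note, the inverse also works: if you want a job to run everywhere except in scheduled pipelines, a when: never rule skips it for schedules while the final rule keeps it for everything else:

rules:
  - if: '$CI_PIPELINE_SOURCE == "schedule"'
    when: never
  - when: on_success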

But that approach will trigger every job whose rules match CI_PIPELINE_SOURCE == "schedule", am I right? So, show me the magic…

How can I select only some of them, using the same CI file on the same branch?

Is that possible? The answer is yes, with something I mentioned before: variables. Using schedule pipeline variables, you can create a new variable that controls which job is executed. Below you can find a small example of this approach, with two jobs in the same gitlab-ci file, on the same branch, where each schedule configuration sets a different value for an OPERATION variable.

...
container_scanning:
  stage: scanning
  ...
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule" && $OPERATION == "scan"'
...
delete-old-images:
  stage: acr-clean
  ...
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule" && $OPERATION == "acr-clean"'
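With this in place, you can create two schedules in the UI: a weekly one that sets OPERATION to scan and a monthly one that sets OPERATION to acr-clean. Each schedule will then trigger only its matching job.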

This was a short introduction to the scheduled pipelines feature of GitLab CI and how we use it on a daily basis. I hope these examples gave you a better understanding of which processes or flows you can easily automate in your organization. If you already use this feature, please share your use cases in the comments section 😉.


If you enjoy working on large-scale projects with global impact and if you like a real challenge, feel free to reach out to us at xgeeks! We are growing our team and you might be the next one to join this group of talented people 😉

Check out our social media channels if you want to get a sneak peek of life at xgeeks! See you soon!
