As engineering teams scale and software complexity grows, having a robust, automated CI/CD pipeline is no longer a luxury; it's a necessity. In this article, I’ll take you through the complete journey of setting up a fully private, self-hosted CI/CD pipeline using GitLab CI/CD, Docker-based runners, and SonarQube for code quality analysis.
This isn't another theoretical guide. Everything here was built in a real-world scenario, inside an isolated internal network with zero reliance on public SaaS.
Why Self-Hosted?
Sometimes the cloud isn’t an option. Whether due to security policies, compliance, or architecture decisions, many teams must operate entirely within internal infrastructure.
This setup solves for:
- Air-gapped environments
- Internal-only GitLab & SonarQube servers
- Fully containerized runners
- Controlled traffic and secrets
The Tech Stack
Components & Purpose:
- GitLab CE – Source control + CI/CD engine
- GitLab Runner – Executes jobs in Docker
- SonarQube – Code quality & static analysis
- SonarScanner CLI – CLI for analysis, runs in CI
We use Docker images for everything, keeping the runner nodes clean and stateless.
Project Structure & CI/CD Workflow
Sample .gitlab-ci.yml
stages:
  - sonar

variables:
  GIT_DEPTH: "0"
  SONAR_HOST_URL: "https://sonarqube.internal"

sonarqube:
  stage: sonar
  image: sonarsource/sonar-scanner-cli:latest
  script:
    - sonar-scanner -Dsonar.token="$SONAR_TOKEN"
SONAR_TOKEN is stored as a GitLab CI/CD variable.
Key Concepts:
- stages: CI jobs are grouped into stages and run sequentially.
- image: Docker image for the job, here sonar-scanner-cli.
- script: Actual commands to execute inside the container.
If this job fails (e.g., scanner fails or SonarQube is unreachable), the pipeline will stop at this stage.
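If you would rather have the scan report problems without blocking the rest of the pipeline, GitLab's standard `allow_failure` keyword can be added to the job. A minimal sketch (same job as above, assumption: you accept a warning instead of a hard stop):

```yaml
sonarqube:
  stage: sonar
  image: sonarsource/sonar-scanner-cli:latest
  # With allow_failure, a failed scan marks the job with a warning
  # but lets subsequent stages continue.
  allow_failure: true
  script:
    - sonar-scanner -Dsonar.token="$SONAR_TOKEN"
```

Whether to block is a team decision: strict quality gates stop bad code early, while non-blocking scans keep delivery moving during adoption.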
Understanding GitLab Runner
GitLab Runner is a background service that polls GitLab for jobs. Once it picks up a job:
- It downloads the project source
- Starts a Docker container with the image defined in the CI file
- Executes the job commands
- Returns logs and status to GitLab
Docker Executor
We use the Docker executor mode, which isolates each job in a clean container environment.
Tip: For internal domains, configure Docker runner with extra_hosts to resolve local services like git.internal or sonarqube.internal.
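A sketch of what that looks like in the runner's `config.toml` (hostnames match this setup; the IP addresses are placeholders for your internal DNS entries):

```toml
# /etc/gitlab-runner/config.toml (excerpt)
[[runners]]
  name     = "internal-docker-runner"
  url      = "https://git.internal"
  executor = "docker"
  [runners.docker]
    image = "alpine:latest"
    # Maps internal hostnames to IPs inside every job container,
    # equivalent to `docker run --add-host`.
    extra_hosts = ["git.internal:10.0.0.5", "sonarqube.internal:10.0.0.6"]
```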
Deep Dive: How SonarQube Analysis Works
SonarQube analyzes source code and returns metrics on:
- Code smells
- Bugs
- Duplications
- Test coverage
- Security hotspots
Supported file types (in our Laravel + Vue project):
- PHP: app/, routes/, tests/
- JS/Vue: resources/js/, *.vue
Config File: sonar-project.properties
sonar.projectKey=my-project
sonar.projectName=My Project
sonar.sources=app,resources/js
sonar.tests=tests
sonar.exclusions=node_modules/**,public/**,storage/**
sonar.php.file.suffixes=.php
sonar.javascript.file.suffixes=.js,.vue
SonarScanner reads this config and sends data to the SonarQube server over HTTP. The token ensures it’s authenticated.
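One detail worth calling out: coverage does not appear automatically. The scanner only imports reports that your test tools produce. Assuming PHPUnit (Clover XML) and Jest (lcov) with their default output paths, the standard SonarQube property keys look like this:

```properties
# Import PHPUnit's Clover coverage report (path is an assumption;
# generate it with: phpunit --coverage-clover coverage/clover.xml)
sonar.php.coverage.reportPaths=coverage/clover.xml

# Import lcov output from Jest (via: jest --coverage)
sonar.javascript.lcov.reportPaths=coverage/lcov.info
```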
Lessons Learned
- CI/CD variables keep sensitive values (tokens, secrets) out of the repository
- GIT_DEPTH: "0" is critical: shallow clones break blame data and full branch history in analysis
- Each stage only runs if the previous one succeeds
- SonarScanner must be authenticated with sonar.token; the older sonar.login property is deprecated
- Docker containers may need --add-host (or extra_hosts in the runner config) to resolve internal hosts
Benefits of This Setup
- Private and secure — all services are internal
- Modular — easy to extend with deploy/test/build stages
- Fast feedback for developers — bugs & smells caught early
- Language-agnostic — works for PHP, JS, Java, Python, etc.
This isn’t just a pipeline; it’s a foundation for a DevOps culture.
What’s Next?
From here, we can expand into:
- Automated unit testing (PHPUnit, Jest, etc.)
- Docker image builds and artifact publishing
- Automatic deploys to staging/prod
- More granular SonarQube quality gates
- Pipeline optimization with cache and parallel jobs
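As one concrete optimization from that list: the scanner keeps a local cache that can be persisted between pipeline runs. A sketch of the commonly documented pattern, which points SONAR_USER_HOME inside the project directory so GitLab's cache can pick it up:

```yaml
sonarqube:
  stage: sonar
  variables:
    # Relocate the scanner's work directory into the project dir
    # so it falls under GitLab's cacheable paths.
    SONAR_USER_HOME: "${CI_PROJECT_DIR}/.sonar"
  cache:
    key: "${CI_JOB_NAME}"
    paths:
      - .sonar/cache
```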
Final Thoughts
Building a self-hosted CI/CD + SonarQube pipeline taught me more than any cloud-based service ever could. From Docker networking to GitLab internals to Sonar’s scanning engine, it’s been a true DevOps deep dive.
Whether you're starting fresh or replacing legacy scripts, this setup will help your team build faster, safer, and with confidence.
If you found this useful, follow me for future deep dives into CI/CD, DevOps tooling, infrastructure, and open-source pipelines.