Setting up Bitbucket pipelines for Laravel

Bitbucket Support has a good article on getting a Laravel project started with their Pipelines feature, but for a large-scale Laravel project that article is not enough on its own.

Multi-stage builds

Bitbucket Pipelines has all the inherent pros and cons of a vendor locked-in environment, and the memory limit is one of them. The composer install step was never going to work within that limit, so I moved to a multi-stage build: I created my own custom image with the dependencies pre-installed and pushed it to Docker Hub. The Dockerfile looks like this:

FROM php:7.2-fpm

RUN apt-get update && apt-get install -y unzip ...
# Remove the PHP memory limit so composer install does not hit it
RUN printf 'memory_limit = -1\n' > /usr/local/etc/php/php.ini

RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
RUN mkdir tests
# Write a minimal composer.json that mirrors the project's dependencies
RUN echo '{                                    \
    "require": {                               \
        "laravel/framework": "7.*",            \
        ...
    },                                         \
    "require-dev": {                           \
        "phpunit/phpunit": "^8",               \
        "squizlabs/php_codesniffer": "3.*",    \
        "phpmd/phpmd": "@stable",              \
                "pdepend/pdepend" : "@stable"  \
    },                                         \
    "autoload-dev": {                          \
        "classmap": [                          \
            "tests"                            \
        ]                                      \
    }                                          \
}' > composer.json
# Pre-install the dependencies into the image (they land in /var/www/html/vendor)
RUN composer install --no-scripts
RUN composer dump-autoload
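
To sanity-check the image before wiring it into the pipeline, you can confirm that the vendor directory ended up in the image's default working directory (/var/www/html for php:7.2-fpm). The image and tag names are just the placeholders used throughout this post:

# List the tools that composer install baked into the image
docker run --rm dockerhub_user/base-image:prod ls /var/www/html/vendor/bin

If phpunit, phpcs and phpmd show up there, the pipeline step only needs to symlink this directory instead of downloading everything again.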

Now, in the bitbucket-pipelines.yml file, use this custom image and a few other statements to make the tests run. The key trick is to symlink the vendor directory baked into the image instead of running composer install in the pipeline:

image: dockerhub_user/base-image:prod

pipelines:
  default:
    - step:
        script:
          - cd $BITBUCKET_CLONE_DIR && ln -s /var/www/html/vendor vendor
          - composer dump-autoload
          - vendor/bin/phpunit tests
          ...
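
The base image also bundles the require-dev tools (php_codesniffer and phpmd), so the elided steps can include static analysis. A minimal sketch of two extra script entries, assuming your application code lives in app/ and using generic rulesets (both assumptions, adjust them to your project):

          - vendor/bin/phpcs --standard=PSR2 app             # coding-standard check, PSR2 is an assumed ruleset
          - vendor/bin/phpmd app text cleancode,codesize     # mess detection, rulesets are assumptions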

You can also test the base image locally using the following commands. You'd need to copy the Dockerfile into the project_root/pipeline folder (the path referenced in the build command).

# build image
docker build -t dockerhub_user/base-image:prod -f pipeline/dockerfile .

# Run image locally with bitbucket memory constraints
docker run -it --memory=4g --memory-swap=4g --memory-swappiness=0 --entrypoint=/bin/bash -v ${PWD}:/build dockerhub_user/base-image:prod
# On Linux/macOS the volume mount can also be written as -v $(pwd):/build

# Push the image to Docker Hub
docker login
docker push dockerhub_user/base-image:prod
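
Once inside the container you can replicate the pipeline step by hand. A minimal sketch, assuming the repository was mounted at /build as in the docker run command above:

# Inside the container: mimic the pipeline step
cd /build
ln -s /var/www/html/vendor vendor    # reuse the vendor dir baked into the image
composer dump-autoload
vendor/bin/phpunit tests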
