A Senior Developer working mostly with PHP and JavaScript, with a bit of Python thrown in for good measure, all on Linux. My tooling is simple: GitLab and JetBrains where possible.
Amazingly, that's better than when I started and took over. Back then it was a case of FTPing to the first server and hoping it didn't break anything, and also that the files would get rsync'd to the second server. If they didn't, it needed firewall changes to allow SSH access to the server so the rsync process could be restarted.
Our new platform is going to do deployments automatically using GitLab's CI/CD, mainly because I don't want to have to keep doing them by hand, but also because there are going to be more server nodes.
Our frontend is built with React and Hugo, so we use Netlify to deploy on every push to master.
We have written scripts to upload and deploy our backend code in Docker containers. We have an in-house tool (it's open source, by the way) which takes those containers and runs them on Kubernetes or Docker.
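A minimal sketch of what a GitLab CI/CD job like the ones described above can look like. The stage layout, registry tags, and the deploy script are assumptions for illustration, not the commenter's actual setup (`$CI_REGISTRY_IMAGE` and `$CI_COMMIT_SHORT_SHA` are GitLab's predefined variables):

```yaml
# Hypothetical .gitlab-ci.yml: build a Docker image, then deploy it on master.
stages:
  - build
  - deploy

build-image:
  stage: build
  image: docker:24
  services:
    - docker:24-dind
  script:
    - docker build -t $CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA .
    - docker push $CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA

deploy:
  stage: deploy
  script:
    - ./scripts/deploy.sh $CI_COMMIT_SHORT_SHA   # hypothetical deploy script
  only:
    - master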
Tech Lead/Team Lead. Senior WebDev.
Intermediate Certificate in Computer Systems.
Higher Certificate in Web Application Development.
MBA (plus Marketing and HR).
Studied a bit of law, economics and design.
Git with CI/CD.
It really takes only 10–15 minutes to create a GitLab project, set up the CI script and test it.
Even an old project can be moved into a Git project and set up with an automated script that runs when a merge request to master is approved. It's not difficult (once you've done it before to train yourself).
If you're on a legacy project that needs several actions when deployed, you could automate those in the CI script too, but if the actions are conditional (e.g. if I push a controller override then delete the server cache, otherwise don't), you'll need to perform those actions manually, I think.
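That kind of conditional step can still be scripted when the condition is detectable from the diff being deployed. A sketch, assuming the merge commit is checked out in CI; the `app/Controllers/` path and the cache-clear command are hypothetical:

```shell
#!/bin/sh
# Hypothetical CI deploy step: clear the server cache only when files under
# app/Controllers/ changed in the commit being deployed.
if git diff --name-only HEAD~1 HEAD | grep -q '^app/Controllers/'; then
    echo "controller change detected: clearing cache"
    # ssh deploy@server 'rm -rf /var/www/cache/*'   # site-specific command
else
    echo "no controller change: leaving cache alone"
fi
```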
git pull
I've done both manual object deployment and a CD process.
Before anything else we always do a PR review.
Thereafter:
Most Laravel projects:
Other projects (Drupal, WP) and/or tiny ones:
Code gets attached to a tennis ball, then thrown into a hockey rink, and we hope the players hit it in the right direction (sometimes it's felt like that!)
Terraform + GitHub CI/CD > GCP
AWS + Buildkite pipeline (for uploading, building and deployment)
How has your experience been with Buildkite? Do you like it?
I like it a lot; easy to use and set up.
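For anyone curious, a Buildkite pipeline is just YAML checked into the repo. A minimal sketch, with hypothetical step labels and a hypothetical deploy script (not the setup described above); `$BUILDKITE_COMMIT` is one of Buildkite's built-in environment variables:

```yaml
# Hypothetical .buildkite/pipeline.yml: build, then deploy once the build passes.
steps:
  - label: "build"
    command: "docker build -t myapp:$BUILDKITE_COMMIT ."

  - wait

  - label: "deploy"
    command: "./scripts/deploy.sh"
    branches: "master"
```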
Previously it was a bunch of scp scripting, but NOW it's a bunch of Jenkins scripting
For the current project I am building:
docker build
docker push
k ... replicas=0
k ... replicas=1
For the one that's in production, we have CI/CD with cron jobs and Ansible scripts.