Using Docker / containers makes sense only if you need to standardize deployment pipelines. Otherwise it’s a waste of time.
When do you need to standardize anything? When you're dealing with a large quantity.
So, either you:
- Have many (as in, double digits) microservices comprising one system, or
- Are in a very large organization with many development teams.
So if you have one or two apps running on Heroku / Beanstalk — leave them there. You’re not missing out on anything. It's pure overhead.
Top comments (27)
No, there are plenty of other use cases for containers. We actually don't use containers for deployments on one of my projects; instead we use containers for development. Containers are a great way to package software, and that includes middleware such as a database, cache, etc.
Can you explain? E.g., is it more convenient for you to run Postgres from a Docker image than to install it?
Yes it is. There's no overhead of system installs, and updating means just changing the version in your docker-compose.yml file.
Another advantage is you could send me your compose file and all I have to do is run it to get the exact same resources you use.
It's much easier to get everyone to install the same thing locally with containers. It's also simpler to start/stop the services your app relies on with Compose if there's more than one.
~ Your team's dev-ops engineer. xxx :P
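As a sketch of what that looks like (service names, versions, and credentials below are illustrative assumptions, not from this thread):

```yaml
# docker-compose.yml -- illustrative sketch
services:
  db:
    image: postgres:16        # upgrading Postgres = changing this tag
    environment:
      POSTGRES_PASSWORD: example
    ports:
      - "5432:5432"
  cache:
    image: redis:7
    ports:
      - "6379:6379"
```

Running docker-compose up -d starts both services, docker-compose down stops them, and anyone you send this file to gets the same stack.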
Most companies and devs use Docker mainly for a unified development environment. No more "well, it works on my machine" or "oh shit, I have an older version of npm, sorry". For me it's mandatory to have a Dockerfile and a docker-compose file beside every project in the repo. I can't even imagine working without Docker; that must be suicide in big projects with a wide tech stack where you must run 5-6 technologies simultaneously or more. So no, it's not a waste of time; the waste of time is not having it.
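For reference, a minimal Dockerfile to sit beside a Node project could look like this (base image, scripts, and paths are assumptions for illustration):

```dockerfile
# Illustrative Dockerfile; pin whatever Node version the team agrees on
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci              # reproducible install from package-lock.json
COPY . .
CMD ["npm", "start"]
```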
One advantage is usage for development. Another is that Beanstalk directly supports Docker as an environment, so one can easily check the dev environment by building the image, and the same image will work in prod.
I need this a lot in Beanstalk, as customization is hard and sometimes I want other defaults, like yarn and custom npm scripts. This is one of the reasons why Docker is useful in prod as well.
Like others have said, the development experience is better and no more "it worked on my machine".
But there are downsides to Docker too, like access control and persistent storage. The value of containers is in orchestrators like k8s with autoscaling and self-healing. Without that, other development virtualization tools like Packer and Vagrant are basically equivalent, in my opinion.
We have used Docker containers for deployment; there were two reasons for that.
We used LibreOffice to convert Docx to PDF via an API, so our application needed LibreOffice installed as well; we include the LibreOffice installation in the Dockerfile. We also used Puppeteer for generating PDFs, which needs a headless Chrome browser, and that is installed from the Dockerfile too.
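A hedged sketch of such a Dockerfile (package names and environment variables depend on the base distro and the Puppeteer version, so treat these as assumptions):

```dockerfile
FROM node:20-bullseye
# LibreOffice for Docx-to-PDF conversion, Chromium as the headless browser
RUN apt-get update && apt-get install -y --no-install-recommends \
        libreoffice chromium \
    && rm -rf /var/lib/apt/lists/*
# Point Puppeteer at the system Chromium instead of downloading its own
ENV PUPPETEER_SKIP_DOWNLOAD=true \
    PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
CMD ["npm", "start"]
```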
We bought one server instance and needed to deploy multiple apps on it. The apps shared the same env keys, such as the SENTRY key for the API and the web app, so we deployed the apps in separate Docker containers so the API keys etc. don't cause any issues.
Thanks for sharing.
This reminds me of my recent use case. We were building our latest book using Pandoc, and installing Pandoc + LaTeX the right way is such a pain in the ass; using a Docker image was 10x easier! :)
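For example, with the official pandoc/latex image (which bundles Pandoc plus a TeX distribution) the whole toolchain comes down to one command; the file names here are made up:

```shell
# Mount the current directory into the container and run Pandoc there,
# no local Pandoc or LaTeX install needed
docker run --rm -v "$(pwd)":/data pandoc/latex \
    book.md -o book.pdf
```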
I think it also makes sense to use Docker in a small company, because if you're building your own images with version tags, you can easily hot-swap your container back to an old version if the new build has a problem.
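A sketch of that rollback flow (image names and tags are illustrative):

```shell
# Each build gets its own tag
docker build -t myapp:1.4.0 .
docker run -d --name myapp myapp:1.4.0
# If 1.4.0 misbehaves, stop it and start the previous known-good tag
docker stop myapp && docker rm myapp
docker run -d --name myapp myapp:1.3.2
```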
I've recently used them for a much smaller project with only 5 Node apps. I found that being able to literally clone the repo, run docker-compose build, and docker-compose up was super easy, and it really helped when I needed to quickly get everything onto a laptop and run it at different outdoor testing sites.
Why specifically was it easier than using NVM & local npm installs?
Not only does it make local development in a complex environment a lot simpler, like the other comments said, it also exactly documents the requirements for an application to run. Dockerfiles are Infrastructure as Code: instead of installing stuff around and feeling lucky, all dependencies, packages, and steps for getting it up and running are versionable, reviewable, and ultimately executable.
Nice insight, but shouldn't we use Docker for other reasons?
For example, it's easier to back up and restore a container, and, if needed, move it to another machine.
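One way to do that with the standard Docker CLI (names and tags below are illustrative; note that docker commit captures the container's filesystem but not data in named volumes):

```shell
# Snapshot the running container into an image and ship it elsewhere
docker commit mydb mydb-backup:snapshot
docker save mydb-backup:snapshot -o mydb-backup.tar
# On the target machine:
docker load -i mydb-backup.tar
docker run -d mydb-backup:snapshot
```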
You mean like have a binary that works 5 years later, even though it's become ~impossible to recreate it?
I was talking about something like this:
I don't see containers as a binary. For me it's more like a simplified VM.
Both a Docker container and a VM image are binaries...
I use containers for pretty much everything. The use case isn't just for CI, it's for running processes in a sandboxed and controlled/predictable way. Not overlapping data and config.
At which point have language version manager + local dependency installations stopped working for you?
It's not that local dependency installations have "failed" me per se, but the use cases where it's largely beneficial to describe a "state of the world" textually, have Docker bring that to life deterministically, and have an API for interacting with such an environment are really nice.
Instead of wasting time figuring out why your local setup doesn't work, you can waste time figuring out why your local Docker setup doesn't work.
Created my account just so I could 'like' this :-)
All this talk got me more interested in finding use cases to add Docker to my workflow.
Easier to do clean updates/uninstalls for smaller services.
That's one of the reasons I use Docker on home automation SBCs.