Demystifying containers using Crystal, Amber, and Docker
Jake Varness Feb 11, 2018
If you're like me, you've heard a lot about Docker and containerization, but haven't fully tapped into its potential, or you aren't sure why or how it can fit your use cases.
As with all new technologies, there comes a learning curve, and Docker is no exception, but thankfully there is a TON of documentation to help flatten that curve.
I'm going to try to explain things assuming you're not a super duper noob: this post assumes you know Amber and Crystal well enough to make small changes to a web application. If you need to learn more about Amber and Crystal, I highly recommend the posts Elias Perez has made, as they're very thorough and can help you gain a very good understanding of how Amber apps are created.
Generating an Amber App
To get started, I recommend generating your own Amber app using the amber new command:
amber new <name_of_app>
Without providing any options, the amber new command generates a project that uses PostgreSQL and Granite as its database and ORM. This normally isn't so bad, but when it comes to configuring databases I typically get pretty annoyed. Throughout my career I remember running into a lot of issues trying to configure Oracle databases, and just a few weeks ago I tried installing PostgreSQL in a Linux Mint VM before giving up and installing a Xubuntu VM instead.
It gets really annoying when all you want to do is code, but you have to set up infrastructure just to run your app! It's tiring, it doesn't get you any closer to the ultimate goal of writing and running your application, and even if you get your environment working, there's no guarantee it'll be exactly like the ones your clients have.
If only there was a simpler thing we could do...
Don't Download Databases, Download Docker!
Seriously, if you're at this step, save yourself the trouble of downloading PostgreSQL and trying to get it to run. Downloading Docker, if you can believe it, is going to make things easier for you right now, as well as in the long run.
I downloaded Docker onto my Mac last night, and I was able to deploy my Amber app to a container on my machine within a half hour of downloading it. Without it, I probably would have spent a good chunk of time downloading PostgreSQL and attempting to stand up a DB instance, which is time I really could have spent coding.
If you have a simple application created, and you have Docker downloaded for your platform, head to your terminal, and use the following command to get your app running:
docker-compose up -d
By running docker-compose up, you're asking Docker to compose your application using the configuration defined in the docker-compose.yml file that was created when you generated your app. The -d option tells Docker that once the images are built and the containers are running, they should keep running in the background so that the Docker process isn't still attached to your terminal session.
To see your new containers running, run the command docker container ls -a and you'll see all the containers that Docker created and ran for you:
$ docker container ls -a
CONTAINER ID   IMAGE            COMMAND                  CREATED         STATUS                     PORTS                    NAMES
459baf330231   tour-of-heroes   "amber watch"            4 minutes ago   Up 4 minutes               0.0.0.0:3000->3000/tcp   tourofheroes_app_1
cd860dfd3fc3   tour-of-heroes   "bash -c 'while ! nc…"   4 minutes ago   Exited (0) 4 minutes ago                            tourofheroes_migrate_1
b0a5fd0d4ce5   postgres         "docker-entrypoint.s…"   4 minutes ago   Up 4 minutes               5432/tcp                 tourofheroes_db_1
What Did I Just Do?
Well, you locally deployed your first Docker image using docker-compose of course! But what did Docker do?
Take a look at the docker-compose.yml file, and you'll probably be able to piece together what happened. There are three services defined in the file: app, migrate, and db. The app and migrate services belong to your Docker image, and the db service uses the postgres image so that you don't have to manage the database yourself. That's right: no configuring default admin users, no downloading of databases required.
Once the db service sets up the database, you'll need to migrate your Granite models into it... Well, YOU won't need to, because Docker already created a container that did that for you! The migrate service runs all of the appropriate amber migrate commands to migrate the models into the database, as well as any seeders that have been written for your project.
Lastly, there's the app service, which just runs your app as defined in your Dockerfile. Once this container starts running, the application is ready to be accessed by going to http://localhost:3000.
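To make that concrete, a generated docker-compose.yml ends up looking something like the sketch below. This is an illustration pieced together from the three services described above, not the exact file Amber generates for you — image names, commands, and options vary by Amber version:

```yaml
version: "3"

services:
  # Runs PostgreSQL from the official image -- nothing to install or configure.
  db:
    image: postgres

  # One-shot container that runs the migrations and seeders, then exits.
  migrate:
    build: .
    command: amber db migrate seed
    depends_on:
      - db

  # The app itself, built from your Dockerfile and exposed on port 3000.
  app:
    build: .
    command: amber watch
    ports:
      - "3000:3000"
    depends_on:
      - db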
Making Changes to the App
With your Docker image running, you can actually make changes to your application and they will be reflected in your image!
I had asked Amber to generate a scaffold for a Hero model, but it wasn't smart enough to know that the plural of "hero" is "heroes" and not "heros". I went around and fixed all the places where "heros" appeared, and the changes were immediately reflected in the application!
Amber is also cool enough to pull Bootstrap 3.3.7 into the application via CDN by default, so I updated the index.slang file to show a star glyph when a hero is a "favorite" hero, rather than displaying "true" or "false".
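A minimal sketch of that change in index.slang might look like the following — assuming the scaffold generated a boolean favorite column on the Hero model, and using the Bootstrap 3 glyphicon classes pulled in via CDN:

```slang
td
  - if hero.favorite
    span.glyphicon.glyphicon-star
```

Because amber watch is running inside the app container, saving the template is enough; there's no rebuild step for view changes.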
Now, if you need to bring the application down, you can do so using docker-compose down. But beware: invoking down will also bring down the db container, and any data you had saved in it will be lost.
You can safely rebuild the Docker image for your application and bring it up by invoking:
docker-compose up -d --no-deps --build app
This will rebuild only the application and none of its dependencies, such as the database. If you need to run the migrations again, include migrate at the end of the command:
docker-compose up -d --no-deps --build app migrate
If you read this far, I hope you found it helpful! Hopefully after reading this and maybe following along with it, you're a little more comfortable with containers. I know that going through this boosted my confidence a little!