Follow me on Twitter, happy to take your suggestions on topics or improvements /Chris
There are quite a few things you will need to do as part of your Docker workflow. You will spend a lot of your time at the terminal and a lot of your time authoring Dockerfiles and/or docker-compose.yaml. Luckily there exists an extension that can greatly help with all of the above, as well as with deploying to the Cloud.
Your Docker workflow
There are some actions we keep on doing when dealing with Docker. Those are:
- Authoring a Dockerfile or docker-compose.yml
- Managing your images, here we do everything from tagging them and pushing them to a repository and much more
- Running/starting/stopping/removing your containers, there are quite a few steps involved here if we do this for every container/image. Luckily we have Docker Compose, which can operate on groups.
- Deploying your Docker image to some sort of registry like Docker Hub or somewhere in the Cloud
- Taking it to production, this can be done on-premise as well as with some sort of Cloud solution
We are likely to spend a lot of our time in the terminal, unless we have something like Docker Kitematic or a similar tool at our disposal.
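To give a feel for what that terminal work usually looks like, here is a rough sketch of the commands involved in building, tagging, pushing and running an image (the image name my-app and the <user> placeholder are just for illustration):

# build an image from the Dockerfile in the current directory
docker build -t my-app .

# tag it for a registry, here Docker Hub (placeholder user name)
docker tag my-app <user>/my-app:1.0.0

# push the tagged image to the registry
docker push <user>/my-app:1.0.0

# run a container from the image, mapping a port
docker run -d -p 3000:3000 my-app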
Resources
Below is a set of resources so you can deepen your Docker knowledge, but also deal with Docker in the context of the Cloud:
- Sign up for a free Azure account. To work with containers in the Cloud and deploy to the Cloud you will need a free Azure account
- The Docker extension we are describing in this article. It does a lot of things for you like authoring, managing and deploying, well worth installing if you are serious about Docker
- 5-part Docker series. This series really covers most things Docker, basic concepts like images, containers, networks, volumes and so on
- Containers in the Cloud. A great overview page that shows what else there is to know about containers in the Cloud
- Deploying your containers in the Cloud. A tutorial that shows how easy it is to leverage your existing Docker skills and get your services running in the Cloud
- Creating a container registry. Your Docker images can live in Docker Hub but also in a container registry in the Cloud. Wouldn't it be great to store your images somewhere and actually be able to create a service from that registry in a matter of minutes?
- Creating a GraphQL API using Microservices and Serverless. This shows containerization with Docker and using GraphQL
- Deploying Microservices and a GraphQL API to the Cloud. This shows how you can push your Docker containers to the Cloud as well as create a Serverless function and deploy that too
Docker extension
The point of this article is to present a Visual Studio Code Extension that can really help your workflow. So what can it do?
- Authoring, it helps with generating Dockerfiles as well as Docker Compose files. Furthermore, it gives you autocomplete, even lints your files and much more.
- Managing, it comes loaded with a set of commands that help with everything from file generation to managing your images and your containers
- Browsing repositories, it allows you to browse your Docker Hub account as well as container registries in the Cloud
- Deploying to the Cloud, the tool enables you to deploy to the Cloud in one click, just select your image and there you are, as simple as you want the deployment to be
Install
We install this like we would install any other extension. We open up Visual Studio Code, press the extensions button and type Docker, like so:
Authoring
There are two ways we can go about this:
- Create our Dockerfile or docker-compose.yml file and start authoring
- Have the extension generate the file for us
Let's show the latter.
Generate files
Bring up your command menu with CMD + SHIFT + P on a Mac and start typing Docker. It should show you this:
Select Add Docker Files to Workspace
Then we are prompted with the following, Select platform:
We go with Node.js because that's what we are trying to build. If you have a Go or .NET Core project, select that instead.
Lastly we are asked to select a port, we go with the suggested default of 3000.
What dialogs you need to go through after selecting a platform might differ depending on your choice of platform.
The generated files
Ok then, what did we get from this?
We got:
- docker-compose.yml
- docker-compose.debug.yml
- Dockerfile
- .dockerignore
Not just empty files either, they come loaded with content.
Dockerfile
Let's look at the Dockerfile, for example:
FROM node:10.13-alpine
ENV NODE_ENV production
WORKDIR /usr/src/app
COPY ["package.json", "package-lock.json*", "npm-shrinkwrap.json*", "./"]
RUN npm install --production --silent && mv node_modules ../
COPY . .
EXPOSE 3000
CMD npm start
Above we see that everything is done for us. It has:
- Selected an image
- Set an env variable
- Set a workdir
- Copied package.json and package-lock.json
- Installed our libraries
- Copied our application files
- Exposed a port
- Issued a command that will start our app up in the container
Quite impressive! Of course, we still need to author our app.
docker-compose.yml
Let's look at the docker-compose.yml file next:
version: '2.1'

services:
  articles:
    image: articles
    build: .
    environment:
      NODE_ENV: production
    ports:
      - 3000:3000
It has set everything up in terms of how to build the image, set an environment variable and mapped a port.
docker-compose.debug.yml
This gives us a very similar looking file to docker-compose.yml, but with the difference that it runs node in inspect mode, like so: command: node --inspect index.js
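The exact content differs between versions of the extension, but a minimal sketch of what such a debug file can look like is shown below (the extra 9229 port mapping for the Node.js inspector and the development environment value are assumptions, the command is the one mentioned above):

version: '2.1'

services:
  articles:
    image: articles
    build: .
    environment:
      NODE_ENV: development
    ports:
      - 3000:3000
      # 9229 is the default Node.js inspector port, assumed here
      - 9229:9229
    command: node --inspect index.js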
.dockerignore
This file contains a lot of good patterns that match files we don't want to copy over, like node_modules, .git, .env and Dockerfile. You might want to adjust this file to fit your needs.
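Your generated file may list more patterns than this, but based on the ones mentioned above it looks roughly like this:

# files and folders that should not be sent to the Docker build context
node_modules
npm-debug.log
.git
.gitignore
.env
Dockerfile*
docker-compose*
.dockerignore
.vscode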
Authoring with autocomplete
Ok. Let's look at a scenario where we do everything from scratch. Let's start off by creating a Dockerfile.
Let's start typing FROM. As you can see below we get help with typing the command and what it should look like:
We keep on typing the name of our baseImage, in this case we are looking for a Node.js image, so we start typing the character n. Below we get a list of options matching what we are typing. It lists the base images by popularity and also adds some useful information so we understand what we are getting:
The next thing we try to type is ENV, but we only get as far as E before it starts suggesting what command we are writing and how to type it:
As you can see, it's quite helpful in showing us how we should type the command.
Next up is WORKDIR, and it shows us:
Not only is it telling us how to type the command, it also tells us that it affects commands like COPY and ADD etc.
At this point we want to tell it to copy some files we might need before running commands like installing a library:
This gives us the two different ways in which we can copy things, relative or absolute.
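To make the difference concrete, here are two small examples of what that can look like (the file and path names are just for illustration):

# relative, copies package.json into the current WORKDIR
COPY package.json .

# absolute, copies package.json to an absolute path inside the image
COPY package.json /usr/src/app/package.json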
As mentioned, we want to run a command so we can install things. Our autocomplete tells us the following:
Again, it suggests what kind of commands those might be.
This far into the Dockerfile we might want to COPY our application files, and we've already shown you how to use the autocomplete for that, so let's look at EXPOSE:
As you can see, it shows all the different ways in which you can expose a port, really educational.
Ok, one more command is usually needed at this point, either we use a CMD or an ENTRYPOINT to start up our app in the container:
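As a quick illustration of the two, not generated by the extension, just to show the shape of each instruction:

# CMD sets a default command that can be overridden when you run the container
CMD ["npm", "start"]

# ENTRYPOINT makes the container always run this executable,
# anything you pass to docker run is appended as arguments
ENTRYPOINT ["node", "app.js"]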
Manage
We will use the Command Palette here. There is an almost ridiculous amount of commands it lets us invoke. Let's try to mention them by topic though. The Command Palette consists of a long, long list of Docker commands. We can:
- Build a Docker image
- Run a container
- See logs
- Stop/Remove a container
- Stop/Remove an image
- And much, much more...
Let's focus on getting an app up and running.
Build the app
Ok, this is a really simple app, so let's turn it into a Node.js app by going to the terminal and running:
npm init -y
then run:
npm install express
followed by adding the following to app.js:
const express = require('express')
const app = express()
const port = 3000
app.get('/', (req, res) => res.send('Hello World!'))
app.listen(port, () => console.log(`Example app listening on port ${port}!`))
Lastly, update package.json by adding the following to scripts:
"start": "node.app.js"
Now we are ready!
For all the commands below, bring up the Command Palette with View / Command Palette from the menu, or invoke the keyboard shortcut, on a Mac it is CMD + SHIFT + P
Build the image
Start typing Docker: Build, the autocomplete will narrow down the choices. Invoke the suggested command.
This will ask us if we want to use the Dockerfile in the current directory and what to tag the image with. After we've made our choices it sets about pulling down the base image and carries out all the commands in the Dockerfile.
Once it's done you should be able to see the newly built image by typing docker images and looking for the tag name that you gave it, it should be listed at the top.
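If you prefer doing the same thing from the terminal, it looks roughly like this (articles is just the tag we happened to choose):

# build the image from the Dockerfile in the current directory
docker build -t articles:latest .

# list local images, the newly built one should be at the top
docker images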
Run the image
Start typing Docker: Run and take the command it suggests. This will give you a list of Docker images you could run. Looking at the command it invokes in the terminal, it looks like so: docker run --rm -d -p 3000:3000/tcp articles:latest
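To verify that the container is actually up and serving our app, a quick check in the terminal could be (assuming the Express app and port from earlier):

# list running containers, the articles image should show up here
docker ps

# the Express app should answer with Hello World!
curl http://localhost:3000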
Docker Compose
Of course, we can leverage the power of Docker Compose, both up and down.
Start typing Docker: Compose Up, this will create the Docker images the first time it's run, followed by running the containers. Verify this with docker ps. Additionally, we have Docker: Compose Down and Docker: Compose Restart.
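The terminal equivalents, if you'd rather type them yourself, look roughly like this:

# build (if needed) and start all services defined in docker-compose.yml
docker-compose up -d

# check that the containers are running
docker-compose ps

# stop and remove the containers again
docker-compose down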
Browse repositories
At the bottom of your action bar, you should have an icon that looks like a Docker whale. Click that and you should be faced with:
As you can see above, you can view all the images on your machine, but you can also look in different registries such as Docker Hub, Azure and any private registries you've added. To use the Azure one you need the Azure Account extension installed. Once that is installed you should see something like this:
There are more commands we can carry out if we right-click on a Docker image in our container registry in Azure:
As you can see we can look at our resource in the portal. We can remove the entire repository but we can also PULL down whatever is there to our local machine.
Deploy to Cloud
There is one way to deploy to the Cloud:
- Deploy from Docker Hub
This article covering the extension says deployment from a Container Registry should be possible. I'm sure it is, I just couldn't figure out how to do it from the extension. I will update the article as soon as I do figure it out.
Anyway, to deploy from Docker Hub you just need to log in to it, right-click your Docker image and select Deploy, like so:
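Note that the image needs to exist on Docker Hub before it can be deployed from there. If you haven't pushed it yet, the terminal route looks roughly like this (replace <your-dockerhub-user> with your own account, articles is the tag from earlier):

# log in to Docker Hub
docker login

# tag the local image with your Docker Hub user name
docker tag articles:latest <your-dockerhub-user>/articles:latest

# push it to Docker Hub
docker push <your-dockerhub-user>/articles:latest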
Summary
We've shown you a lot of things you can do with this Visual Studio Code extension. You can manage your images and containers and do all sorts of things with them, like build them, run them, see the logs and even bring them to the Cloud.
I hope you found this useful and that you give the extension a go.