DEV Community


Easy Serverless Containers with Fly.io

Building serverless apps requires quite a change in thinking; all these functions and their cold-starts can feel like a whole new world to people who are used to their servers. Fly.io tries to ease the transition with a service that lets us deploy serverless containers instead of functions.

Based on AWS Firecracker, which also powers AWS Lambda, Fly.io offers a way to run containers PaaS-style, but with a serverless-y on-demand pricing.

I tried to build a small WebSocket-based chat application with it, so you can see what Fly.io does for you!

Serverless Container Chat

TL;DR You can find the code to the example app on GitHub.

We will build a simple chat app based on Node.js, Express, and Socket.IO. All deployed into a container!


To follow this tutorial, you need the following:

- Node.js (with NPM) installed
- A Fly.io account and the fly CLI

Setup a Node.js Project

We can use NPM to create a new Node.js project:

$ mkdir flyio-websockets
$ cd flyio-websockets
$ npm init -y

Install NPM Packages

Since we will be using Express and Socket.IO, we have to install them first:

$ npm i express socket.io

Implement the Backend

To implement the backend, we have to create an index.js file that contains the following code:

const http = require("http");
const express = require("express");
const socketIo = require("socket.io");

const app = express();
const httpServer = http.createServer(app);
const wsServer = socketIo(httpServer);

app.get("/", (request, response) =>
  response.sendFile(__dirname + "/index.html")
);

let userNumber = 1;
wsServer.on("connection", (socket) => {
  const userId = "User-" + userNumber++;
  socket.on("msg", (msg) =>
    wsServer.emit("msg", userId + ": " + msg)
  );
});

httpServer.listen(8080);

Only around 20 lines of code are needed to create the backend. First, we set up the HTTP and WebSocket servers and then tell them what to do.

The HTTP server will deliver the Socket.IO client library and an index.html file that will act as our client later.

The WebSocket server will wait for WebSocket connections and pass on every msg event it gets to all clients. It will also add a user ID to the messages, so the clients can see who wrote it.

Then we listen on port 8080.
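Hard-coding 8080 is fine here because our deployment configuration will point at that exact port, but if you want the same code to run in other environments, a common optional tweak (not part of the original app) is to let an environment variable override it:

```javascript
// Optional tweak: read the port from the environment, falling back
// to 8080 when PORT is unset. You would then call:
//   httpServer.listen(port);
const port = parseInt(process.env.PORT, 10) || 8080;
console.log("listening on", port);
```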

Implement the Client

For the client, which will be delivered by the HTTP server, we have to create an index.html file that should contain this code:

<!DOCTYPE html>
<html>
  <head>
    <title>Serverless Container Chat</title>
  </head>
  <body>
    <form>Say Something: <input /></form>
    <div></div>
    <script src="/socket.io/socket.io.js"></script>
    <script>
      const socket = io();

      const form = document.querySelector("form");
      const input = document.querySelector("input");
      const div = document.querySelector("div");

      form.addEventListener("submit", (event) => {
        event.preventDefault();
        socket.emit("msg", input.value);
        input.value = "";
      });

      socket.on("msg", (msg) => {
        div.innerHTML += `${msg}<hr>`;
      });
    </script>
  </body>
</html>

As you can see, the HTTP server also hosts the Socket.IO client library to include it right away with no external dependencies.

Adding a Start Script

It is not strictly needed to try our current setup, but later Fly.io will use the NPM start script to run our app, so let's create one. This way, we can check locally if everything works as intended.

The package.json should contain the following code:

{
  "name": "flyio-websockets",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "start": "node ."
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "dependencies": {
    "express": "^4.17.1",
    "socket.io": "^2.3.0"
  }
}

Test the App Locally

To run the app, we can now use our NPM script!

$ npm start

We can access the app in a browser under http://localhost:8080/ and see that everything is working.

Add Fly.io Support

Now that we have a self-contained Node.js app, we can add Fly.io support by creating a fly.toml configuration file that contains the information for our deployment.

app = "flyio-websockets"

[build]
  builtin = "node"

[[services]]
  internal_port = 8080
  protocol = "tcp"

  [services.concurrency]
    hard_limit = 25
    soft_limit = 20

  [[services.ports]]
    handlers = ["http"]
    port = "80"

  [[services.ports]]
    handlers = ["tls", "http"]
    port = "443"

  [[services.tcp_checks]]
    interval = 10000
    timeout = 2000

Since Fly.io will deploy your app at <APP_NAME>.fly.dev, you have to replace flyio-websockets with a name of your choice.

The build part tells Fly.io that we are running a Node.js application. The fly CLI will bundle your project up and upload it to the service, which will install our dependencies and put everything into a Docker image that becomes the base for our deployment.

The services part defines the container-internal port we used in our Node.js app. With services.ports, we tell Fly.io how we want to access this port from the outside.

Then there are some additional configurations for scaling and health checks.

That's all there is to it—just a small TOML file.

Deploy the App

Now, we're finally ready to deploy our application to Fly.io.

You can deploy with the following command:

$ fly deploy

The command will look into the fly.toml, upload our files, and tell the service to build and run everything.

After a few seconds, the command should finish, and we can run the status command.

$ fly status

The output of the status command should include the URL our app is hosted at.

Serverless Containers as an Alternative

The serverless paradigm helps us to minimize operational overhead in the long run. But since everyone and their dog "has a problem that doesn't fit serverless," containers won't go away soon. Fly.io seems to be a good offering in that space, and once it gets support in IaC tools, I think it can save quite some time for folks out there.

What do you think? Do you want more Heroku-like offerings or is serverless going in the right direction? Tell me in the comments!

Top comments (8)

Geshan Manandhar

AFAIK Fly.io is not serverless.

K

Depends on the definition.

If you say serverless is function as a service, then Fly isn't serverless.

If you say serverless is on-demand pricing and automatic scaling, then Fly is serverless.

Geshan Manandhar

Serverless is not just on-demand pricing. Serverless, in my understanding, is that the compute comes up, does the work, and is removed, which enables per-use or per-invocation pricing, in very simple terms.

garretharp

Very late to the party, but serverless scales to 0, meaning if it's not doing anything you pay $0 (other than storage costs).

Anzhari Purnomo

Great article! By the way, any particular reason you chose Socket.IO instead of ws? I'm weighing the two options for my next project.

K • Edited

Because I already knew it and I wanted to write that article in under an hour, haha.

You could also look into Socket-Cluster, heard good things about it.

Depending on the requirements you could also look into managed services like AWS AppSync and Google Firebase.

Anzhari Purnomo

Thanks for your recommendation!

I ended up using Socket.IO. It feels like it has enough built-in features for what I'll need on my project. ws feels too low-level, and I suppose I'd have to write a lot of boilerplate for my use case.

The extra performance boost from ws is nice, but I suppose I wouldn't need it in the current state of the project.

Anyway, thanks again!

⚫️ nothingness negates itself • Edited

i like to pass the ws instance in with the user object for recall later