Building serverless apps requires quite a change in thinking; all these functions and their cold starts can feel like a whole new world to people who are used to their servers. Fly.io tries to ease this transition with a service that lets us deploy serverless containers instead of functions.
Based on AWS Firecracker, the micro-VM technology that also powers AWS Lambda, Fly.io offers a way to run containers PaaS-style, but with serverless-style on-demand pricing.
I built a small WebSocket-based chat application with it, so you can see what Fly.io does for you!
Serverless Container Chat
TL;DR You can find the code to the example app on GitHub.
We will build a simple chat app based on Node.js, Express, and Socket.IO. All deployed into a Fly.io container!
Prerequisites
To follow this tutorial, you need the following:
- Node.js and NPM installed
- A Fly.io account
- The fly CLI installed and authenticated
Set Up a Node.js Project
We can use NPM to create a new Node.js project:
$ mkdir flyio-websockets
$ cd flyio-websockets
$ npm init -y
Install NPM Packages
Since we will be using Express and Socket.IO, we have to install them first:
$ npm i express socket.io
Implement the Backend
To implement the backend, we have to create an index.js file that contains the following code:
const http = require("http");
const express = require("express");
const socketIo = require("socket.io");

const app = express();
const httpServer = http.createServer(app);
const wsServer = socketIo(httpServer);

// Deliver the client page
app.get("/", (request, response) =>
  response.sendFile(__dirname + "/index.html")
);

let userNumber = 1;

// Relay every incoming "msg" event to all connected clients,
// prefixed with a simple per-connection user ID
wsServer.on("connection", (socket) => {
  const userId = "User-" + userNumber++;
  socket.on("msg", (msg) =>
    wsServer.emit("msg", userId + ": " + msg)
  );
});

httpServer.listen(8080);
Less than 20 lines of code are needed to create the backend. First, we set up the HTTP and WebSocket server and then tell them what to do.
The HTTP server will deliver the Socket.IO client library and an index.html file that will act as our client later.
The WebSocket server will wait for WebSocket connections and pass on every msg event it gets to all clients. It will also add a user ID to the messages, so the clients can see who wrote them.
Then we listen on port 8080.
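Note that wsServer.emit also sends each message back to its sender, which is what the client below relies on to display your own messages. If you ever wanted to relay a message to everyone except the sender, Socket.IO's socket.broadcast.emit could be used instead. A minimal sketch of that variation, reusing wsServer and userNumber from above (it is not part of the example app):
// Variation: relay "msg" events to every client except the sender.
// socket.broadcast.emit skips the socket the event came from.
wsServer.on("connection", (socket) => {
  const userId = "User-" + userNumber++;
  socket.on("msg", (msg) =>
    socket.broadcast.emit("msg", userId + ": " + msg)
  );
});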
Implement the Client
For the client, which will be delivered by the HTTP server, we have to create an index.html file that should contain this code:
<!DOCTYPE html>
<title>Serverless Container Chat</title>
<form>Say Something: <input /></form>
<hr>
<div></div>
<script src="/socket.io/socket.io.js"></script>
<script>
const socket = io();
const form = document.querySelector("form");
const input = document.querySelector("input");
const div = document.querySelector("div");
form.addEventListener("submit", (event) => {
event.preventDefault();
socket.emit("msg", input.value);
input.value = "";
return false;
});
socket.on("msg", (msg) => {
div.innerHTML += `${msg}<hr>`;
});
</script>
As you can see, the HTTP server also hosts the Socket.IO client library at /socket.io/socket.io.js, so we can include it right away without any external dependencies.
Adding a Start Script
A start script isn't strictly needed to try our current setup, but the Fly.io service will use the NPM start script later, so let's create one now. This way, we can also check locally whether everything works as intended.
The package.json should contain the following code:
{
"name": "flyio-websockets",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"start": "node ."
},
"keywords": [],
"author": "",
"license": "ISC",
"dependencies": {
"express": "^4.17.1",
"socket.io": "^2.3.0"
}
}
Test the App Locally
To run the app, we can now use our NPM script!
$ npm start
We can access the app in a browser at http://localhost:8080/ and see that everything is working.
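If you want to check the broadcast behavior without opening two browser tabs, a small Node script using the socket.io-client package can connect to the local server. This is just a quick sketch and assumes you install socket.io-client (in a 2.x version, matching the server) separately:
// test-client.js: quick local smoke test (assumes: npm i socket.io-client@2)
const io = require("socket.io-client");

const socket = io("http://localhost:8080");

// Log every broadcast we receive, including our own message
socket.on("msg", (msg) => console.log("received:", msg));

// Send one message once connected
socket.on("connect", () => socket.emit("msg", "hello from the test client"));
Run it with node test-client.js in a second terminal while npm start is running, and the message should show up in every open browser tab as well.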
Add Fly.io Support
Now that we have a self-contained Node.js app, we can add Fly.io support by creating a fly.toml configuration file that contains information for our deployment.
app = "flyio-websockets"

[build]
  builtin = "node"

[[services]]
  internal_port = 8080
  protocol = "tcp"

  [services.concurrency]
    hard_limit = 25
    soft_limit = 20

  [[services.ports]]
    handlers = ["http"]
    port = "80"

  [[services.ports]]
    handlers = ["tls", "http"]
    port = "443"

  [[services.tcp_checks]]
    interval = 10000
    timeout = 2000
Since Fly.io will deploy your app at <APP_NAME>.fly.dev, you have to replace flyio-websockets with a name of your choice.
The build part will tell Fly.io that we run a Node.js application. The fly CLI will bundle your project up and upload it to the Fly.io service, which will try to install our dependencies and put everything into a Docker image that will become the base for our deployment.
The services part defines the container-specific port we used in our Node.js app. With services.ports, we tell Fly.io how we want to access this port from the outside.
Then there are some additional configurations for scaling and health checks.
That's all there is to it—just a small TOML file.
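One detail worth keeping in mind: internal_port in fly.toml has to match the port our Node.js server actually listens on (8080 in both places here). If you prefer to keep that configurable, the server could read the port from an environment variable instead. A small sketch, assuming a PORT variable that you would set yourself, for example via an [env] section in fly.toml (it is not set automatically in this setup):
// index.js: listen on a configurable port, falling back to 8080
// (PORT is a variable you would have to provide yourself)
const port = process.env.PORT || 8080;
httpServer.listen(port);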
Deploy the App
Now, we're finally ready to deploy our application to Fly.io.
You can deploy with the following command:
$ fly deploy
The command will look into the fly.toml, upload our files, and tell the Fly.io service to build and run everything.
After a few seconds, the command should finish, and we can run the status command.
$ fly status
The output of the status command should include the URL our app is hosted at.
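To verify that WebSocket connections also work through Fly.io's proxy, the same kind of socket.io-client script can point at the deployed URL. In the sketch below, <APP_NAME> is a placeholder for the app name from your fly.toml:
// Connect to the deployed app instead of localhost
// (replace <APP_NAME> with the app name from your fly.toml)
const io = require("socket.io-client");
const socket = io("https://<APP_NAME>.fly.dev");

socket.on("msg", (msg) => console.log("received:", msg));
socket.on("connect", () => socket.emit("msg", "hello from the deployed app"));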
Serverless Container as an Alternative
The serverless paradigm helps us to minimize operational overhead in the long run. But since everyone and their dog "has a problem that doesn't fit serverless," containers won't go away soon.
Fly.io seems to be a good offering in that space, and when it gets support in IaC tools, I think it can save quite some time for folks out there.
What do you think? Do you want more Heroku-like offerings or is serverless going in the right direction? Tell me in the comments!
Top comments (8)
AFAIK fly.io is not serverless.
Depends on the definition.
If you say serverless is function as a service, then Fly isn't serverless.
If you say serverless is on-demand pricing and automatic scaling, then Fly is serverless.
Serverless is not just on-demand pricing. Serverless, in my understanding, is that the compute comes up, does the work, and is removed, which enables per-use or per-invocation pricing, in very simple terms.
Very late to the party, but serverless scales to 0, meaning if it's not doing anything you pay $0 (other than storage costs).
Great article! By the way, any particular reason you chose socket.io instead of ws? I'm weighing the two options for my next project.
Because I already knew it and I wanted to write that article in under an hour, haha.
You could also look into Socket-Cluster, heard good things about it.
Depending on the requirements you could also look into managed services like AWS AppSync and Google Firebase.
Thanks for your recommendation!
I ended up using Socket.IO. It feels like it has enough of the built-in features I'll need in my project. ws feels too low-level, and I suppose I'd have to develop a lot of boilerplate for my use case.
The extra performance boost from ws is nice, but I suppose I don't need it in the current state of the project.
Anyway, thanks again!
I like to pass the ws instance in with the user object for recall later: github.com/nerdfiles/chatServer/bl...