Eyal Estrin for AWS Community Builders

Originally published at eyal-estrin.Medium

Introduction to Serverless Container Services

When developing modern applications, we almost immediately think about wrapping our application components inside Containers. It is not the only architectural alternative, but it is a very common one.

Assuming our developers and DevOps teams have the required expertise to work with Containers, we still need to think about maintaining the underlying infrastructure – i.e., the Container hosts.

If our application has a steady and predictable load, we do not have experience maintaining Kubernetes clusters, and we do not need the capabilities Kubernetes provides, it is time to consider an easier and more stable alternative for deploying our applications on top of Container infrastructure.

In the following blog post, I will review the alternatives for running Container workloads on top of Serverless infrastructure.

Why do we need Serverless infrastructure for running Container workloads?

Container architecture consists of a Container engine (such as Docker or CRI-O) deployed on top of a physical or virtual server, and on top of the Container engine we run our application Containers, created from Container images.
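To make the layering concrete, here is a minimal sketch, assuming the docker Python SDK and a locally running Docker engine (both specific to your environment), that asks the Container engine to run a Container from an image:

```python
# A minimal sketch of the layers described above: a Container engine (here, a
# local Docker daemon) running a Container image on top of the host OS.
# Assumes the `docker` Python package is installed and a Docker engine is running.
import docker

client = docker.from_env()  # connect to the local Container engine

# Pull a public image and run it as a detached Container
container = client.containers.run(
    "nginx:alpine",          # Container image
    detach=True,
    ports={"80/tcp": 8080},  # expose the web server on the host
    name="demo-nginx",
)

print(f"Started container {container.short_id} on the local Container engine")
```

Everything below that call, the Container engine, the host operating system, and the underlying server, is ours to maintain in this model.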

The diagram below shows a common Container architecture:

[Diagram: a common Container architecture]

If we focus on the Container engine and the underlying operating system, we understand that we still need to maintain the operating system itself.

Common maintenance tasks for the operating system:

  • Make sure it has enough resources (CPU, memory, storage, and network connectivity) for running Containers (see the resource-check sketch after this list)
  • Make sure the operating system is fully patched and hardened against external attacks
  • Make sure our underlying infrastructure (i.e., the Container host nodes) provides high availability in case one of the host nodes fails and needs to be replaced
  • Make sure our underlying infrastructure provides the scale our application requires (i.e., scales out or in according to the application load)
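Even the first item alone is routinely scripted and monitored. Below is a minimal sketch, assuming the psutil package and purely illustrative thresholds, of the kind of host resource check we end up owning when we operate our own Container hosts:

```python
# A minimal sketch of a resource check on a Container host node.
# Assumes the `psutil` package; the thresholds below are arbitrary examples.
import psutil

THRESHOLDS = {"cpu": 80.0, "memory": 85.0, "disk": 90.0}  # percent, illustrative

def host_has_headroom() -> bool:
    """Return True if the Container host is below all resource thresholds."""
    usage = {
        "cpu": psutil.cpu_percent(interval=1),
        "memory": psutil.virtual_memory().percent,
        "disk": psutil.disk_usage("/").percent,
    }
    for resource, value in usage.items():
        if value >= THRESHOLDS[resource]:
            print(f"{resource} usage at {value:.1f}% exceeds {THRESHOLDS[resource]}%")
            return False
    return True

if __name__ == "__main__":
    print("Host has headroom:", host_has_headroom())
```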

Instead of maintaining the underlying host nodes ourselves, we should look for a Serverless solution that allows us to focus on application deployment and maintenance and reduces, as much as possible, the work of maintaining the infrastructure.

Comparison of Serverless Container Services

Each of the hyperscale cloud providers offers us the ability to consume a fully managed service for deploying our Container-based workloads.

Below is a comparison of AWS, Azure, and Google Cloud alternatives:

[Table: comparison of the Serverless Container services offered by AWS, Azure, and Google Cloud]
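To illustrate what consuming such a service looks like in practice, here is a minimal sketch using boto3 against AWS ECS with the Fargate launch type (AWS's Serverless option for Containers); the cluster name, subnet ID, and image are hypothetical placeholders. Note that no Container host node, AMI, or instance type appears anywhere in the calls:

```python
# A minimal sketch of launching a Container on a Serverless Container service,
# here AWS ECS with the Fargate launch type via boto3. The cluster name, subnet
# ID, and image are hypothetical placeholders for your own environment.
import boto3

ecs = boto3.client("ecs", region_name="eu-west-1")

# Describe the Container; no host node, AMI, or capacity is specified anywhere.
task_def = ecs.register_task_definition(
    family="demo-web",
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="256",     # 0.25 vCPU
    memory="512",  # 512 MiB
    containerDefinitions=[
        {
            "name": "web",
            "image": "public.ecr.aws/nginx/nginx:latest",
            "portMappings": [{"containerPort": 80, "protocol": "tcp"}],
        }
    ],
)

# Run it; the provider supplies and maintains the underlying compute.
ecs.run_task(
    cluster="demo-cluster",  # hypothetical cluster
    launchType="FARGATE",
    taskDefinition=task_def["taskDefinition"]["taskDefinitionArn"],
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123456789abcdef0"],  # hypothetical subnet
            "assignPublicIp": "ENABLED",
        }
    },
)
```

The same consumption model applies on Azure and Google Cloud: we describe the Container image and its resource requirements, and the provider owns the host nodes.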

Summary

In this blog post, I have compared alternatives for deploying a microservice architecture on top of the Serverless Container services offered by AWS, Azure, and GCP.

When designing your next application based on a microservice architecture, assuming you do not need a full-blown Kubernetes cluster (with all of its features and complexities), consider using a Serverless Container service.


About the Author

Eyal Estrin is a cloud and information security architect, the owner of the blog Security & Cloud 24/7 and the author of the book Cloud Security Handbook, with more than 20 years in the IT industry.

Eyal has been an AWS Community Builder since 2020.

You can connect with him on Twitter.

Opinions are his own and not the views of his employer.
