This article was originally authored by Ismael Messa on the NumericaIdeas blog.
Introduction
Cold Starts, the delay incurred while Lambda initializes a new execution environment before it can serve an invocation, are a common issue on Serverless platforms. In situations where low latency is essential, Cold Starts can disrupt the smooth operation of workloads. To address this problem, various strategies have been developed, including Lambda SnapStart, Provisioned Concurrency, and a Custom Warmer, each with its own approach. This article compares these three strategies based on several factors.
Lambda SnapStart
How it Works
As previously mentioned, SnapStart is a performance optimization designed to reduce the initialization time of a Lambda function. The feature is fully managed by AWS: a snapshot of the initialized function is taken when a version is published, and subsequent invocations resume from that cached snapshot instead of running the initialization from scratch. According to AWS, SnapStart can improve cold start latency by up to 90%.
Pricing
SnapStart is a free feature; it incurs no additional cost.
Supported Runtime
At the moment, it is only available for the Java 11 (Corretto) runtime.
Complexity to Set Up
It can be accessed via the AWS console and does not necessitate any modifications to your source code. Simply activate the feature and let it work its magic.
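For illustration, here is a minimal sketch of doing the same thing programmatically with the AWS SDK for JavaScript v3; the function name and region below are placeholders, and keep in mind that SnapStart only takes effect on published versions.

```typescript
// Minimal sketch (AWS SDK for JavaScript v3): enable SnapStart, then publish
// a version so Lambda takes and caches the snapshot. Names are placeholders.
import {
  LambdaClient,
  UpdateFunctionConfigurationCommand,
  PublishVersionCommand,
} from "@aws-sdk/client-lambda";

const lambda = new LambdaClient({ region: "us-east-1" });

async function enableSnapStart(functionName: string): Promise<void> {
  // SnapStart snapshots are only taken for published versions.
  await lambda.send(
    new UpdateFunctionConfigurationCommand({
      FunctionName: functionName,
      SnapStart: { ApplyOn: "PublishedVersions" },
    })
  );

  // In practice, wait for the configuration update to complete before publishing.
  const version = await lambda.send(
    new PublishVersionCommand({ FunctionName: functionName })
  );
  console.log(`SnapStart enabled, published version ${version.Version}`);
}

enableSnapStart("my-java-function").catch(console.error);
```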
Limit
One implication of SnapStart's snapshot-resume approach is that ephemeral data and credentials captured in the snapshot come with no freshness guarantees. If your code uses a library that generates an expiring token at initialization time, that token may already be expired when a new instance of the function is restored from the snapshot.
Likewise, if your code establishes a long-lived connection to a network service during the init phase, that connection may no longer be valid by the time the restored instance handles an invocation and will need to be re-established.
Here's a more detailed article that covers its impacts and how to set it up.
Provisioned Concurrency
How it Works
Provisioned Concurrency is an AWS feature that keeps your function warm and ready to respond in a matter of milliseconds, at the scale you specify. With this feature enabled, you select the number of instances of your function that stay initialized to handle incoming requests, rather than relying on Lambda to launch new instances as requests arrive (on-demand).
The distinctive aspect of Provisioned Concurrency is its rapid startup time, which is attributed to the fact that all setup processes, including the initialization code, occur before invocation. This ensures that the function remains in a state where your code is downloaded and the underlying container structure is configured. It is worth noting that this feature is only available with published versions or aliases of your function.
Pricing
There are additional costs related to it:
- You pay for how long the provisioned capacity is active.
- You pay for the number of concurrent instances you keep available.
Supported Runtime
It is available for all runtimes.
Complexity to Set Up
The Provisioned Concurrency option can be accessed via various channels, including the AWS Console, Lambda API, AWS CLI, AWS CloudFormation, or Application Auto Scaling, and does not require any modifications to the existing source code.
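As a rough illustration, the same setting can be applied from code with the AWS SDK for JavaScript v3; the function name, alias, and concurrency value below are placeholders.

```typescript
// Minimal sketch (AWS SDK for JavaScript v3): keep 5 instances of an alias
// initialized at all times. Names and values are placeholders.
import {
  LambdaClient,
  PutProvisionedConcurrencyConfigCommand,
} from "@aws-sdk/client-lambda";

const lambda = new LambdaClient({ region: "us-east-1" });

async function provisionConcurrency(): Promise<void> {
  await lambda.send(
    new PutProvisionedConcurrencyConfigCommand({
      FunctionName: "my-function",
      Qualifier: "live", // a published version or alias, never $LATEST
      ProvisionedConcurrentExecutions: 5, // instances kept warm and initialized
    })
  );
}

provisionConcurrency().catch(console.error);
```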
Limit
Provisioned Concurrency is not supported with Lambda@Edge.
Custom Warmer
How it Works
The Custom Warmer strategy aims to prevent Cold Starts by keeping the function warm through a pinging mechanism. This is achieved by using an Amazon EventBridge rule to schedule function invocations at regular intervals: at a chosen frequency, typically every 15 minutes, the rule automatically triggers the function so that it remains warm.
It is generally implemented with open-source libraries, but you are free to build a custom one manually, as sketched below.
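The scheduling side is just an EventBridge rule pointing at the function. Here is a minimal sketch with the AWS SDK for JavaScript v3; the rule name, function ARN, and the `{ warmer: true }` payload are assumptions used for illustration (warmer libraries define their own markers).

```typescript
// Minimal sketch (AWS SDK for JavaScript v3): an EventBridge rule that pings
// the function every 15 minutes. Names, ARN, and payload are placeholders.
import {
  EventBridgeClient,
  PutRuleCommand,
  PutTargetsCommand,
} from "@aws-sdk/client-eventbridge";

const events = new EventBridgeClient({ region: "us-east-1" });
const functionArn =
  "arn:aws:lambda:us-east-1:123456789012:function:my-function";

async function scheduleWarmer(): Promise<void> {
  // Create (or update) the schedule.
  await events.send(
    new PutRuleCommand({
      Name: "warm-my-function",
      ScheduleExpression: "rate(15 minutes)",
    })
  );

  // Point the rule at the function with a payload the handler can recognize.
  await events.send(
    new PutTargetsCommand({
      Rule: "warm-my-function",
      Targets: [
        {
          Id: "warm-my-function-target",
          Arn: functionArn,
          Input: JSON.stringify({ warmer: true }),
        },
      ],
    })
  );

  // Note: EventBridge also needs permission to invoke the function
  // (lambda:AddPermission with the events.amazonaws.com principal).
}

scheduleWarmer().catch(console.error);
```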
Pricing
There is no extra charge for scheduled rules on Amazon EventBridge; the periodic warming invocations themselves are billed as regular (very short) Lambda invocations, which is usually negligible.
Supported Runtime
You can use it with any runtime you need.
Complexity to Set Up
To implement a warming strategy, some changes to the source code are necessary, since the Warmer triggers a function invocation at a regular interval.
The function needs to identify whether an invocation comes from the Warmer and adjust its behavior accordingly, as demonstrated in the following example:
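(The snippet below is a minimal sketch rather than the repository's exact code: it assumes the warmer sends a payload flag such as `warmer: true`, matching the scheduling sketch above; warmer libraries each use their own marker.)

```typescript
// Minimal sketch: short-circuit warming pings before running real work.
// Assumes the warmer invokes the function with a payload like { "warmer": true }.
export const handler = async (event: { warmer?: boolean }) => {
  if (event.warmer === true) {
    // Warming ping: return immediately so the invocation stays short and cheap.
    return { warmed: true };
  }

  // Regular invocation: run the actual business logic.
  return {
    statusCode: 200,
    body: JSON.stringify({ message: "Hello from a warm Lambda!" }),
  };
};
```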
A complete sample implementation is available in the numerica-ideas/ni-microservice-nodejs repository (NumericaIdeas NodeJS basic microservice for server (EC2) and cloud function (Lambda) deployments), with the NPM script `npm run job:warm:env`.
Limit
It is important to note that this approach does not guarantee a complete elimination of Cold Starts. For instance, if the function is behind a Load Balancer, it may not always be effective since the LB can direct traffic to instances that are not warmed. Additionally, in production environments where functions scale out to handle increased traffic, there is no assurance that the new instances will be warmed up in time.
- - - - - - -
We have just started our journey to build a network of professionals and grow our free knowledge-sharing community even further. It gives you a chance to learn interesting things about topics like cloud computing, software development, and software architecture, while keeping the door open to more opportunities.
Does this speak to you? If YES, feel free to Join our Discord Server to stay in touch with the community and be part of independently organized events.
- - - - - -
If the Cloud is of interest to you, this video covers the 6 most important concepts you should know about it.
Important: other articles are published on NumericaIdeas's blog.
Conclusion
In summary, each of these strategies has its own unique approach to mitigating Cold Starts:
- SnapStart takes advantage of a snapshotting technique.
- Custom Warmer implements a scheduled ping mechanism.
- Provisioned Concurrency uses provisioned function instances.
It should be noted that Cold Starts are not a critical issue for most functions, as they occur in only about 1% of invocations. Nonetheless, we hope to have covered the significant differences between these strategies. If you have any suggestions or comments, please feel free to share them in the comments section below.
Thanks for reading this article, recommend and share if you enjoyed it. Follow us on Facebook, Twitter, LinkedIn for more content.