Guilherme Dalla Rosa

The Top Serverless Announcements from AWS re:Invent 2023

With numerous AI-related announcements, this year's re:Invent marked a significant shift towards AI. The standout was Amazon Q, which was revealed during the keynote. Amazon Q, AWS's counterpart to ChatGPT, integrates into the AWS console, AWS documentation pages, and IDEs via the VS Code plugin, AWS Toolkit. Uniquely trained on AWS documentation and immune to the restrictions of ai.txt, Amazon Q promises more current answers than ChatGPT.

Steering away from the AI buzz (if that's even possible), let's dive into the serverless realm and look at the groundbreaking announcements that were made and how they might revolutionise our tech toolkit.

ElastiCache "serverless"

The launch of Amazon ElastiCache Serverless marks a significant stride in AWS's serverless offerings. This new service addresses many limitations of the traditional ElastiCache, making it more user-friendly and fitting the serverless model more closely.

The key highlights include:

  • Simplified Operation: No need to choose instance types or worry about bandwidth and TPS limits.

  • Native API Support: Supports the Memcached and Redis APIs, easing migration from server-based setups (see the connection sketch after this list).

  • Multi-AZ and VPC Support: Offers built-in high availability and works with VPCs from day one.

  • Eliminated Autoscaling Groups: Autoscaling is more straightforward, although it can take time to scale up during sudden spikes.

  • Pay-per-use Pricing: A move towards a dynamic pricing strategy aligned with serverless computing principles.
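
Because the service speaks the same Redis protocol, existing client code mostly just needs the new endpoint and TLS enabled. Here's a minimal sketch with redis-py, using a placeholder endpoint (serverless caches enforce in-transit encryption and are only reachable from inside their VPC):

```python
import redis

# Placeholder endpoint: copy the real one from the ElastiCache console.
cache = redis.Redis(
    host="my-cache-abc123.serverless.use1.cache.amazonaws.com",
    port=6379,
    ssl=True,                 # in-transit encryption is mandatory for serverless caches
    decode_responses=True,
)

cache.set("greeting", "hello from ElastiCache Serverless", ex=300)  # 5-minute TTL
print(cache.get("greeting"))
```

The same snippet works against a traditional Redis cluster, which is exactly what makes migrating existing workloads straightforward.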

While Amazon ElastiCache Serverless introduces several improvements over its predecessors, it has sparked mixed feelings within the tech community, particularly concerning its pricing and its serverless credentials.

The pricing, notably high, includes a minimum charge of around $90 per month, even for minimal data storage. For instance, storing just slightly over 1 GB can cost roughly $180 monthly. This contrasts with on-demand instances, where comparable storage is significantly cheaper.
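
For context, here is the back-of-the-envelope arithmetic behind those figures, assuming the commonly quoted storage rate of roughly $0.125 per GB-hour (the exact rate varies by region, and ECPU charges for request processing come on top of storage):

```python
# Rough monthly storage cost, assuming ~$0.125 per GB-hour (region-dependent).
rate_per_gb_hour = 0.125
hours_per_month = 24 * 30              # 720

minimum_billed_gb = 1                  # serverless caches bill for at least 1 GB of storage
print(minimum_billed_gb * hours_per_month * rate_per_gb_hour)  # ~ $90/month floor

# As the example above implies, usage just over 1 GB ends up metered as 2 GB.
print(2 * hours_per_month * rate_per_gb_hour)                  # ~ $180/month
```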

Operational concerns also come into play. Because the cache sits inside a VPC, Lambda functions that use it must run in that VPC as well, which brings additional VPC-related expenses. Additionally, the scaling capability, which only allows doubling capacity every 10 minutes, is perceived as sluggish. This combination of high costs and operational limitations has raised questions about the service's practicality and affordability, particularly for sporadic usage scenarios.

Lambda Scales 12x Faster

AWS Lambda has revolutionised its burst concurrency limits, significantly enhancing its scaling capabilities. Previously constrained by a region-wide burst limit of 500–3,000 (depending on the region) and a slow refill rate of 500 additional concurrent executions per minute, Lambda now lets each function burst to 1,000 concurrent executions instantly. What's more, this limit increases by another 1,000 every 10 seconds, with each function scaling independently.

This change is a game-changer for scenarios with sudden traffic spikes, like flash sales. For instance, with an average request time of 100ms, a single concurrent execution can handle 10 requests per second. So a single function can now burst to 10,000 requests per second (1,000 executions × 10 requests each), with an additional 10,000 requests per second unlocked every 10 seconds, up to your account-level concurrency limit.
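
A quick sketch of how that ramp plays out over the first minute, under the same 100 ms assumption and ignoring the account-level concurrency cap:

```python
# How per-function request capacity grows under the new scaling model.
avg_request_seconds = 0.100                   # 100 ms average duration
rps_per_execution = 1 / avg_request_seconds   # 10 requests/second per concurrent execution

initial_burst = 1_000                         # immediate concurrency per function
ramp_per_10s = 1_000                          # additional concurrency every 10 seconds

for t in range(0, 70, 10):                    # seconds since the spike started
    concurrency = initial_burst + (t // 10) * ramp_per_10s
    print(f"t={t:>2}s  concurrency={concurrency:>5}  ~{concurrency * rps_per_execution:,.0f} req/s")
```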

This level of scalability introduces new considerations, especially around system bottlenecks. For example, API Gateway, with its default account-level limit of 10,000 requests per second, could now become a throttle point.
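
If you want to see where that ceiling currently sits for your account, the account-level throttle settings for REST APIs are exposed through the API Gateway API; a small sketch with boto3:

```python
import boto3

apigw = boto3.client("apigateway")

# Account-level settings include the regional throttle applied across REST APIs.
account = apigw.get_account()
throttle = account["throttleSettings"]
print(f"rate limit:  {throttle['rateLimit']} requests/second")
print(f"burst limit: {throttle['burstLimit']} requests")
```

Raising those defaults is a quota-increase request, so it's worth checking before the flash sale rather than during it.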

Step Functions Enhancements

AWS Step Functions has introduced several significant updates:

  1. Public HTTP Endpoints: Step Functions can now directly call any public API, eliminating the need for Lambda or API Gateway proxies. Authentication is handled through EventBridge connections, so existing HTTP connections can be reused.

  2. Testing Individual States: You can test individual states in your state machine without a full execution. This is facilitated by the new TestState API, enabling programmatic testing (see the sketch after this list).

  3. Integration with AWS Application Composer: Step Functions now integrates with AWS Application Composer, allowing for easy inclusion and editing of state machines within stacks.

  4. Optimised Integration with Bedrock: Enhanced support for AI app development, though Lambda remains preferable for streaming responses, especially for frontend applications.
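
Here is a minimal sketch tying the first two items together: a single HTTP Task state that calls a public API directly, exercised through the TestState API with boto3. The endpoint, EventBridge connection ARN, and role ARN are placeholders you would swap for your own:

```python
import json
import boto3

sfn = boto3.client("stepfunctions")

# A single HTTP Task state: Step Functions calls the public endpoint directly,
# authenticating through an EventBridge connection (no Lambda proxy required).
http_state = {
    "Type": "Task",
    "Resource": "arn:aws:states:::http:invoke",
    "Parameters": {
        "ApiEndpoint": "https://api.example.com/orders",  # placeholder endpoint
        "Method": "GET",
        "Authentication": {
            # placeholder EventBridge connection holding the API credentials
            "ConnectionArn": "arn:aws:events:us-east-1:123456789012:connection/example/abc"
        },
    },
    "End": True,
}

# Test just this one state: no deployment or full state machine execution needed.
result = sfn.test_state(
    definition=json.dumps(http_state),
    roleArn="arn:aws:iam::123456789012:role/StepFunctionsHttpTestRole",  # placeholder role
    input=json.dumps({}),
    inspectionLevel="TRACE",  # TRACE also surfaces the raw HTTP request and response
)
print(result["status"], result.get("output"))
```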

SQS FIFO Throughput Massive Increase

AWS has dramatically increased the throughput for SQS FIFO, now enabling processing of up to 70,000 messages per second in high throughput mode. This enhancement marks a significant leap in handling large volumes of messages efficiently, catering to more demanding and high-traffic applications.
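
High throughput mode is opt-in per queue: you scope deduplication to the message group and apply the throughput limit per MessageGroupId. A minimal sketch with boto3 (the queue name is hypothetical):

```python
import boto3

sqs = boto3.client("sqs")

# Create a FIFO queue in high throughput mode: deduplication is scoped to the
# message group, and the throughput limit applies per MessageGroupId.
queue = sqs.create_queue(
    QueueName="orders.fifo",  # hypothetical name; FIFO queue names must end in .fifo
    Attributes={
        "FifoQueue": "true",
        "ContentBasedDeduplication": "true",
        "DeduplicationScope": "messageGroup",
        "FifoThroughputLimit": "perMessageGroupId",
    },
)
print(queue["QueueUrl"])
```

Getting anywhere near the headline number still depends on spreading messages across many MessageGroupIds, since ordering (and therefore throughput) is enforced per group.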

Aurora Limitless Database

Amazon Aurora has introduced the Aurora Limitless Database (announced in preview), a significant upgrade allowing clusters to scale to millions of write transactions per second and manage petabytes of data. While most users may not require this extreme level of scalability, the technical achievement is impressive and noteworthy.

Final thoughts

re:Invent 2023 showcased an array of remarkable advancements, particularly in the Serverless domain. From the significant scaling improvements in Lambda and SQS FIFO to the innovative features in Step Functions and the technical prowess of Aurora Limitless, AWS is pushing the boundaries of what's possible in cloud computing. While some offerings, like ElastiCache Serverless, sparked debate over pricing and operational aspects, the overall direction is clear: AWS is committed to providing more robust, scalable, and efficient solutions, driving the future of Serverless computing forward. As we embrace these changes, it's exciting to ponder how they will shape our technological landscape in the coming years.

Top comments (2)

Dom Sipowicz

Uniquely trained on AWS documentation and immune to the restrictions of ai.txt

hahah. Let's see how this ages