⚡️ 10 Ways to Use Serverless Functions

Nader Dabit on January 08, 2020

Cover image by @designecologist. In this post, I'll show you some of the most powerful and useful ways to use serverless functions.
Jared

An issue I ran into with Lambda image processing is the 6 MB payload limit. Did you get around that somehow?

Nader Dabit

The way this is usually worked around is to use a storage service like Amazon S3. From the Lambda you can fetch the image or video and process it from within the function without having to pass the data itself as a payload.
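
For illustration, here's a minimal sketch of the kind of function described above: an S3-triggered Lambda that fetches the object and resizes it inside the function, so only a reference to the image travels in the event. This assumes the aws-sdk v2 and sharp libraries; the bucket names and target width are placeholders, not taken from Nader's repo.

```typescript
// Sketch of an S3-triggered resize function (assumed setup, not the author's exact code).
import { S3Handler } from 'aws-lambda';
import * as AWS from 'aws-sdk';
import sharp from 'sharp';

const s3 = new AWS.S3();

export const handler: S3Handler = async (event) => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    // Object keys arrive URL-encoded in the event notification
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));

    // Fetch the full image from S3 inside the function; the invocation
    // payload only contained the bucket/key reference above
    const original = await s3.getObject({ Bucket: bucket, Key: key }).promise();

    // Resize to a placeholder width of 800px
    const resized = await sharp(original.Body as Buffer).resize(800).toBuffer();

    // Write to a different bucket so the function doesn't re-trigger itself
    await s3
      .putObject({
        Bucket: `${bucket}-resized`, // placeholder destination bucket
        Key: key,
        Body: resized,
        ContentType: original.ContentType,
      })
      .promise();
  }
};
```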

Jared

I see. Not sure that works, though. The AWS Serverless Image Handler template doesn't even support that. I know because I tried. I tried so hard lol. Even pulling from an S3 bucket, it still hits that limit.

Nader Dabit

Hmm, so the event object with a reference to the S3 image shouldn't be that large, so I'm not sure why that wouldn't work; I'd have to see more details. For reference though, this is essentially the way I usually handle it (the general idea): github.com/dabit3/s3-triggers/blob...

Nader Dabit • Edited

You could also have run into the memory limit; if so, you can increase it.

Jared

Thanks for the info! I tried to dig into my error log to see what the specific error was, but I deleted the CloudFormation stack, so those logs are no longer available. I'm pretty sure it was always an "exceeded payload limit" error, and all I was doing was fetching an S3 object from a bucket and resizing it. I'm pretty sure it wasn't a memory issue because the instances never exceeded the memory limit, but I dunno.

I ended up just using a service because I couldn't get it to work, but maybe I'll keep poking around.

Thanks for sharing your method! It's very helpful.

Jared

Do you know if your method works for images over 6 MB?

Matthew Bramer • Edited

What payload were you using to invoke your Lambda? The error message indicates that the JSON is over Lambda's 256 KB limit. If the Lambda is invoked by an S3 event, you can then hydrate the image that's in S3 using the payload described here:

docs.aws.amazon.com/AmazonS3/lates...

Here's the doc on creating an S3 event:

docs.aws.amazon.com/AmazonS3/lates...
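
For context, this is roughly the shape of the S3 event notification a Lambda receives (trimmed, with placeholder names). It only carries a reference to the object, which is why it stays far below the 256 KB limit no matter how large the image itself is.

```typescript
// Trimmed, illustrative S3 event notification; real events contain more fields.
const sampleS3Event = {
  Records: [
    {
      eventSource: 'aws:s3',
      eventName: 'ObjectCreated:Put',
      s3: {
        bucket: { name: 'my-upload-bucket' },                     // placeholder bucket
        object: { key: 'photos/large-image.jpg', size: 9437184 }, // ~9 MB object
      },
    },
  ],
};
```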

Jared

Cool. Thank you! I'm not entirely sure, but whenever I tried to resize an image that was more than 6 MB, it failed. It was pulling straight from an S3 bucket.

Matthew Bramer

If you can post the code in a gist somewhere, I'll help you out.

Jared

Will do. Much appreciated

Arnaud Cortisse

What about using streams?
You wouldn't have to download the image in its entirety in order to process it.
Wouldn't that solve your problem?
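
For reference, a rough sketch of what the streaming approach could look like with aws-sdk v2 and sharp (which can act as a transform stream), so the whole image is never buffered in memory at once. The bucket names and target width are placeholders.

```typescript
// Hypothetical streaming variant: pipe the S3 read stream through a resize
// transform and straight back up to S3.
import * as AWS from 'aws-sdk';
import sharp from 'sharp';

const s3 = new AWS.S3();

export const resizeViaStreams = async (bucket: string, key: string) => {
  // Readable stream of the source object
  const readStream = s3.getObject({ Bucket: bucket, Key: key }).createReadStream();

  // sharp() with no input acts as a transform stream
  const resizer = sharp().resize(800);

  // s3.upload accepts a readable stream as Body and handles multipart uploads
  await s3
    .upload({
      Bucket: `${bucket}-resized`, // placeholder destination bucket
      Key: key,
      Body: readStream.pipe(resizer),
    })
    .promise();
};
```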

Jared

That sounds like it may work. I'll test it out and let you know! Thanks!

Paweł Kowalski

I've been unzipping 3 GB+ archives from S3 using streams, so this might help. PS: I also compress images, tested with 10 MB+, so that's probably not an AWS issue.

Jared • Edited

That's good to know. Thank you! Do you have the code available for that?

Paweł Kowalski

Not yet. I need to clean it up and then I'll share it on my GitHub; I just need to find a moment to do that ;)

Jared

Cool. Thanks!

Aleksey D. • Edited

Serverless is awesome, and I was fascinated with it... until I actually started using it. With the limits on functions, you can't use it properly with 3rd party APIs. Cold starts, non-trivial deployment, hard debugging, and if you do something wrong you risk spending your whole budget (if you forgot to set the limits in the settings). I feel that serverless is good for top-notch, experienced developers and teams and enterprise-level companies, not for regular artisans. IMHO, I found it much faster, easier, and cheaper to create a new VPS cluster with Node.js instead of using serverless functions.

Jordy Lee

Not really - those were my concerns initially too, but you can get around most of those issues by using the Serverless Framework and its plugins (e.g. the serverless-offline plugin). And cold starts are easily rectified by setting a CloudWatch event to automatically invoke your Lambda every 15 minutes.
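
For anyone curious, a minimal sketch of what a keep-warm-aware handler might look like, assuming a CloudWatch Events (EventBridge) schedule rule is configured to invoke it every 15 minutes. The names and response shape are illustrative only.

```typescript
// Hypothetical handler that exits early when invoked by the scheduled warm-up rule.
import { APIGatewayProxyEvent, ScheduledEvent } from 'aws-lambda';

export const main = async (event: APIGatewayProxyEvent | ScheduledEvent) => {
  // Scheduled invocations identify themselves with source 'aws.events'
  if ('source' in event && event.source === 'aws.events') {
    console.log('Keep-warm ping, returning early');
    return { statusCode: 200, body: 'warm' };
  }

  // ...normal request handling goes here...
  return { statusCode: 200, body: JSON.stringify({ ok: true }) };
};
```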

Paweł Kowalski

Cold starts can be eliminated by the new provisioned concurrency feature. But tbh, I prefer to just optimize the function code so it boots fast.

Nils

Lambda is also great for web scraping - even more so in combination with CloudWatch Events 👍
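
As a sketch of that combination: a Lambda invoked on a schedule by a CloudWatch Events (EventBridge) rule that fetches and parses a page. This assumes a Node 18+ runtime where fetch is available globally; the URL and the parsing are placeholders.

```typescript
// Hypothetical scheduled scraper handler (paired with a rule like rate(1 hour)).
import { ScheduledHandler } from 'aws-lambda';

export const handler: ScheduledHandler = async () => {
  const res = await fetch('https://example.com/products'); // placeholder URL
  const html = await res.text();

  // Naive extraction for illustration; a real scraper would use a parser like cheerio
  const title = html.match(/<title>(.*?)<\/title>/)?.[1];
  console.log('Scraped page title:', title);
};
```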

Paweł Kowalski

Or running tests. Someone wrote about how they made their tests concurrent, and instead of taking some ridiculous amount of time they now run within 20 seconds on 1,000 Lambdas ;)

Andrew Brown 🇨🇦

Great article, Nader

Nader Dabit

Thanks!!

Ricardo Ruiz Lopez

You should mention that events are not available if you use Aurora Serverless, which is very annoying because if you need them, it makes your backend more complex and difficult to program.