AWS Lambda + Rust

Valentin Radu ・4 min read

This article was first published on techpilot.dev
TL;DR: There's an example repo here for those who want to skip the story mode

Rust piqued my interest when I found out it consistently ranked first in Stack Overflow's annual developer survey as the world's most loved programming language. Here's the 2020 survey, but it also held the first position in 2019, 2018, 2017, and 2016.

It turns out it's as awesome as they say, and now I'm in that particular moment of the hype phase where I try to do everything in Rust. I know that's a terrible idea and I strongly advise against it: pick the language that has the strongest support (aka libraries, community) for the problem you're trying to solve. Doing ML in Rust when Python is the de facto standard is not such a great idea.

Anyways, I figured I could make an exception, and since I'm not that excited about any of the popular backend languages, I started to experiment in that direction.

Running a Rust HTTP server using Rocket is really easy and well documented. However, if you plan to go serverless, there's still a lot of uncharted territory.

For AWS Lambda, there are a couple of resources out there, but many are outdated or somehow incomplete.

Here are the main steps we'll have to follow:

  • implement the lambda handlers
  • (cross)compile our code for the Amazon Linux platform (x86, 64bit)
  • build each lambda as a standalone binary
  • configure AWS Lambda for deployment
  • deploy & enjoy

So, let's get started!

# create a new crate
cargo new rust_aws --bin
# delete main.rs, we'll be using a binary for each lambda
cd rust_aws && rm src/main.rs
# these are the two lambdas we're going to implement
touch src/comment.rs
touch src/contact.rs

Next, our dependencies in Cargo.toml

[package]
name = "your_proj_name"
version = "0.1.0"
authors = ["You <you@example.com>"]
edition = "2018"

[dependencies]
lambda_runtime = "0.2.1"
lambda_http = "0.1.1"
tokio = { version = "^0.3", features = ["full"] }

[[bin]]
name = "comment"
path = "src/comment.rs"

[[bin]]
name = "contact"
path = "src/contact.rs"

A couple of things to mention here.
First, we have the lambda_runtime and lambda_http crates which are responsible for communicating with the Lambda API. This usually means running the setup code, fetching the handler name from an environment variable, and passing events to our code. You can find out more about how custom runtimes work here.

Although lambdas are stateless, AWS can run our binaries and send multiple events to the same process, as long as the process doesn't exit. This requires an event loop: a fancy way to handle asynchronous I/O and scheduling. We use tokio for that.
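To make this concrete, here's a minimal sketch of what src/comment.rs could look like. The lambda_http wiring is shown only in comments because its exact signatures differ between versions (the closure shape below is an assumption, not the crate's confirmed API); keeping the business logic in a plain function (handle_comment is a hypothetical name) lets you test it locally without touching AWS:

```rust
// Hypothetical wiring, roughly what recent lambda_http versions look like
// (exact types and function names vary between releases):
//
//   #[tokio::main]
//   async fn main() -> Result<(), Error> {
//       lambda::run(handler(|req: Request, _ctx| async move {
//           Ok(handle_comment(std::str::from_utf8(req.body()).unwrap_or("")))
//       }))
//       .await
//   }

// Plain function holding the business logic; easy to unit-test locally.
fn handle_comment(body: &str) -> String {
    if body.trim().is_empty() {
        r#"{"status":"error","message":"empty comment"}"#.to_string()
    } else {
        r#"{"status":"ok"}"#.to_string()
    }
}

fn main() {
    println!("{}", handle_comment("Nice article!"));
}
```

Separating the handler's logic from the runtime glue also makes it trivial to swap lambda_http versions later without rewriting the interesting part.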

Finally, we declared two different binaries, one named comment, the other contact, and each will be deployed as a standalone lambda function.

Next up, compilation. Unless you're on an x86, 64bit Linux machine, you'll have to cross-compile your code. To do so, we need the correct toolchain:

# adds the x86 64 target to the toolchain
rustup target add x86_64-unknown-linux-musl
# installs the x86 64 toolchain on macOS (for Windows, you can probably do it with cygwin-gcc-linux, but I haven't tried it out)
brew install FiloSottile/musl-cross/musl-cross

Lastly, we need to let cargo know we're cross-compiling: add the following in ./.cargo/config.toml

[target.x86_64-unknown-linux-musl]
linker = "x86_64-linux-musl-gcc"

Now we're ready to compile. Run cargo build --target x86_64-unknown-linux-musl to test it out.

The next thing we need to do is to configure SAM. I'll assume you're already familiar with SAM and focus only on the critical section for our case. You can have a look at the full template.yml in the example repository. Also, skip sam init since there is no Rust template available anyhow (to my knowledge) and simply start with the template.yml file and build your own directory structure.

Let's go through one of the lambda definitions:

# template.yml
  Comment:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: Comment
      Handler: doesnt.matter.the.runtime.is.custom
      Runtime: provided
      MemorySize: 128
      Timeout: 10
      CodeUri: .
      Policies:
        - AWSLambdaBasicExecutionRole
      Events:
        CommentApi:
          Type: Api
          Properties:
            Path: /comment
            Method: post

This tells SAM to create a serverless lambda function named Comment, with a custom runtime (handled by our Rust lambda_runtime), and expose it as a REST API resource at /comment.

We're almost done. One last (important) thing: when we build and deploy our lambdas with sam build && sam deploy --guided, SAM will look for a Makefile, since it doesn't know how to build our project by itself.

touch Makefile

# Makefile
build-Comment:
	cargo build --bin comment --release --target x86_64-unknown-linux-musl
	cp ./target/x86_64-unknown-linux-musl/release/comment $(ARTIFACTS_DIR)/bootstrap

build-Contact:
	cargo build --bin contact --release --target x86_64-unknown-linux-musl
	cp ./target/x86_64-unknown-linux-musl/release/contact $(ARTIFACTS_DIR)/bootstrap

The way this works is straightforward: you add a target for each lambda's name and prefix it with build-. That's it, SAM will invoke them as needed.
Each target builds the respective binary (--bin comment or --bin contact) and copies it into the artifacts directory as bootstrap, where it will be zipped and sent to the AWS servers for deployment.

And that's it. We're done. Have fun with your new Rust-powered AWS lambdas!
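Once everything is deployed, the endpoint can be exercised with a plain POST. The URL below is a placeholder: sam deploy prints your actual API Gateway endpoint in its stack outputs, and the request body shape depends on what your handler expects.

```shell
# replace the host with the endpoint printed by `sam deploy`
curl -X POST \
  -d '{"message": "Nice article!"}' \
  https://abc123.execute-api.us-east-1.amazonaws.com/Prod/comment
```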


Ken Snyder

Thanks for the article. I'm completely new to Rust but have been using Lambda for years, and one thing I'd be interested in is whether Rust performs better in AWS Lambda. I haven't seen any information on this, and that benefit might be the reason people would switch away from more easily achieved JavaScript/TypeScript lambdas.

Ken Snyder

I guess to add onto this ... if not performance ... why? I'm not questioning Rust in this use case as much as just wondering why folks would take on a lesser-known language that has less direct support and fewer examples to work from.

Valentin Radu (Author)

Thanks for reading! I didn't test the performance. I suppose it's better than Node.js since it eliminates some of the VM overhead; however, I also feel that lambdas are often slow for other reasons, related to how they're orchestrated rather than to the hosted function itself (again, I haven't tested this either, just a hunch from experience).

In any case, I'm pretty new to Rust as well, and the reason I did this is that I like it as a language and (very importantly) its ecosystem too: it's a pleasure to use.

I realize this might not scale well talent-wise (aka, if you start a Rust project you automatically reduce the talent pool by 80%), but, for my particular use case that's not such a problem.